Smooth locomotion - move around the virtual environment at a constant rate, using the trackpad or thumbstick of a VR controller
Support for Unity Input System - use the input system to drive VR interactions
Keyboard/mouse simulation - simulate input without having to deploy to device
Improved layout of properties in the Inspector window
New samples, including use of the Universal Render Pipeline
Bug fixes:
Fixed some behaviors not supporting multi-object editing in the Inspector
Fixed PrimaryAxis2D input from mouse not moving the scrollbars on UI as expected. (1278162)
Fixed issue where Bezier Curve did not take into account controller tilt. (1245614)
Fixed issue where a socket’s hover mesh was offset. (1285693)
Fixed issue where disabling a parent GameObject before an XRGrabInteractable child caused an error in OnSelectCanceling().
Fixed Tracked Device Graphic Raycaster not respecting the Raycast Target property of UGUI Graphic when unchecked. (1221300)
Fixed XR Ray Interactor flooding the console with assertion errors when sphere cast is used. (1259554, 1266781)
Fixed foldouts in the Inspector to expand or collapse when clicking the label, not just the icon. (1259683)
Fixed created objects having a duplicate name of a sibling (1259702)
Fixed created objects not being selected automatically (1259682)
Fixed XRUI Input Module component being duplicated in EventSystem GameObject after creating it from UI Canvas menu option (1218216)
Fixed missing AudioListener on created XR Rig Camera (1241970)
Fixed several issues related to creating objects from the GameObject menu, such as broken undo/redo and proper use of context object.
Fixed issue where GameObjects parented under an XRGrabInteractable did not retain their local position and rotation when drawn as a Socket Interactor Hover Mesh (1256693)
Fixed issue where Interaction callbacks (OnSelectEnter, OnSelectExit, OnHoverEnter, and OnHoverExit) are triggered before interactor and interactable objects are updated (1231662, 1228907, 1231482)
Fixed compilation issue when AR Foundation package is also installed
Fixed the Interactor Line Visual lagging behind the controller (1264748)
Fixed Socket Interactor not creating default hover materials, and backwards usage of the materials (1225734)
Fixed Tint Interactable Visual to allow it to work with objects that have multiple materials
Known issues:
Teleportation is not functional when the Continuous Move Provider sets Gravity Application Mode to Immediately
Teleportation position is overridden by continuous movement when both occur on the same frame
Custom reticles get displayed on objects without a custom reticle (1252565)
Socket Interactor can apply the wrong rotation to an interactable and cause the interactable to skew in scale when the interactable has a parent with a non-uniform scale (1228990)
Socket Interactor does not take the enabled state of the Renderer into account when drawing the hover mesh
Adding an interactable to a parent GameObject of an interactable that is being destroyed will cause the colliders to not be properly associated with the new interactable (1231482)
Controller connection interruptions disable interactors when using the Controller Manager script from the examples project (1241245)
Anchor manipulation in the Ray Interactor is applied inconsistently depending on the controller class; for Action-based controllers, the deadzone should be configured in the Action itself
Layer of a Grab Interactable does not inherit the layer of the Interactor when selected, which can cause the rig to be pushed away in the wrong direction during Continuous Move locomotion when the object overlaps with the Character Controller
In the example VR project, the Interactor Line Visual only appears in the left eye when using Windows Mixed Reality
Roadmap
We now have a public roadmap available for users to see our latest plans, upvote existing feature requests, and/or submit new feature requests. We are currently working towards a public 1.0 release next year (Unity 2021.2). Most of our focus and development effort is now on bug fixes, UX improvements, and polished documentation & samples. The feature set for the public release will primarily reflect what exists today.
Sharing feedback
This forum is the best place to open discussions and ask questions. As mentioned above, please use the public roadmap to submit feature requests. If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via Help > Report a Bug. Include “XR Interaction Toolkit” in the title to help our team triage things appropriately!
When I make an empty new project with this version of the XR toolkit instead of the previous version, my Quest 2 doesn't find its controllers and shows a red ray shooting in the forward direction at ground level.
Doing the exact same thing with XR Interaction Toolkit 0.9.4 works flawlessly, so is something wrong with 0.10.0, or do some extra settings need to be made?
If you’re using the Action-based version of the behaviors, and you’re referencing an Input Action contained in an asset on the XR Controller to use for position and rotation tracking, you’ll need to ensure the Action is enabled. The XR Controller will only automatically enable Input Actions that are directly defined on the component, and will require you to manage enabling or disabling the externally defined Input Actions. You can add a GameObject to your scene and add the Input Action Manager behavior to it, and then add the Input Action Asset that the Input Actions are defined in to the Action Assets list in the Inspector. That behavior will then enable all the Input Actions in that asset during its own OnEnable.
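For reference, here is a minimal sketch of doing the equivalent in your own script, assuming the Input System package's InputActionAsset API; the component and field names here are illustrative stand-ins, not part of the toolkit:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Illustrative stand-in for the Input Action Manager behavior described above:
// it enables every Input Action in the referenced asset when this component
// is enabled, and disables them again when it is disabled.
public class EnableInputActionsExample : MonoBehaviour
{
    [SerializeField]
    InputActionAsset m_ActionAsset; // Assign the asset containing your tracking Actions

    void OnEnable()
    {
        if (m_ActionAsset != null)
            m_ActionAsset.Enable(); // Enables all Action Maps (and their Actions) in the asset
    }

    void OnDisable()
    {
        if (m_ActionAsset != null)
            m_ActionAsset.Disable();
    }
}
```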
Quick question: @mfuad @chris-massie
Do you know if there’s any way to port the navigation from EditorVR …?
The grip-based Pull & Scale… like all the other XR creation tools have too (Tilt Brush/Quill/Medium, etc.)
Would be so useful!
Let me know if you have any ideas as to how to get started.
Thanks
I noticed you renamed these methods from …Enter to …Entered, and also made some naming convention changes (uppercase to lowercase, etc., IDE1006). Are these changes following a general Unity rule, or just for this pet project?
I found a potential bug in the ray interactor in Preview 0.10.
Behaviour in Preview 0.9.4:
Interactor has an attach transform from a fingertip
Interactor is disabled on startup from a script (Start() => interactor.enabled = false;)
later on, the interactor gets enabled and shows the ray starting at the fingertip
In Preview 0.10, the attach transform is reset to position (0, 0, 0) and identity rotation when the interactor is disabled.
A quick dive into the code showed that OnSelectExiting is called when an interactor gets disabled. In this method, the attach transform is reset to m_OriginalAttachTransform. However, m_OriginalAttachTransform is set to Vector3.zero and Quaternion.identity in Awake.
Changing lines 481 and 482 in the Awake function of the RayInteractor.cs script so that m_OriginalAttachTransform copies the attach transform's pose, instead of being zeroed, restores the expected behavior.
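Reconstructed from the description above (the poster's original snippet was not included), the suggested change would look something like this:

```csharp
// Reconstructed sketch of the change at lines 481-482 of RayInteractor.Awake(),
// not the verbatim source. Instead of zeroing the stored pose:
//   m_OriginalAttachTransform.position = Vector3.zero;
//   m_OriginalAttachTransform.rotation = Quaternion.identity;
// copy the configured attach transform's pose, so disabling and re-enabling
// the interactor restores the fingertip offset instead of (0, 0, 0):
m_OriginalAttachTransform.position = attachTransform.position;
m_OriginalAttachTransform.rotation = attachTransform.rotation;
```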
Hi @freso , yes our focus is on stability and usability. This includes a high priority on bug fixes, polishing our samples and documentation, and UX improvements to ensure the onboarding experience of using this toolkit is optimal.
There’s currently nothing built-in to do this out of the box with XRI. One approach would be to create a behavior responsible for converting controller gestures into translation and rotation amounts to apply to either the XR Rig or the object you are manipulating. You could serialize multiple Input System Actions that bind to each controller’s position, rotation, and grip button. The behavior would listen for the Actions that represent the grip buttons, and when both are pressed, sample and store the position and rotation of each controller, likely taking the average. Then, each frame, the behavior would convert the difference between the initial sampled pose and the current pose into a desired change of position and rotation. All of that is achievable using the Input System package.
Depending on how your scene is structured, you could then use the position and rotation deltas computed by that behavior to update either the Transform of the object you want to manipulate, or the XR Rig, pivoting it around the object you are manipulating. If you’re updating the XR Rig, you can write a new behavior that derives from LocomotionProvider to push those changes to the XR Rig through the LocomotionSystem.
You may be able to search online for examples to hopefully help out with this locomotion method. Some refer to this style as “move the environment”.
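As a rough illustration of that approach, here is a sketch under assumptions: the component and field names are made up, rotation and scaling are omitted for brevity, the referenced Actions must already be enabled (see the Input Action Manager discussion above), and a full solution would go through a LocomotionProvider rather than moving the Transform directly.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Rough sketch of "move the environment" locomotion: while both grips are
// held, translate the XR Rig opposite the hands' motion so the world appears
// to follow the hands. Actions should be bound to each controller's grip
// button and position.
public class GrabMoveExample : MonoBehaviour
{
    [SerializeField] InputActionProperty m_LeftGrip;
    [SerializeField] InputActionProperty m_RightGrip;
    [SerializeField] InputActionProperty m_LeftPosition;
    [SerializeField] InputActionProperty m_RightPosition;
    [SerializeField] Transform m_Rig; // Root Transform of the XR Rig

    Vector3 m_PreviousMidpoint;
    bool m_Grabbing;

    void Update()
    {
        bool bothGripped = m_LeftGrip.action.ReadValue<float>() > 0.5f &&
                           m_RightGrip.action.ReadValue<float>() > 0.5f;

        if (!bothGripped)
        {
            m_Grabbing = false;
            return;
        }

        // Average of both controller positions, in tracking (rig-local) space
        Vector3 midpoint = (m_LeftPosition.action.ReadValue<Vector3>() +
                            m_RightPosition.action.ReadValue<Vector3>()) * 0.5f;

        if (m_Grabbing)
        {
            // Convert the per-frame hand delta to world space and move the
            // rig the opposite way.
            Vector3 worldDelta = m_Rig.TransformVector(midpoint - m_PreviousMidpoint);
            m_Rig.position -= worldDelta;
        }

        m_PreviousMidpoint = midpoint;
        m_Grabbing = true;
    }
}
```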
I created a bug issue (1291475) to get the documentation button fixed.
Some of the renaming done this version, like changing some properties from PascalCase to camelCase, was to follow Unity style conventions for a package of this type. Most of the properties in the package were already following this convention, but a few didn’t match and were updated for consistency.
The other methods you mentioned were actually split into multiple methods. For example, OnSelectEnter was split into OnSelectEntering and OnSelectEntered. This was done to allow the different phases of an interaction state change, like “select”, to be overridden and to change when the public event is invoked for the Interactor and Interactable. Before this change, the public event on the Interactor would be invoked before the Interactable had a chance to process the change, which could lead to undesired behavior. In this new version of the package, both the Interactor and Interactable will finish processing the change before the events are invoked. These public events were also renamed to match the function, for example onSelectEnter to onSelectEntered. The timing of these methods and events is visualized in the documentation; see Extending the XR Interaction Toolkit.
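For example, a custom interactable can now hook both phases of a state change; here is a minimal sketch, assuming the 0.10 method signatures on XRBaseInteractable (each phase receives the Interactor involved):

```csharp
using UnityEngine.XR.Interaction.Toolkit;

// Minimal sketch of overriding both phases of the "select" state change.
public class LoggingGrabInteractable : XRGrabInteractable
{
    protected override void OnSelectEntering(XRBaseInteractor interactor)
    {
        base.OnSelectEntering(interactor);
        // Runs while the state change is still being processed,
        // before the public onSelectEntered event is invoked.
    }

    protected override void OnSelectEntered(XRBaseInteractor interactor)
    {
        base.OnSelectEntered(interactor);
        // Runs after both the Interactor and Interactable have finished
        // processing the change, alongside the public onSelectEntered event.
    }
}
```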
Does throwing a grabbable while moving continuously work fine now?
We implemented our own solution for continuous motion, but the throw was not working properly in the previous version.
“If you’re using the Action-based version of the behaviors, and you’re referencing an Input Action contained in an asset on the XR Controller to use for position and rotation tracking, you’ll need to ensure the Action is enabled. The XR Controller will only automatically enable Input Actions that are directly defined on the component, and will require you to manage enabling or disabling the externally defined Input Actions. You can add a GameObject to your scene and add the Input Action Manager behavior to it, and then add the Input Action Asset that the Input Actions are defined in to the Action Assets list in the Inspector. That behavior will then enable all the Input Actions in that asset during its own OnEnable.”
No clue what any of this means, once again. But imagine I am clueless, which I am…
In the new input system package, when you create new action maps, they are disabled by default, meaning the inputs will not work at runtime.
If you are using the new XR Controllers that take advantage of the Action Maps, you will need to manually make sure the Action Maps are enabled to ensure input works at runtime.
There is a ready-made component called Input Action Manager that will enable all action maps at runtime on start.
tl;dr: put the Input Action Manager component somewhere in your scene referencing your Input Action Asset if you are using the new Input System with XR
So I’ve created a new scene in which I add an action-based XR Rig. I add the Input Action Manager and connect the XRI Default Input Actions. And the controllers work fine, and they are able to interact with a canvas, but the camera is only updating the position, not the rotation.
I’ve tried changing the Rotation Action on the Tracked Pose Driver (New Input System) from the default centerEyeRotation [XR HMD] to centerEyeRotation [Oculus Headset] and it works, but I’m developing for Steam and can’t have only Oculus working. Am I doing something wrong, or is there something I’m missing?