Pre-release
This 1.0.0-pre.2 version of the XR Interaction Toolkit is considered pre-release. Pre-release packages are supported packages in the process of becoming stable, and they will be production-ready by the end of the upcoming 2021 LTS release. Starting with the 2021.1 Alpha, Unity is changing the way packages are published and shown in the Package Manager to provide clear guidance around a package's readiness and expected support level. There will be additional iterations of XRI before we reach the final 1.0.0 release.
What's new
For a full list of changes, refer to the Changelog in our documentation.
Many of the changes and fixes in this version were a direct result of feedback we received from the forum and from reported bugs. Thank you to everyone who took the time to make these issues known to the team and for your feedback!
Notable changes
Added and improved Scripting API documentation and Inspector tooltips.
Changed the signature of all interaction event methods (e.g. OnSelectEntering) to take event data through a class argument rather than being passed the XRBaseInteractable or XRBaseInteractor directly. This allows the Interaction Manager to provide additional related data without requiring users to handle additional methods, and it makes it easier to handle the case where a selection or hover is canceled (because the Interactor or Interactable was unregistered as a result of being disabled or destroyed) without duplicating code in separate OnSelectCanceling and OnSelectCanceled methods. See the Changelog for code snippets with instructions for upgrading and migrating scripts to the new signatures of these events and methods, and see the sketch after this list. Use the Migrate Events button in the Inspector of Interactor and Interactable objects to move any serialized listeners from the old, deprecated events to the new events.
Opened up the custom Editor classes so users can more easily customize the Inspector for derived classes. The custom Editors now also apply to derived classes, so users who override methods in the behaviors can continue using the customized Inspector rather than falling back to the default.
Fixed XR Ray Interactor so it no longer clears a custom aim direction when initializing. (1291523)
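For illustration, here is a minimal before/after sketch of the new event method signatures on a derived Interactable (the class name ExampleInteractable is made up for this example; refer to the Changelog for the authoritative migration snippets):

using UnityEngine.XR.Interaction.Toolkit;

public class ExampleInteractable : XRGrabInteractable
{
    // Old (deprecated) signature: the Interactor was passed in directly.
    // protected override void OnSelectEntering(XRBaseInteractor interactor) { ... }

    // New signature: event data is bundled into an args class, so the
    // Interaction Manager can supply more data later without new overloads.
    protected override void OnSelectEntering(SelectEnterEventArgs args)
    {
        base.OnSelectEntering(args);
        XRBaseInteractor interactor = args.interactor;
    }
}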
Known issues
Custom reticles get displayed on objects without a custom reticle (1252565)
Socket Interactor can apply the wrong rotation to an interactable and cause the interactable to skew in scale when the interactable has a parent with a non-uniform scale (1228990)
Grab Interactables can cause undesired behavior when using Continuous Move locomotion: the Character Controller can be blocked from moving while an object is held, or the rig can rapidly move away when the held object overlaps with the Character Controller
The end of the XR Interactor Line Visual lags behind and can appear bent when moving the controller fast (1291060)
The Hover To Select property on XR Ray Interactor is not functional (1301630)
Roadmap
Use the public roadmap to see our latest plans, upvote existing feature requests, or submit new feature requests. We are currently working towards a public 1.0 release this year for Unity 2021.2 (LTS). Most of our focus and development effort now is on bug fixes, UX improvements, and polished documentation and samples. The feature set for the public release will primarily reflect what exists today.
Sharing feedback
This forum is the best place to open discussions and ask questions. As mentioned above, please use the public roadmap to submit feature requests. If you encounter a bug, please use the Unity Bug Reporter in the Unity Editor, accessible via Help > Report a Bug. Include "XR Interaction Toolkit" in the title to help our team triage things appropriately!
I've installed the XR Interaction Toolkit using the Package Manager.
Then, I downloaded/extracted the XR-Interaction-Toolkit-Examples from GitHub.
Imported the extracted directory into my project.
Now, I have a bunch of errors, including:
XR-Interaction-Toolkit-Examples-master\VR\Assets\Scripts\BubbleGun.cs(44,26): error CS0246: The type or namespace name 'DeactivateEventArgs' could not be found (are you missing a using directive or an assembly reference?)
and
Assets\Samples\XR-Interaction-Toolkit-Examples-master\VR\Assets\Scripts\ComplexCube.cs(41,43): error CS0246: The type or namespace name 'SelectExitEventArgs' could not be found (are you missing a using directive or an assembly reference?)
I would recommend opening the VR example project directly rather than copying that folder into an existing project as you did, since the example depends on some project settings and layers that may not exist in your project. The Examples repository on GitHub contains two Unity projects you can open with Unity Hub: the AR folder (a mobile AR example) and the VR folder. Follow the steps in Getting started to open the VR project so you can play around with the examples. If you want to extract some parts of the VR example project to copy into your own project, you can follow the steps to create an Asset package and then import it.
As for why you were getting those compilation errors, my guess is that an older version of the XR Interaction Toolkit, such as 0.10.0-preview.7, is installed in Window > Package Manager. Expand the foldout for XR Interaction Toolkit in that window, click See other versions, click 1.0.0-pre.2, and then click Install.
I have tested the new release with the example projects on GitHub, in the hope that it resolves an issue inside the XR Interaction Toolkit, since we are using it (0.9 preview).
Interactables behind UIs are problematic. When pointing with the Ray Interactor (and attached ray visual) at the UI, the visual's length correctly stops at the UI, but any Interactable behind the UI still receives events like hover, select, etc., which seems wrong and unintuitive since it is 'behind' the UI.
This has been a great annoyance to our users since we switched to XR Interaction, as they are constantly selecting and changing objects behind UIs just by interacting with the UI.
Since somebody suggested checking the setup again, and since this is a new release, to rule out any error on our side we've downloaded the example from GitHub and checked the WorldInteractionDemo scene. The problem also occurs there and is easily recreated, as shown in this video:
Steps to reproduce:
Take a grab or simple interactable object (in Scene: from 'Complex Grab Interactions')
Place it behind a UI panel set up for XR (in Scene: put it on the box 'UI Interaction')
Position yourself such that the UI panel is in front of the interactable (in Scene: go to teleport area in the back)
Point the left controller (with ray interactor) at the UI
Expected result:
The interactable behind the UI should not react as long as the ray is directed at the UI.
Actual result:
The interactable behind the UI turns red when the ray is directed at it even if there is a UI in between (which successfully shortens the line visual but not the 'real raycast').
Is there something we can do? How can this be resolved?
What we've also tried
We've tried to come up with solutions for this, ranging from ugly to less ugly. Our current strategy is to detect whenever the XR Ray Interactor interacts with a UI. We started to implement our own Ray Interactor, extending XRRayInteractor. We've almost got it, but we are stuck on the problem that too many functions of XRRayInteractor are inaccessible to our class. It would be very helpful if, e.g., GetCurrentUIRaycastResult were public (as also mentioned by Skinzart in XR Interaction Toolkit 0.10 preview is available (version 0.10 preview)).
I have two objects that can be grabbed with the same configuration, one with a box collider and one with a convex collider.
The object with the box collider works well, but when I try to grab the one with the convex collider, it grabs with a strange distance between the hand and the object.
I've attached a video of what happens:
The same happens when there are multiple colliders and when there are multiple colliders on a child object.
Edit: Now I can make it work using a custom Android manifest, but I don't get an event if the keyboard loses focus (selecting outside the keyboard area doesn't make the keyboard disappear, and the focus returns to the application).
Thank you for the very detailed post! I agree, that is not desired behavior, and unfortunately there isn't a great way to resolve that as a user with the current version. I created an item in our issue tracking system to open up some of those methods and properties related to the UI raycasts after reading that post by Skinzart, but we have not made those changes yet. I've moved it up in priority.
A workaround until we release a fix for this would involve something like subclassing XRRayInteractor and overriding GetValidTargets to remove Interactables that are behind UI. You would need to call TryGetUIModel and compare positions. However, this would be pretty ugly without having all of the methods available for you to call or override.
If you report this bug with the Unity Bug Reporter, our team can provide a public tracking link and give updates on the status of a fix.
Yes, this thread and forum are monitored. Questions and feedback can be posted here, and bugs can be submitted with the Unity Bug Reporter in the Editor.
I'm not exactly sure what's going wrong without looking at the Hierarchy and the model. What could be happening is that the object's pivot point is offset from the center of the model, and that difference might cause the offset shown in the video. You can use the Pivot/Center button in the toolbar of the Editor to Toggle Tool Handle Position and visualize the difference. Since the Attach Transform is not set, the Interactable uses the pivot position and rotation of its Transform to determine how it orients with the Interactor's Attach Transform. You can create a child GameObject and set the Attach Transform on the Interactable to that child, then move that Transform to manually adjust where the grab point should be.
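If you prefer to set that up from a script instead of the Editor, here is a rough sketch along those lines (the GameObject name and offset value are arbitrary examples):

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRGrabInteractable))]
public class AttachPointSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // Create a child GameObject to act as the grab point, offset from the
        // pivot so the object sits where you want it relative to the hand.
        var attachPoint = new GameObject("Attach Point").transform;
        attachPoint.SetParent(transform, false);
        attachPoint.localPosition = new Vector3(0f, 0.1f, 0f); // example offset

        grab.attachTransform = attachPoint;
    }
}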
Thank you very much for this; everything that helps is greatly appreciated!
This is kind of the approach we had, but we hit another wall by not having access to TrackedDeviceModel.implementationData. This is what we tried, closely following the (inaccessible) GetCurrentUIRaycastResult:
using System.Collections.Generic;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.UI;

public class CustomRayInteractor : XRRayInteractor
{
    public override void GetValidTargets(List<XRBaseInteractable> validTargets)
    {
        base.GetValidTargets(validTargets);
        if (TryGetUIModel(out TrackedDeviceModel model))
        {
            // Does not compile for us: implementationData is internal to the package.
            int raycastPointIndex = model.implementationData.lastFrameRaycastResultPositionInLine;
            if (raycastPointIndex >= 0)
            {
                // A UI hit occurred along the ray, so discard the 3D Interactables behind it.
                validTargets.Clear();
            }
        }
    }
}
But as said, we don't seem to be able to access model.implementationData here. I am not sure what you mean by comparing positions, as model.position will return the center of the interactable, which can be anywhere and seems unrelated to where the ray hits the UI. Can you elaborate on this?
We've tried using the internal bug reporter for this, but it crashed while uploading the sample project. We will try that again.
If you add this line to the manifest element in AndroidManifest.xml it should show the Oculus system overlay keyboard when the Input Field receives focus:
The keyboard is only dismissed when clicking the icon that looks like a keyboard with a down arrow under it. I don't know if there's a way to configure it so the keyboard is dismissed when clicking outside of it; that would be a question to ask on the Oculus forums.
You should be able to use the Application.isFocused property and/or the Application.focusChanged event to determine when an Oculus overlay is opened, either the system menu or the system keyboard.
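A minimal sketch of listening for that (this uses standard Unity API; the class name is just an example):

using UnityEngine;

public class OverlayFocusWatcher : MonoBehaviour
{
    void OnEnable() => Application.focusChanged += OnFocusChanged;
    void OnDisable() => Application.focusChanged -= OnFocusChanged;

    void OnFocusChanged(bool hasFocus)
    {
        // hasFocus is false while a system overlay (such as the Oculus
        // keyboard or system menu) takes focus away from the application.
        Debug.Log($"Application focus: {hasFocus}");
    }
}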
I was referring to the fact that, since GetCurrentUIRaycastResult is private, you would need to derive whether TryGetHitInfo was using the result from Physics hits or from the UI. You could do something like this in your overriding GetValidTargets method:
if (TryGetHitInfo(out var hitPosition, out var hitNormal, out _, out _) &&
    GetCurrentRaycastHit(out var raycastHit) &&
    (hitPosition != raycastHit.point || hitNormal != raycastHit.normal))
{
    validTargets.Clear();
}
That would let the custom Ray Interactor avoid hovering or selecting Interactables behind the UI. Again, this is a bug that we intend to fix in the package, but this is a temporary workaround you can use in the meantime. Note that this solution still does not solve every issue with 3D and UI raycasts causing undesired behavior, such as still being able to hover and select things in the UI when there is an object in front of the UI.
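Put together as a complete subclass, a rough sketch of that temporary workaround might look like this (the class name is arbitrary, and this should be treated as a stopgap rather than a supported solution):

using System.Collections.Generic;
using UnityEngine.XR.Interaction.Toolkit;

public class UIBlockingRayInteractor : XRRayInteractor
{
    public override void GetValidTargets(List<XRBaseInteractable> validTargets)
    {
        base.GetValidTargets(validTargets);

        // If the reported hit info differs from the current 3D raycast hit,
        // the UI raycast won, so drop the Interactables found behind the UI.
        if (TryGetHitInfo(out var hitPosition, out var hitNormal, out _, out _) &&
            GetCurrentRaycastHit(out var raycastHit) &&
            (hitPosition != raycastHit.point || hitNormal != raycastHit.normal))
        {
            validTargets.Clear();
        }
    }
}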
It's intended behavior for multiple Interactables to be hovered by an Interactor at the same time. You can change that by deriving from Ray Interactor and overriding GetValidTargets or CanHover to only allow hovering one object at a time, as in the sketch below. We plan on adding configuration options in the Inspector of Ray Interactor so this can be adjusted without needing to create a custom script.
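For example, a rough sketch that keeps only the first candidate (this assumes the list returned by the base implementation is ordered by raycast distance, so the first entry is the closest):

using System.Collections.Generic;
using UnityEngine.XR.Interaction.Toolkit;

public class SingleHoverRayInteractor : XRRayInteractor
{
    public override void GetValidTargets(List<XRBaseInteractable> validTargets)
    {
        base.GetValidTargets(validTargets);

        // Trim to a single Interactable so only one object is hovered at a time.
        if (validTargets.Count > 1)
            validTargets.RemoveRange(1, validTargets.Count - 1);
    }
}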
@chris-massie were you guys ever able to figure out why XR Grab Interactables jitter so much while using the Continuous Move Provider?
It is very easy to recreate. You can have an XR Rig and add a Continuous Move Provider on it. Let's say the move speed is 5. If your character is holding an XR Grab Interactable while moving, the grabbable will jitter all over the place.
I've brought this up with Unity before, but it seems like no one has provided a solution or acknowledged the problem.
The lag in Grab Interactables is due to how the object is moved in Kinematic or Velocity Based modes, and the difference in update frequencies between Update and FixedUpdate. The Movement Type value in the Grab Interactable Inspector controls how the position of the object is updated. With Kinematic (the default), the Rigidbody is moved to a target position during FixedUpdate. With Velocity Based, the Rigidbody is moved by setting its velocity during FixedUpdate. With Instantaneous, the Transform of the object is moved both during Update and right before rendering to the VR device. The Continuous Move Provider updates the rig by either updating the Character Controller or the Transform directly during Update.
By default, Update occurs more frequently than FixedUpdate. Update can run at around 90 Hz, whereas the default FixedUpdate rate is 50 Hz, adjustable in Edit > Project Settings > Time by setting Fixed Timestep. This difference is what causes the jitter when you move the controller quickly while grabbing something (and it is even more apparent with a high move speed on the Continuous Move Provider).
Changing the Movement Type to Instantaneous will help get rid of the jitter, as in the sketch below. We have plans to solve this jitter in a future version of the package by separating the visual component from the physics component of Interactables. That would allow us to move the Rigidbody with the physics timestep and separately move the visual representation of the object in Update.
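For example, a minimal sketch that switches a Grab Interactable to Instantaneous from a script (the same setting is available in the Inspector; the class name is just an example):

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

[RequireComponent(typeof(XRGrabInteractable))]
public class UseInstantaneousMovement : MonoBehaviour
{
    void Awake()
    {
        // Instantaneous moves the Transform during Update and right before
        // rendering, avoiding the Update/FixedUpdate mismatch that causes jitter.
        var grab = GetComponent<XRGrabInteractable>();
        grab.movementType = XRBaseInteractable.MovementType.Instantaneous;
    }
}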