I’ve encountered issues with the XR Interaction Toolkit’s integration with XR Hands. The XR hands are being tracked fine. However, when I try to interact with an interactable object (e.g. pinch-grabbing it), it doesn’t grab the object at all. Poking objects and UI buttons is the only interaction that works. I am using a Varjo XR-3 headset with the headset tracking method set to Varjo inside-out tracking.
Steps to Reproduce:
- Use XR Interaction Toolkit version 2.5.4 and XR Hands version 1.4.1.
- Open the HandsDemoScene from the Hands Interaction Demo sample in XR Interaction Toolkit
- Attempt to grab the interactable shapes in the scene by pinch grabbing them
Expected Result: The interactable shape follows the hand while you hold the pinch grab.
Actual Result: The XR Hand doesn’t register a select interaction. The interactable shape isn’t grabbed.
I tried replicating this issue with XR Interaction Toolkit 3.0.4 and XR Hands 1.4.1, but I still got the same undesirable result. Is there something wrong with the Varjo setup I’m using? I see other tutorials with the Meta Quest where XR Hands interaction works just fine.
Thank you.
While the XR Hands package at version 1.4.1 supports hand gestures, you will need to wire the gesture event to either an input action or a script that drives the XRI select logic for the hand. This is something we are currently working on remedying for XRI 3.1 + XR Hands 1.5.0 with our support for Common Hand Poses.
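In the meantime, one way to bridge the gap is a small component that measures the thumb-tip-to-index-tip distance from XR Hands each frame and manually drives select on an XRI interactor via the interaction manager. The sketch below is untested and targets the XRI 2.5 / XR Hands 1.4 APIs; the pinch threshold and the Direct Interactor reference are assumptions you’d tune for your own rig:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: drive XRI select from an XR Hands pinch (thumb tip close to index tip).
// Assumes a Direct Interactor on the same hand; the threshold is a guess to tune.
public class PinchSelectBridge : MonoBehaviour
{
    [SerializeField] XRDirectInteractor m_Interactor;  // hand's direct interactor
    [SerializeField] Handedness m_Handedness = Handedness.Right;
    [SerializeField] float m_PinchThreshold = 0.02f;   // meters (assumed value)

    XRHandSubsystem m_Subsystem;
    bool m_IsPinching;

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            // Lazily find the running hand subsystem.
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        var hand = m_Handedness == Handedness.Right
            ? m_Subsystem.rightHand
            : m_Subsystem.leftHand;
        if (!hand.isTracked)
            return;

        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out var thumbPose) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out var indexPose))
        {
            bool pinching =
                Vector3.Distance(thumbPose.position, indexPose.position) < m_PinchThreshold;
            if (pinching && !m_IsPinching)
                TrySelectHovered();
            else if (!pinching && m_IsPinching)
                ReleaseSelection();
            m_IsPinching = pinching;
        }
    }

    void TrySelectHovered()
    {
        // Select the first interactable the interactor is currently hovering.
        foreach (var hovered in m_Interactor.interactablesHovered)
        {
            if (hovered is IXRSelectInteractable selectable)
            {
                m_Interactor.interactionManager.SelectEnter(m_Interactor, selectable);
                break;
            }
        }
    }

    void ReleaseSelection()
    {
        // Release everything this interactor is selecting.
        for (int i = m_Interactor.interactablesSelected.Count - 1; i >= 0; i--)
            m_Interactor.interactionManager.SelectExit(
                m_Interactor, m_Interactor.interactablesSelected[i]);
    }
}
```

Attach it to the hand’s interactor GameObject and assign the Direct Interactor in the Inspector; manual `SelectEnter`/`SelectExit` through the interaction manager is the supported way to force selection without a controller select input.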
It looks like the Varjo XR-3 uses Ultraleap under the hood, and I believe they support the OpenXR Hand Interaction Profile, so you can also make sure your OpenXR runtime has the extension enabled and add it to the list of Interaction Profiles in Unity Project Settings > XR Plug-in Management > OpenXR. This should allow you to wire up the specific Select input actions. This is also set up to work properly out of the box in XRI 3.0.4+, so you can see an example of how we have set up the default input actions in the Starter Assets sample package.
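As a concrete example, once the Hand Interaction Profile is enabled, a Select action in an Input Action asset can be bound to the pinch value control exposed by that profile’s device layout. The binding path below is from memory, so verify the exact control name against what the Input Actions binding picker shows for the Hand Interaction layout:

```
Select Value  (Action Type: Value, Control Type: Axis)
  Binding path: <HandInteraction>{RightHand}/pinchValue
```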
Hello @VRDave_Unity, how do you make the extension run using the OpenXR runtime? I already have the OpenXR Hand Interaction Profile added to my list of interaction profiles; however, it still doesn’t register inputs from my hand poses.
Also, I don’t think I can upgrade my XRI toolkit to a version higher than 2.5.4 because I am already far into my project on the current version. Whenever I try to update the XRI toolkit in my project, my input system breaks and crashes the project entirely, and the only fix is reverting back to XRI 2.5.4. I also don’t really understand how the new XRI toolkit works, especially when it comes to processing inputs, so I find the 2.5.4 version easier. Is XRI 3.0 necessary to make the OpenXR Hand Interaction Profile for Ultraleap work?