Hi, I am testing out the XR Interaction Toolkit and I am trying to get the Hand Interaction Demo to work correctly.
I am using the XR Device Simulator from the XR Interaction Toolkit package, and the hand ray interactors do not work at all: they just point along the Z axis and do not interact with anything. It also seems like the Ray Interactor is stuck at coordinates (0, 0, 0).

While debugging, I tested with both Unity 2022.3.17f1 LTS and 2023.2.15f1; the issue is the same in both versions.
I have noticed that both “Left Controller” and “Right Controller” DO rotate and work correctly, while both Hands do not rotate; it’s the Wrist inside “Left/Right Hand Interaction Visual” that rotates, while the Hand transforms sit still at the scene’s origin.
Given these facts, I think this might be a problem with how the scene is set up (maybe it has something to do with the fact that I am using the XR Device Simulator, although the rest of the scene works correctly, controllers and all).
Unfortunately, the XR Device Simulator currently doesn’t support interaction with hands because it can’t drive the Meta Aim Hand OpenXR feature, which is what the Input Actions that drive the pose in the demo scene use (for example, <MetaAimHand>{RightHand}/devicePosition). You will need to run the scene on a Meta Quest device instead of the simulator.
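As a quick way to confirm this from a script, you can check whether the Meta Aim Hand devices were ever created at runtime. This is just a minimal sketch (the component name is illustrative), assuming the XR Hands package is installed:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative component: logs when the Meta Aim Hand devices are absent.
// Under the XR Device Simulator they are never created, which is why the
// hand ray poses stay at the origin.
public class MetaAimHandCheck : MonoBehaviour
{
    void Update()
    {
        if (MetaAimHand.left == null && MetaAimHand.right == null)
            Debug.Log("No Meta Aim Hand devices present; the hand ray input actions will not bind.");
    }
}
```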
We are currently working on adding support for the hand interaction profile in both the XR Hands and XR Interaction Toolkit packages, and we will add support in the simulator in a future version of XRI.
@chris-massie What is a possible workaround for the XR Hand ray and XR Interaction toolkit to work on any OpenXR-based device?
The XR Interaction Toolkit added support for the Hand Interaction Profile (OpenXR) in version 3.0.4, so if you add Hand Interaction Profile to the list of Enabled Interaction Profiles in the Edit > Project Settings > XR Plug-in Management > OpenXR settings, the input actions used in the hand rig will bind correctly. This still, of course, depends on the device supporting that profile, which not all do yet. The simulator in XRI also does not yet support simulating the Meta Aim Hand or Hand Interaction Profile, but that will come.
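If you'd rather toggle that setting from an editor script than the settings UI, something like the following sketch should work (this assumes the OpenXR Plugin is installed and you're targeting the Android/Quest build target; the menu item name is just for illustration):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.XR.OpenXR;
using UnityEngine.XR.OpenXR.Features.Interactions;

public static class HandInteractionProfileSetup
{
    [MenuItem("Tools/Enable Hand Interaction Profile")]
    static void EnableProfile()
    {
        // Same effect as ticking "Hand Interaction Profile" under
        // Edit > Project Settings > XR Plug-in Management > OpenXR.
        var settings = OpenXRSettings.GetSettingsForBuildTargetGroup(BuildTargetGroup.Android);
        var feature = settings != null ? settings.GetFeature<HandInteractionProfile>() : null;
        if (feature != null)
            feature.enabled = true;
    }
}
#endif
```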
The XR Hands package is also adding better support for the Hand Interaction Profile starting with 1.5.0-pre.1, but that is still in pre-release at the time of this post.
In the meantime, you can use data you get from the XR Hands package, like using the Pinch shape or even getting joint poses yourself with XRHand.GetJoint (passing XRHandJointID.ThumbTip or XRHandJointID.IndexTip for instance) to feed into the input actions or update Transforms yourself. This is not an easy workaround so it may not be what you are asking for, but it should be possible.
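For example, here is a minimal sketch of the joint-pose approach: polling XRHand.GetJoint each frame and copying the pose onto a Transform. The component name is illustrative, and it assumes an XRHandSubsystem is running:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative component: makes this object follow the right hand's
// index fingertip using joint data from the XR Hands package.
public class IndexTipFollower : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;

            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
        {
            // Joint poses are reported relative to the XR Origin, so
            // parent this object under the XR Origin (or convert the
            // pose into world space yourself).
            transform.localPosition = pose.position;
            transform.localRotation = pose.rotation;
        }
    }
}
```

Feeding that data back into the input actions instead of directly into Transforms takes more plumbing, but the same joint queries are the starting point either way.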
Hello @chris-massie, could you provide more details on how to achieve this?
I looked through the bindings in the XRI Input Action Manager but I can’t seem to find anything about XR hands there.

I’m having the same issue but on Apple Vision Pro. Is there any plan to add an interaction profile for it (or any other solution) so that the Near-Far, Ray, and other interactors work on that device?
Finally, I’ve found some examples that actually work out of the box and show how to use the Spatial Pointer for UI interactions: GitHub - XRealityZone/visionOS_Workshop_101_Unity
Hi, I tried using the XR Device Simulator to grab and move objects with a virtual hand, but the objects don’t move at all. Is this because the package still doesn’t support this functionality?