I’m having trouble getting the hand/finger position in a build for visionOS.
The most obvious place this fails is the XR Interaction Toolkit hand menu sample, where the TrackedPoseDriver pose does not match where the hand actually is.
Attempting to read the XRHandJointID.Palm joint from the XRHandSubsystem also doesn’t appear to return anything usable.
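For reference, this is roughly the pattern I’m using to query the palm joint (a sketch against the com.unity.xr.hands API; the `PalmFollower` class name and the choice of right hand are just illustrative):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class PalmFollower : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem.
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            m_Subsystem = subsystems[0];
        }

        // Query the palm joint of the right hand.
        var joint = m_Subsystem.rightHand.GetJoint(XRHandJointID.Palm);
        if (joint.TryGetPose(out Pose pose))
        {
            // Pose is in the hand subsystem's tracking space, so this
            // object is assumed to be parented under the XR Origin.
            transform.localPosition = pose.position;
            transform.localRotation = pose.rotation;
        }
    }
}
```

On device, `TryGetPose` never succeeds for me (or the pose it reports doesn’t line up with the hand), even though pinch and grab interactions work.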
It may be that I’m missing an AVP- or PolySpatial-specific requirement.
Everything else seems to work fine: I can grab with the usual look/pinch, press buttons, etc. But the ability to attach to, or track, a specific hand position is getting lost. This did appear to work at one stage, but for some reason it doesn’t now.
Are there any samples where attaching objects to follow a hand actually works? (Like the hand menu sample, although that one isn’t working for me, of course.)
