Hand position/finger position/tracking

I’m having trouble getting the hand/finger position in a build for visionOS.

The most obvious place this is failing is in the XR Interaction Toolkit hand menu sample, where the TrackedPoseDriver does not match where the hand is.

Attempting to get the XRHandJointID.Palm joint from the XRHandSubsystem also does not appear to do anything.
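For reference, here is a minimal sketch of how querying the palm joint from the XRHandSubsystem typically looks with the XR Hands package. The component name (PalmFollower) is hypothetical, and it assumes the subsystem has already been started (e.g. via Initialize Hand Tracking on Startup or the Hand Visualizer setup); note that joint poses are reported relative to the XR origin, which is a common reason a tracked pose appears in the wrong place.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical example component: follows the right palm joint each frame.
// Assumes the XRHandSubsystem is already running.
public class PalmFollower : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return; // hand tracking not available yet
            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        var palm = hand.GetJoint(XRHandJointID.Palm);
        if (palm.TryGetPose(out Pose pose))
        {
            // Joint poses are in session (XR origin) space, so either
            // parent this object under the XR Origin or transform the
            // pose into world space before applying it.
            transform.SetLocalPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```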

It may be that I’m missing an AVP- or PolySpatial-specific requirement.

Everything else seems to be working fine: I can grab with the usual look/pinch, press buttons, etc. But attaching to or tracking a specific position is getting lost. This did appear to work at one stage, but for some reason it doesn’t now.

Are there any samples where attaching objects to follow a hand is working? (Like the hand menu, although that one is not working, of course.)

I believe there are examples of hand tracking in both the unbounded PolySpatial samples (e.g., the MixedReality scene) and the Metal samples (both require the XR Hands package, and use the Hand Visualizer).

Thanks. I have used that and will test with it again.

Is there any scripting define or other setting I need to ensure unbounded scenes correctly allow hand tracking?

The only ones that I’m aware of are in Project Settings → XR Plug-in Management → Apple visionOS: Initialize Hand Tracking on Startup and Hands Tracking Usage Description (required for the permissions dialog). Is your app asking for the hand tracking permission?

Thanks, the Hand Visualizer is working fine, so I think it was my script tracking the palm that was incorrect; I will fix it.
