In the various topics you mention XR Hands to leverage Vision Pro's hand tracking, and in the "Create immersive Unity apps" video from WWDC 2023 the Unity presenter talks about requesting permission to access hand-tracking data.
- how should we ask for permission to access hand tracking inside Unity, and
- how can we simulate hand tracking inside the Vision Pro simulator?
In the current set of packages, when you use the Hand Subsystem API the app will automatically trigger a permission request for hand tracking on first launch.
In future updates there will be a user-defined string that is shown to the user when hand-tracking data is requested. It will be triggered the same way (when the XR Hand Subsystem API is used).
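As a rough sketch of what "using the Hand Subsystem API" looks like in practice: locating and starting the `XRHandSubsystem` from the XR Hands package is the point at which hand data begins flowing, and on visionOS it is where the OS permission prompt can appear. This is an illustrative sketch, not official sample code:

```csharp
// Sketch: locating and starting the XRHandSubsystem (XR Hands package).
// On visionOS, starting the subsystem is what can trigger the
// hand-tracking permission prompt described above.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandTrackingStarter : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
        {
            m_Subsystem = subsystems[0];
            // Starting the subsystem begins per-frame hand updates.
            if (!m_Subsystem.running)
                m_Subsystem.Start();
        }
    }
}
```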
There is currently no way to simulate ARKit data in the visionOS simulator (hand tracking is part of the ARKit API).
The ARKit hand-tracking data will be accessed and come through in a very similar way as on other platforms supported by the XR Hands package. You could start building hand interactions on another platform.
Note: the joint layout and joint rotations in ARKit differ from other platforms.
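Because the XR Hands package exposes the same joint API on every supported platform, code that reads joint poses can be written once and tested on another device. A minimal sketch of reading a joint pose through the cross-platform API (the subsystem reference is assumed to come from setup code like the starter above):

```csharp
// Sketch: reading per-joint data through the cross-platform XR Hands API.
// Assumes m_Subsystem is an already-running XRHandSubsystem.
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandJointReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem; // assigned elsewhere

    void OnEnable()
    {
        if (m_Subsystem != null)
            m_Subsystem.updatedHands += OnUpdatedHands;
    }

    void OnUpdatedHands(XRHandSubsystem subsystem,
        XRHandSubsystem.UpdateSuccessFlags flags,
        XRHandSubsystem.UpdateType updateType)
    {
        var rightHand = subsystem.rightHand;
        if (!rightHand.isTracked)
            return;

        // Joint IDs are the same across platforms, even though the
        // underlying ARKit layout/rotations differ (see note above).
        var indexTip = rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Index tip position: {pose.position}");
    }
}
```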
Thanks for the info.
What about detecting a pinch gesture on Vision Pro?
I didn't see anything in XR Hands apart from the MetaSystemGestureDetector, which I suppose only works on Quest devices…
There is no gesture detection with ARKit hands, so you would need to calculate the pinch yourself (the distance between the index-tip joint and the thumb-tip joint). The MixedReality scene in the samples has an example of this.
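The distance check described above can be sketched roughly like this; the threshold value is an illustrative assumption to tune per experience, not an official constant, and the samples' own implementation may differ:

```csharp
// Sketch: a minimal pinch check, assuming an already-tracked XRHand
// from a running XRHandSubsystem. Threshold is an assumed value.
using UnityEngine;
using UnityEngine.XR.Hands;

public static class PinchDetector
{
    const float k_PinchThreshold = 0.02f; // meters; tune for your app

    public static bool IsPinching(XRHand hand)
    {
        var thumbTip = hand.GetJoint(XRHandJointID.ThumbTip);
        var indexTip = hand.GetJoint(XRHandJointID.IndexTip);

        // Only report a pinch when both joint poses are valid this frame.
        if (thumbTip.TryGetPose(out Pose thumbPose) &&
            indexTip.TryGetPose(out Pose indexPose))
        {
            return Vector3.Distance(thumbPose.position, indexPose.position)
                   < k_PinchThreshold;
        }
        return false;
    }
}
```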
If you want to use the system/OS pinch gesture, you can use the Input System package with the PolySpatial world-touch input, which contains an Interaction Kind property. This property lets you isolate interactions to a specific kind, such as a direct touch versus an indirect (gaze + pinch) interaction.
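A sketch of filtering by interaction kind, based on the pattern in the PolySpatial input samples — the exact type and member names here (`EnhancedSpatialPointerSupport`, `SpatialPointerState`, `Kind`, `SpatialPointerKind.IndirectPinch`) are assumptions that may vary by PolySpatial version, so verify them against your installed package:

```csharp
// Sketch (assumed API names, verify against your PolySpatial version):
// reading the system pinch via Input System enhanced touch + PolySpatial.
using Unity.PolySpatial.InputDevices;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

public class PinchInputReader : MonoBehaviour
{
    void OnEnable() => EnhancedTouchSupport.Enable();

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            SpatialPointerState state =
                EnhancedSpatialPointerSupport.GetPointerState(touch);

            // The Interaction Kind property isolates the OS pinch from
            // direct touches.
            if (state.Kind == SpatialPointerKind.IndirectPinch &&
                touch.phase == TouchPhase.Began)
            {
                Debug.Log($"Pinch began on {state.targetObject?.name}");
            }
        }
    }
}
```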
[quote=“DanMillerU3D, post:2, topic:286717”]
You could start building hand interactions on another platform.
[/quote]
Regarding "start building and test on another platform" (device?): what framework should be used for that? The Unity XR Interaction Toolkit?