Has anyone been able to get XR Hands gesture detection to work on device?
It works on Quest, but we can't seem to get any of the detection to work on Vision Pro.
We did notice that in the OpenXR project settings for Vision Pro there is no option to enable the Hand Tracking or Hand Interaction feature group.
For sure, apologies for not including this earlier.
Unity 2022.3.17f1
com.unity.xr.visionos 0.7.1
XR Hands 1.4.0-pre.1
Specifically, the gesture detection for hand poses in the Hands package does not appear to work for any poses on device, even when just using the Gesture scene from the samples in that package.
Hey there! We’re still looking into this. If you’re on 0.7.1 you should have the right rotations, and that’s what we were using for our test. So far it seems like some gestures, such as thumbs up, are working, as long as all of your fingers are visible. It seems like on Meta/OpenXR, where a lot of the default gestures were authored, the platform does a fair amount of “guessing” about joint positions, whereas visionOS immediately reports occluded joints as not tracked. We’ll have to come up with a solution to smooth this out in a future update of the Hands package. In the meantime, it may be fruitful to try to define your own gestures that work on the platform. You should be able to use the gesture debugger in a device build to test the default gestures and see why they aren’t firing, as well as to define your own.
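If it’s easier to check from script than in the debugger UI, a minimal sketch along these lines will log which joints are reporting no tracked pose on device. It only uses the public XRHandSubsystem API; the component name is just for illustration:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandJointTrackingLogger : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Grab the hand subsystem once it's available.
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;

            m_Subsystem = subsystems[0];
        }

        LogUntrackedJoints(m_Subsystem.leftHand);
        LogUntrackedJoints(m_Subsystem.rightHand);
    }

    static void LogUntrackedJoints(XRHand hand)
    {
        if (!hand.isTracked)
            return;

        for (var id = XRHandJointID.BeginMarker; id < XRHandJointID.EndMarker; id++)
        {
            var joint = hand.GetJoint(id);

            // A gesture can't match if any joint it depends on lacks pose
            // tracking; on visionOS, occluded joints drop this flag immediately.
            if ((joint.trackingState & XRHandJointTrackingState.Pose) == 0)
                Debug.Log($"{hand.handedness} {id} has no tracked pose");
        }
    }
}
```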
Howdy! I ended up finding a workaround that I shared on another thread just now. It turns out that if you just ignore the tracking state from ARKit, there are estimated poses hiding behind the curtain! I haven’t tested the gesture recognition stuff with this yet, but simply modifying VisionOSHandProvider to always report valid tracking for each joint gives me a pretty reasonable pose for each one, even if it is not visible. We’ll be pushing an update soon to take advantage of these poses, but in the meantime you may be able to try this as a temporary fix.
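For reference, the shape of the change is roughly the following. This is a hedged sketch rather than the actual package source: it assumes the provider builds its joints through XRHandProviderUtility.CreateJoint, and ForcedTrackingJoints / CreateAlwaysTrackedJoint are hypothetical names for illustration:

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;
using UnityEngine.XR.Hands.ProviderImplementation;

// Hypothetical helper showing the shape of the workaround: build the joint
// with the pose reported as tracked, regardless of ARKit's per-joint flag,
// since visionOS still supplies an estimated pose for occluded joints.
static class ForcedTrackingJoints
{
    public static XRHandJoint CreateAlwaysTrackedJoint(
        Handedness handedness, XRHandJointID id, Pose pose)
    {
        // Instead of passing XRHandJointTrackingState.None when ARKit says
        // the joint isn't visible, always claim the pose is valid.
        return XRHandProviderUtility.CreateJoint(
            handedness,
            XRHandJointTrackingState.Pose,
            id,
            pose);
    }
}
```

Since this means editing VisionOSHandProvider itself, you’ll likely need to embed the com.unity.xr.visionos package in your project so the change isn’t overwritten by the package cache.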