For testing XR hands in Polyspatial, I’m using Oculus Link on PC. In this case, the only available provider is OpenXR, which is not supported on visionOS. I’m assuming that on the Mac I need to explicitly connect an alternative hand tracking provider. I have an Apple Labs test in a couple of days and would like to ensure I have it set up correctly (even though I can’t test in the sim). Any input appreciated.
Did you find a solution ? If not, I can help you on this. I was able to test hand tracking on device during my Lab session in London today…
Yes please! Be great to get some feedback from your tests.
Ok, so what you need for hand tracking on visionOS is to use the Unity XR Hands package directly (`"com.unity.xr.hands": "1.3.0"`).
There is already a post with some code for customizing the data retrieved by the Unity XR Hands package here: Could not find Hand Subsystem - #2 by puddle_mike, or you can use the samples and hand assets from the package directly for a quick test…
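If you don't want to pull in the full sample, a minimal sketch of reading joints from the XR Hands package looks like this (class and field names here are my own; `XRHandSubsystem`, `GetJoint`, and `TryGetPose` are the package's API):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandJointReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem (OpenXR on PC, visionOS on device).
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        var leftHand = m_Subsystem.leftHand;
        if (leftHand.isTracked)
        {
            var joint = leftHand.GetJoint(XRHandJointID.IndexTip);
            if (joint.TryGetPose(out Pose pose))
                Debug.Log($"Left index tip: {pose.position}");
        }
    }
}
```

Because the same subsystem interface is used on every provider, this code doesn't change when you retarget from Quest Link to visionOS.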
As you can’t really test hand tracking in the simulator, you can set up your project for Android (and the Editor, using Quest Link) with the Oculus XR plugin + OpenXR plugin to confirm that hand tracking via the Unity XR Hands package works, and then when you switch your target to visionOS it should work directly.
The only remaining concern is that on visionOS you will probably need to add a rotation per bone to get exactly the same result as on Quest/OpenXR.
I had to apply these rotations to the transform.rotation of each bone I got from XR Hands:
```csharp
r *= Quaternion.Euler(0f, -180f, 0f); // left hand
r *= Quaternion.Euler(0f, 180f, 0f);  // right hand
```
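In context, a sketch of where that correction goes when driving bone transforms from XR Hands (the `jointTransform` field and the `#if UNITY_VISIONOS` guard are my own; `Handedness` and `TryGetPose` are from the package):

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

public static class VisionOSJointFixup
{
    // Applies a joint pose to a bone transform, compensating for the
    // 180° Y flip seen on visionOS with XR Hands 0.6.x.
    public static void Apply(XRHandJoint joint, Handedness handedness, Transform jointTransform)
    {
        if (!joint.TryGetPose(out Pose pose))
            return;

        var rotation = pose.rotation;
#if UNITY_VISIONOS
        rotation *= Quaternion.Euler(0f, handedness == Handedness.Left ? -180f : 180f, 0f);
#endif
        jointTransform.SetPositionAndRotation(pose.position, rotation);
    }
}
```

Guarding the correction with the platform define keeps the same script working unchanged under Quest Link, where no fixup is needed.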
Apparently, Unity is aware of this difference in behavior between the visionOS platform and other platforms and will fix it, but with the 0.6.2 package you still need to add those rotations…
Hope this helps !
That certainly does help. Really appreciate the input.
I had done almost all of those things, using XR Hands (which tested fine with OpenXR and Link), but omitted the final transform. For the Lab test I should also have put a visible mesh where I had a collider on the index fingers; that would have given me a clue about the transform required.
Great to know that it works out of the box - just requiring the final transform.
Thanks again!
Hey there! Just to close the loop on this one, we did indeed ship a solution for this in our 0.7.1 version a few weeks ago. Joint rotations should now be rotated as described above when accessing them through the existing API. There is also a TryGetVisionOSRotation extension method to get the “raw” rotation defined by the platform, if you need it.
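For anyone landing here later, a sketch of reading the raw platform rotation via that extension method (the surrounding snippet is my own; I'm assuming the extension lives in the visionOS package's `UnityEngine.XR.VisionOS` namespace):

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;
using UnityEngine.XR.VisionOS; // assumed namespace for the extension method

public static class RawRotationExample
{
    public static void Inspect(XRHandJoint joint)
    {
        // As of 0.7.1, TryGetPose already returns the corrected rotation.
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Corrected rotation: {pose.rotation}");

        // TryGetVisionOSRotation exposes the "raw" platform-defined
        // rotation, without the OpenXR-alignment correction.
        if (joint.TryGetVisionOSRotation(out Quaternion rawRotation))
            Debug.Log($"Raw visionOS rotation: {rawRotation}");
    }
}
```

So with 0.7.1+ the manual `Quaternion.Euler` fixup described earlier in the thread should no longer be needed.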
Excellent! Thanks for the update.