I would like to work with certain ARKit features, especially standard hand tracking.
What would be the best way to simulate this in the editor and in the simulator if possible?
My goal is to create a shader similar to the MRTK Standard Shader, which reacts to the presence of hands (a sort of highlight that gets stronger the closer your hand gets). Tips?
Currently there is no good approach for simulating hand tracking in Apple's visionOS simulator. To simulate hand tracking in the editor, on the other hand, I would suggest taking a look at XRI's device simulator (XR Device Simulator overview | XR Interaction Toolkit | 2.5.2), where you can simulate hand tracking and some poses in a Unity scene.
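For the proximity-highlight part, one common pattern is to drive a shader property from a tracked hand joint. Here's a rough sketch using the XR Hands package (which both the device and the XR Device Simulator can feed); note that the property name `_HandPosition`, the `_Radius` idea, and the script/material wiring are my own assumptions, not anything official, so adapt them to your shader:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Hypothetical sketch: push the right hand's index-tip position into a
// material property every frame so a shader can fade in a highlight
// based on distance. Assumes the com.unity.xr.hands package is installed
// and a hand subsystem is running.
public class HandProximityDriver : MonoBehaviour
{
    [SerializeField] Material targetMaterial; // material using your proximity shader

    XRHandSubsystem handSubsystem;
    static readonly List<XRHandSubsystem> subsystems = new List<XRHandSubsystem>();

    void Update()
    {
        if (handSubsystem == null)
        {
            // Grab the first running hand subsystem, if any.
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0) return;
            handSubsystem = subsystems[0];
        }

        var hand = handSubsystem.rightHand;
        if (!hand.isTracked) return;

        var joint = hand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
        {
            // Shader side could then compute something like:
            // saturate(1 - distance(worldPos, _HandPosition) / _Radius)
            targetMaterial.SetVector("_HandPosition", pose.position);
        }
    }
}
```

On the shader side, a fragment that lerps toward a highlight color by `saturate(1 - distance(worldPos, _HandPosition) / _Radius)` gets you something in the spirit of MRTK's hover lights.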
Hope this helps
Thanks! I’ll look into that, once my license is working again…
Interesting. Is there also a way to test the other ARKit features? Our app relies on them, especially placing world anchors at arbitrary locations, not just on walls or desks. I would like to know whether this actually works on the device.