Hello,
I’m porting a VR app to visionOS and so far everything is fine. I ported the “old” laser ray, but it’s pretty much unusable with hands, so I want to use eye tracking + pinch to select and click on UI elements.
I don’t use XRI and don’t want to. I’m looking for the low-level way, please.
I noticed that the Input System has events for that, but how do I get where the eyes are looking? Is 3D touch usable in VR?
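
Here is roughly what I’m experimenting with, in case it clarifies the question. It’s a minimal sketch assuming the spatial pointer device from the com.unity.xr.visionos package is the right thing to read in VR (fully immersive) mode; the binding path, state struct, and phase enum names are my guesses from the package sample, and from what I’ve read the gaze ray is only delivered at the moment the pinch begins, not continuously.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
// Assumption: these types come from the com.unity.xr.visionos package (VR mode).
using UnityEngine.XR.VisionOS.InputDevices;

public class GazePinchSelector : MonoBehaviour
{
    // Action bound to the spatial pointer; the binding path is my guess from the package sample.
    InputAction m_PointerAction;

    void OnEnable()
    {
        m_PointerAction = new InputAction(
            type: InputActionType.PassThrough,
            binding: "<VisionOSSpatialPointerDevice>/primarySpatialPointer");
        m_PointerAction.Enable();
    }

    void OnDisable() => m_PointerAction.Disable();

    void Update()
    {
        // The state struct (name assumed) carries the gaze ray captured when the pinch began.
        var state = m_PointerAction.ReadValue<VisionOSSpatialPointerState>();
        if (state.phase != VisionOSSpatialPointerPhase.Began)
            return;

        // Ray from the eyes toward what the user was looking at when the pinch started
        // (may need transforming into world space via the XR Origin).
        var gazeRay = new Ray(state.startRayOrigin, state.startRayDirection);
        if (Physics.Raycast(gazeRay, out var hit, 10f))
        {
            // Forward the "click" to whatever UI element the ray hit.
            Debug.Log($"Pinched while looking at {hit.collider.name}");
        }
    }
}
```

Am I on the right track with this, or is there a lower-level way to get the gaze ray and the pinch events?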