Hello, I am a beginner developer building an experimental XR environment in Unity.
Currently, I am setting up an XR exploration environment for the Meta Quest Pro. As part of this project, I want to link the interface to the VR camera so that the interface always follows the camera. The user will navigate the AR space while interacting with the interface, and to measure the user's walking data, I plan to place invisible virtual tiles along the user's path.
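Here is a rough, untested sketch of what I have in mind for the camera-following interface and the invisible walking tiles. The class names, field names, and the "Player" tag are placeholders I made up; I am assuming the interface is a world-space panel and that the tiles have trigger colliders:

```csharp
using UnityEngine;

// Rough sketch: keeps a world-space UI panel in front of the headset camera.
// "cameraTransform" would be the headset camera (e.g. the centre-eye anchor of
// the camera rig); all names here are my own placeholders.
public class FollowHeadUI : MonoBehaviour
{
    public Transform cameraTransform;   // assign the headset camera in the Inspector
    public float distance = 1.5f;       // metres in front of the user
    public float followSpeed = 5f;      // smoothing factor

    void LateUpdate()
    {
        if (cameraTransform == null) return;

        // Target position: a point in front of the camera at eye height.
        Vector3 targetPos = cameraTransform.position + cameraTransform.forward * distance;

        // Smoothly move and rotate the panel so it always faces the user.
        transform.position = Vector3.Lerp(transform.position, targetPos, followSpeed * Time.deltaTime);
        transform.rotation = Quaternion.LookRotation(transform.position - cameraTransform.position);
    }
}

// Rough sketch: an invisible tile that records when the user walks over it.
// The tile would need a collider marked "Is Trigger", and the camera rig (or a
// tracked "feet" object) would need a Rigidbody + collider tagged "Player".
public class WalkTile : MonoBehaviour
{
    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
        {
            // Log the time and tile position as a simple walking-data sample.
            Debug.Log($"Tile {name} entered at {Time.time:F2}s, position {transform.position}");
        }
    }
}
```

My thinking is to smooth-follow the camera in LateUpdate rather than parenting the panel to it, so it does not jitter with every head movement, but I am not sure whether that is the recommended approach.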
However, I anticipate encountering a few challenges during development.
Here are the key components I am trying to implement:
I want to enable interaction with the interface using hand gestures while simultaneously tracking the user’s gaze. Specifically, I aim to determine whether the user is looking at the interface or at the virtual tiles.
I have two questions regarding this:
- Is it possible to use hand interactions and eye tracking simultaneously?
- Is it feasible to use tagging to determine what the user's gaze is landing on (see the rough sketch after these questions for what I mean)? Additionally, if you have other ideas or recommendations for achieving this, I would greatly appreciate your insights.
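To show what I mean by "tagging", here is the kind of approach I was imagining (an untested sketch). I am assuming I can get a transform whose forward vector follows the eye direction, for example one driven by the Meta XR SDK's OVREyeGaze component, though I am not sure of the exact setup; the "Interface" and "WalkTile" tags are ones I would create myself in the Tag Manager:

```csharp
using UnityEngine;

// Rough sketch of tag-based gaze detection (untested).
// "gazeOrigin" is assumed to be a transform whose forward vector follows the
// user's eyes; the tags below are placeholders I would define myself.
public class GazeTagTracker : MonoBehaviour
{
    public Transform gazeOrigin;     // eye-gaze transform (assumption)
    public float maxDistance = 10f;  // how far the gaze ray is cast

    void Update()
    {
        if (gazeOrigin == null) return;

        // Cast a ray from the eyes along the gaze direction.
        Ray gazeRay = new Ray(gazeOrigin.position, gazeOrigin.forward);

        if (Physics.Raycast(gazeRay, out RaycastHit hit, maxDistance))
        {
            if (hit.collider.CompareTag("Interface"))
            {
                Debug.Log($"Looking at interface: {hit.collider.name}");
            }
            else if (hit.collider.CompareTag("WalkTile"))
            {
                Debug.Log($"Looking at tile: {hit.collider.name} at {hit.point}");
            }
        }
    }
}
```

One thing I am unsure about is that Physics.Raycast only hits objects with colliders, so I would probably need to add a collider to the interface panel, and keep the invisible tiles' trigger colliders on a layer the gaze ray can still hit. Is that a reasonable way to do it, or is there a better pattern?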
Thank you so much for your time and help!