I would like to implement a UI canvas in world space that sits slightly in front of and above my main camera. I have made the canvas a child of the main camera so that it follows the camera around.
I want the canvas to appear when the player looks up (with their eyes, not by tilting the whole headset) and disappear otherwise. I am using the Meta Quest Pro headset, which already provides eye tracking data.
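For context, this is roughly the behaviour I'm after. The sketch below assumes I already had a gaze direction from the eye tracker (the part I don't know how to get yet) and just shows how I'd use it to show/hide the canvas; the field names and the 15-degree threshold are placeholders I made up:

```csharp
using UnityEngine;

public class LookUpCanvasToggle : MonoBehaviour
{
    [SerializeField] Transform head;          // the main camera / centre-eye transform
    [SerializeField] GameObject canvasRoot;   // the world-space canvas
    [SerializeField] float lookUpAngle = 15f; // degrees above head-forward before the canvas appears

    // I'd call this every frame with the eye gaze direction in world space.
    public void UpdateGaze(Vector3 gazeDirectionWorld)
    {
        // Convert the gaze direction into the head's local space, so that
        // "looking up" means up relative to the headset, not up in the world.
        Vector3 localGaze = head.InverseTransformDirection(gazeDirectionWorld).normalized;

        // Pitch of the gaze above the head's forward axis, in degrees.
        float pitch = Mathf.Asin(Mathf.Clamp(localGaze.y, -1f, 1f)) * Mathf.Rad2Deg;

        canvasRoot.SetActive(pitch > lookUpAngle);
    }
}
```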
Is there a way for me to pull this eye tracking data into Unity, and what would the API for that be? My project is set up with OpenXR and the XR Interaction Toolkit, meaning I am using the XR Rig to set up my VR camera.
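The closest thing I've found so far is the OpenXR "Eye Gaze Interaction Profile" feature (enabled under Project Settings > XR Plug-in Management > OpenXR), which seems to expose an EyeGaze device to the new Input System. Below is my guess at how I'd read it; the binding paths and the ReadValue calls are assumptions on my part, so please correct me if this isn't the right API for the Quest Pro:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class EyeGazeReader : MonoBehaviour
{
    // My assumption: with the Eye Gaze Interaction Profile enabled, an
    // <EyeGaze> device with a pose control shows up in the Input System.
    InputAction gazeRotation;
    InputAction gazeTracked;

    void OnEnable()
    {
        gazeRotation = new InputAction(binding: "<EyeGaze>/pose/rotation");
        gazeTracked  = new InputAction(binding: "<EyeGaze>/pose/isTracked");
        gazeRotation.Enable();
        gazeTracked.Enable();
    }

    void OnDisable()
    {
        gazeRotation.Disable();
        gazeTracked.Disable();
    }

    void Update()
    {
        // Skip frames where the eye tracker reports no valid data.
        if (gazeTracked.ReadValue<float>() < 0.5f)
            return;

        // Eye gaze rotation (in XR rig / tracking space, as far as I understand).
        Quaternion rotation = gazeRotation.ReadValue<Quaternion>();
        Vector3 gazeDirection = rotation * Vector3.forward;

        // I'd then transform this by the XR Rig and feed it into the
        // canvas-toggle logic above -- is this the right way to get the data?
        Debug.Log($"Eye gaze direction (tracking space): {gazeDirection}");
    }
}
```

If the Input System route isn't the intended one, I've also seen mentions of an OVREyeGaze component in Meta's own integration, but I'm not sure how that fits with an OpenXR + XR Interaction Toolkit setup.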
I am still new to coding/Unity, so any help would be greatly appreciated!!