Oculus Quest Hand Tracking and Unity UI

So, I would really like to have a way to interact with Unity UI. The Oculus documentation says to use the PointerPose transform exposed by OVRHand.cs as the pointer. Raycast and click, that's all I need. I tried making my own HandTrackingRaycaster to use with a Canvas, but it doesn't seem to be working correctly. Does anyone have experience with custom raycasters?

I put colliders on every UI interactable, on a separate layer, and then just do a physics raycast against them using the matching layer mask. Working fine so far.
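For reference, here's a minimal sketch of that approach, assuming the Oculus Integration's OVRHand component. The "UIInteractable" layer name, the ray length, and treating an index-finger pinch as the click are my own placeholders, not anything from the SDK:

```csharp
using UnityEngine;

// Physics-based approach: colliders on each UI interactable, on their own layer,
// hit with a plain Physics.Raycast fired from the hand's pointer pose.
public class HandUIPhysicsRaycaster : MonoBehaviour
{
    public OVRHand hand;            // assign the tracked hand in the Inspector
    public float maxDistance = 5f;  // arbitrary ray length
    private LayerMask uiLayerMask;

    void Start()
    {
        uiLayerMask = LayerMask.GetMask("UIInteractable"); // hypothetical layer name
    }

    void Update()
    {
        if (hand == null || !hand.IsTracked)
            return;

        Transform pointer = hand.PointerPose;
        if (Physics.Raycast(pointer.position, pointer.forward, out RaycastHit hit, maxDistance, uiLayerMask))
        {
            // Treat an index-finger pinch as a "click" on whatever collider we hit.
            if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
            {
                Debug.Log($"Pinch-clicked {hit.collider.name}");
                // e.g. hit.collider.GetComponent<UnityEngine.UI.Button>()?.onClick.Invoke();
            }
        }
    }
}
```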

Hmm, might have to just go down that route. It's not for a super complicated UI, so I guess it would work. But, you know, it seems a little odd to have to use the physics system for all the different UI interactables (sliders etc.). Thank you.

Figured it out. On startup, find the OVRInputModule and set its rayTransform to OVRHand.PointerPose. Also find the OVRRaycaster and set its pointer to OVRHand.PointerPose. Now you can interact with the UI.
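For anyone landing here later, this is roughly what that looks like in code. The field names (rayTransform on OVRInputModule, pointer on OVRRaycaster) match the Oculus Integration versions I've seen, where pointer is a GameObject, so the sketch assigns PointerPose.gameObject; double-check against your SDK version:

```csharp
using UnityEngine;
using UnityEngine.EventSystems; // OVRInputModule is declared in this namespace in the Oculus Integration

// On startup, point the existing OVRInputModule and OVRRaycaster at the hand's PointerPose.
public class HandPointerSetup : MonoBehaviour
{
    public OVRHand hand;               // the hand whose PointerPose should drive the UI ray
    public OVRInputModule inputModule; // on your EventSystem
    public OVRRaycaster raycaster;     // on your world-space Canvas

    void Start()
    {
        // Fall back to searching the scene if the references weren't assigned in the Inspector.
        if (inputModule == null) inputModule = FindObjectOfType<OVRInputModule>();
        if (raycaster == null) raycaster = FindObjectOfType<OVRRaycaster>();

        inputModule.rayTransform = hand.PointerPose;
        raycaster.pointer = hand.PointerPose.gameObject; // OVRRaycaster.pointer is a GameObject in the versions I've used
    }
}
```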

Please, could you explain how to access OVRHand.PointerPose from the OVRHand.cs script? It doesn't work for me…

If you have the answer, I am also interested…

@KevPan @unity_v_goq8ro7aYWyQ I'm not sure what to say. OVRHand.cs has a public Transform called PointerPose for me. That's what I use.
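A bare-bones way to read it, assuming the OVRHand component sits on your hand prefab (e.g. OVRHandPrefab) and you drag that object into the field in the Inspector:

```csharp
using UnityEngine;

// Minimal example of reading OVRHand.PointerPose from another script.
public class PointerPoseReader : MonoBehaviour
{
    public OVRHand hand; // drag the object that carries the OVRHand component here

    void Update()
    {
        if (hand != null && hand.IsPointerPoseValid)
        {
            Transform pose = hand.PointerPose; // public Transform on OVRHand.cs
            Debug.DrawRay(pose.position, pose.forward * 2f, Color.green);
        }
    }
}
```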

This works for one hand, right? That way you can only interact with the UI using the hand whose PointerPose you set on the raycaster and input module?

Has anyone done it so that you can use both hands, like in the system UI?
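Nothing confirmed here, but one way I could imagine doing it is keeping a single OVRInputModule/OVRRaycaster and swapping the ray transform each frame to whichever hand currently has a valid pointer pose, preferring the one that is pinching. Field names follow the earlier post; treat this as a sketch only:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Swap the UI ray between hands at runtime instead of binding it to one hand at startup.
public class EitherHandPointer : MonoBehaviour
{
    public OVRHand leftHand;
    public OVRHand rightHand;
    public OVRInputModule inputModule;
    public OVRRaycaster raycaster;

    void Update()
    {
        OVRHand active = PickActiveHand();
        if (active == null) return;

        inputModule.rayTransform = active.PointerPose;
        raycaster.pointer = active.PointerPose.gameObject;
    }

    OVRHand PickActiveHand()
    {
        bool leftOk = leftHand != null && leftHand.IsPointerPoseValid;
        bool rightOk = rightHand != null && rightHand.IsPointerPoseValid;

        // Prefer a hand that is actively pinching; otherwise fall back to any valid one.
        if (rightOk && rightHand.GetFingerIsPinching(OVRHand.HandFinger.Index)) return rightHand;
        if (leftOk && leftHand.GetFingerIsPinching(OVRHand.HandFinger.Index)) return leftHand;
        return rightOk ? rightHand : (leftOk ? leftHand : null);
    }
}
```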