How to: Oculus Quest Hand Tracking PointerPose Pinch

Install
https://developer.oculus.com/downloads/package/unity-integration/31.2
Version 32 is broken, from what I can tell

Add the OVRCameraRig prefab > find the Hand Tracking Support enum on it (Controllers Only, Controllers And Hands, Hands Only) and pick Controllers And Hands

OVRCameraRig in scene > drill down to the RightHandAnchor and add the OVRHandPrefab
In the prefab's components, change all the hand enums to Right Hand if you're setting up the right hand
Enable the Mesh Renderer and Skinned Mesh Renderer if they're off. You can also set up a custom hand mesh, but that's another guide

SUPER make sure the OVRCameraRig transform is reset: position 0,0,0 / rotation 0,0,0 / scale 1,1,1
EVERYTHING breaks otherwise, and there is no way to move it by default for now (optional sanity check below)
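
If you're paranoid about it, here's a tiny optional sanity-check script (my own helper, not part of the SDK) you can drop on the rig:

using UnityEngine;

// Hypothetical helper, not part of Oculus Integration: warns if the rig transform was moved.
public class RigTransformCheck : MonoBehaviour
{
    void Start()
    {
        if (transform.position != Vector3.zero ||
            transform.rotation != Quaternion.identity ||
            transform.localScale != Vector3.one)
        {
            Debug.LogWarning("OVRCameraRig transform is not reset, hand pointer poses will be offset");
        }
    }
}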

Add a UI Button to the scene, set its Canvas to World Space, and futz with it till you get it scaled down and into the camera's view
roughly width 181.4188 : height 51 : scale 0.02755119
button position x,y at 0
On the world-space Canvas, add an OVRRaycaster (stupid name, should be raycatcher)
Turn off the Graphic Raycaster

Add the UIHelpers prefab to the scene
UIHelpers > HandedInputSelector: turn this off, in its script it keeps forcing the EventSystem to the wrong transform (see the sketch below)
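
For reference, the offending sample logic is roughly this (a from-memory sketch of HandedInputSelector, not verbatim, so check your copy):

// Sketch of HandedInputSelector.Update(): every frame it re-points the
// input module's rayTransform at a controller anchor, which would
// overwrite the PointerPose transform we assign later.
void Update()
{
    if (OVRInput.GetActiveController() == OVRInput.Controller.LTouch)
        SetActiveController(OVRInput.Controller.LTouch);  // rayTransform = left hand anchor
    else
        SetActiveController(OVRInput.Controller.RTouch);  // rayTransform = right hand anchor
}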

Patch the OVRHand.cs file
Find private GameObject _pointerPoseGO; and make it public GameObject _pointerPoseGO;
Go down to Awake and add, right after the new GameObject() line:
_pointerPoseGO.name = $"{nameof(PointerPose)}{HandType}";
The name is not necessary, but it super helps with debugging in editor mode
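
For context, the patched area of OVRHand.cs ends up looking roughly like this (a sketch, the exact Awake body varies by SDK version):

public GameObject _pointerPoseGO;   // was: private GameObject _pointerPoseGO;

private void Awake()
{
    // ... existing setup ...
    _pointerPoseGO = new GameObject();
    _pointerPoseGO.name = $"{nameof(PointerPose)}{HandType}";  // the added line
    PointerPose = _pointerPoseGO.transform;
    // ... rest of Awake ...
}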

Make a script roughly like this:

using UnityEngine;
using UnityEngine.EventSystems; // OVRInputModule is declared inside this namespace

public class HandPointerLike : MonoBehaviour
{
    public OVRInputModule _OVRInputModule;
    public OVRRaycaster _OVRRaycaster;
    public OVRHand _OVRHand;

    void Start()
    {
        // Point the UI input module and the canvas raycaster at the hand's PointerPose
        _OVRInputModule.rayTransform = _OVRHand.PointerPose;
        _OVRRaycaster.pointer = _OVRHand.PointerPose.gameObject;
    }

    void Update()
    {
        // Re-assign every frame in case something else stomps on it
        _OVRInputModule.rayTransform = _OVRHand.PointerPose;
        _OVRRaycaster.pointer = _OVRHand.PointerPose.gameObject;
    }
}

Link up _OVRInputModule from UIHelpers > EventSystem
Link up _OVRRaycaster from the world-space Canvas object you made earlier
Link up _OVRHand from OVRCameraRig > … OVRHandPrefab
Turn on UIHelpers > LaserPointer > Line Renderer if it's off

Hit Play and select the EventSystem; check whether its rayTransform property is pointing to the PointerPose object
If not, flip a table and fix something

If it works, build to the device and test. The laser should extend from roughly your palm out into your viewable space, and pinching seems to act as a press by default (I don't know where to turn that off yet when it's not needed). The button should react with a color change.
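
If you want to read the pinch yourself (say, to gate when it counts as a press), OVRHand exposes it directly; a minimal sketch (PinchLogger is just my example name):

using UnityEngine;

public class PinchLogger : MonoBehaviour
{
    public OVRHand hand;

    void Update()
    {
        // Polls the index-finger pinch state each frame
        if (hand.GetFingerIsPinching(OVRHand.HandFinger.Index))
        {
            Debug.Log($"Index pinch, strength {hand.GetFingerPinchStrength(OVRHand.HandFinger.Index)}");
        }
    }
}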

The key to this is setting the two lines of code above. They feel like a total hack, but most of Oculus Integration feels that way already. The examples are messy, old code; there are two forms of Hands and no good naming distinction between them. Getting a single bone by id does not seem to be a thing, so you have to loop with the enums (sketch below).
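
For example, the single-bone lookup ends up as a loop like this (a sketch; FindBone is my name for it and Hand_IndexTip is just an example id):

// Find one bone's transform by looping over OVRSkeleton's bone list
Transform FindBone(OVRSkeleton skeleton, OVRSkeleton.BoneId id)
{
    foreach (var bone in skeleton.Bones)
    {
        if (bone.Id == id)
            return bone.Transform;
    }
    return null;
}

// e.g. FindBone(mySkeleton, OVRSkeleton.BoneId.Hand_IndexTip)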

Anyway, it's been a year and there are ZERO tutorials on how to do this, just two notes in this forum and on Oculus's, which is a wasteland. The one example in the folder is the HandsInteractionTrainScene, which is a bit more convoluted to sift through.

fun

Thank you very much!
I followed your guide and everything works, but I noticed that I can click a button only by pinching with the right hand (even if the raycast is on the left hand). Is that normal? I don't know how to fix it.

Thank you, this helped me today.

I had to do two more things though:

  1. Delete the EventSystem that gets automatically created with the Canvas.
  2. Under UIHelpers, on LaserPointer game object, enable the LineRenderer component.

One thing to note: if OVRInputModule gives you an error, you need to add the namespace "namespace UnityEngine.EventSystems {" and close it at the end of your class. Add it before public class HandPointerLike : MonoBehaviour.

supernamey923834: It doesn't work.