UI Interaction issue with Hand tracking on Quest II

I’m trying to test the UI interaction on the Quest II with hand tracking and ray casting.
I’ve set up a very basic scene with all the default assets (OVRCameraRig, UIHelper) and added just a button to test the UI interaction.
This is what my scene looks like:

The issue I have is that when I run the scene, the ray is rotated 90 degrees and attached to the wrist for some reason. I made a video here to show what I mean:

It’s still interacting with the UI though.
Then after watching some online tutorials, I commented out these lines in the HandInputSelector.cs, which was attached to the UIHelper:

void SetActiveController(OVRInput.Controller c)
{
    /*
    Transform t;
    if (c == OVRInput.Controller.LTouch)
    {
        t = m_CameraRig.leftHandAnchor;
    }
    else
    {
        t = m_CameraRig.rightHandAnchor;
    }
    m_InputModule.rayTransform = t;
    */
}

and instead added a second script to the UIHelper, with only these lines:

public OVRHand hand;
public OVRInputModule inputModule;

private void Start()
{
    inputModule.rayTransform = hand.PointerPose;
}

Now the ray is at least attached to the correct position, but it still doesn’t rotate properly with the hand movement. I made another video of it here:
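One workaround I’ve been sketching out, assuming the problem is just a fixed rotation offset on PointerPose (which seems plausible given that the ray is consistently off rather than jittering): drive the ray from a proxy transform that copies PointerPose every frame and applies a corrective rotation. The class name `HandRayCorrector`, the `rotationOffset` field, and the 90-degree value are my own guesses, not anything from the SDK docs, so treat this as a sketch rather than a confirmed fix:

```csharp
using UnityEngine;

// Hypothetical workaround: attach this to an empty GameObject and leave the
// OVRInputModule's ray transform pointing at this object instead of directly
// at hand.PointerPose. Each frame it mirrors PointerPose and applies a
// fixed corrective rotation that you can tune in the Inspector.
public class HandRayCorrector : MonoBehaviour
{
    public OVRHand hand;               // OVRHand component on the hand anchor
    public OVRInputModule inputModule; // from the UIHelper prefab

    // Guessed offset; adjust per backend until the ray lines up with the hand.
    public Vector3 rotationOffset = new Vector3(90f, 0f, 0f);

    private void Start()
    {
        // Point the UI input module at our proxy transform once.
        inputModule.rayTransform = transform;
    }

    private void Update()
    {
        if (hand == null || hand.PointerPose == null)
            return;

        // Follow the tracked pointer pose, then apply the corrective twist.
        transform.position = hand.PointerPose.position;
        transform.rotation = hand.PointerPose.rotation * Quaternion.Euler(rotationOffset);
    }
}
```

If the reply below about OpenXR vs. Oculus is the real cause, the proper fix is the plugin switch and this proxy becomes unnecessary, but it may serve as a stopgap.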

My Unity version is 2021.3.1f1
Can someone please tell me what I’m doing wrong?

Thank you.

Anyone?
I’d appreciate any hint to point me in the right direction.

I’m having the same issue, only mine is rotated up. If I switch the plug-in provider from OpenXR to Oculus, it points in the right direction. Does anyone have any ideas how to fix this?


I’m beginning to wonder if this is just a bug in this version of Unity and the Oculus SDK. Does no one else have these issues?