Simulate mouse click with hand gestures using Oculus Quest 2

Hello everyone, I’m having a tricky situation with a control system I’m working on using Hand Tracking. I’ve set up a way to detect hand gestures, plus a Finite State Machine that can detect when we enter a state and when we exit it. I built a pointer that can be controlled by orienting your hand toward something; if you perform a gesture while pointing at an object, you can make things happen. So everything is cool there. The real problem came up when I started working on a way to make these pointers work with Unity UI. I cast two rays, one for Physics (meshes and everything that can be physically interacted with) and one for Graphics (UI), so I’m able to detect whether I’m hovering over something, but for the life of me I can’t figure out how to properly simulate a mouse click when I make a specific gesture. The reason I need this is that things like sliders don’t work if I try something like this with ExecuteEvents.Execute:

ExecuteEvents.Execute(target, eventData, ExecuteEvents.submitHandler);
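To clarify what I mean by simulating the click only partially: a submit event skips the pointer-down/drag/up sequence that I suspect a slider actually listens for. A rough, untested sketch of the kind of thing I mean (names are mine; `target` is the GameObject under the ray, `eventData` a PointerEventData filled with the pointer’s screen position):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Hypothetical helper: drives the full pointer sequence by hand instead of
// firing a single submit event.
public static class PointerGestureSim
{
    // Call on gesture start; returns the object that handled pointer-down.
    public static GameObject Press(GameObject target, PointerEventData eventData)
    {
        return ExecuteEvents.ExecuteHierarchy(target, eventData,
            ExecuteEvents.pointerDownHandler);
    }

    // Call every frame while the gesture is held — dragging is what a
    // slider reacts to, which a submit event never produces.
    public static void Drag(GameObject pressed, PointerEventData eventData)
    {
        ExecuteEvents.Execute(pressed, eventData, ExecuteEvents.dragHandler);
    }

    // Call on gesture end.
    public static void Release(GameObject pressed, PointerEventData eventData)
    {
        ExecuteEvents.Execute(pressed, eventData, ExecuteEvents.pointerUpHandler);
        ExecuteEvents.Execute(pressed, eventData, ExecuteEvents.pointerClickHandler);
    }
}
```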

This works great for buttons, but nothing seems to work with sliders. I suspect I’m simulating the mouse click either improperly or only partially, skipping a lot of the processing that makes a mouse interact with the UI correctly. So I considered just using the Input System Unity provides, which I believe can help me simulate a mouse click. But since I’m using my hands and don’t have a physical controller, I’m not sure how to create custom bindings that fire when my state machine detects a state change. Is it possible to create a Mouse as a virtual device, bind its position to the pointer, and trigger the mouse click event when I perform something specific (in my case, a hand gesture)? I’m willing to try any other solution as long as it allows me to interact with UI. If you have any other questions, feel free to ask and I’ll answer as best I can. Thanks in advance for your attention.
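In case it helps to see what I’m imagining with the virtual-device idea, here’s an untested sketch (class and method names are my own invention): create a virtual Mouse through the Input System and feed it state from the hand tracker, so the UI input module treats it like a physical mouse:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.LowLevel;

public class VirtualHandMouse : MonoBehaviour
{
    Mouse virtualMouse;

    void OnEnable()
    {
        // Register a virtual Mouse device with the Input System.
        virtualMouse = InputSystem.AddDevice<Mouse>("VirtualHandMouse");
    }

    void OnDisable()
    {
        if (virtualMouse != null)
            InputSystem.RemoveDevice(virtualMouse);
    }

    // Called from my hand-tracking code with the pointer's screen position
    // and whether the pinch gesture is currently held.
    public void UpdatePointer(Vector2 screenPos, bool pinching)
    {
        var state = new MouseState { position = screenPos };
        if (pinching)
            state = state.WithButton(MouseButton.Left, true);

        // Queue the state so the device updates on the next input frame.
        InputSystem.QueueStateEvent(virtualMouse, state);
    }
}
```

The idea would be that the state machine calls UpdatePointer with pinching = true on entering the pinch state and false on exiting it, so the left button press/release follows the gesture.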

Ok, I made some more progress: I figured out how to modify the OVRInputModule to accommodate the existence of two pointers. The UI seems to detect hover and unhover of both pointers properly, but for some reason only the right-hand pinch is able to perform clicks. The left hand doesn’t seem to perform any click at all. The weird thing is that if I hover with the left pointer and pinch with the right hand, the click event is generated and handled. Any idea why the OVRInputModule seems to work only with the right hand?

How did you do this? I’m trying to develop my own UI system and can’t understand what’s going on. Reading through the OVRInputModule made me want to end myself, and yet I didn’t learn anything from it…