What is the best input method in Metal?

First of all, I do not believe that hover effects can be done with gaze input in Metal. Is this correct?
Unless the hover effect allows the user to see exactly where they are looking, I don't think gaze input is practical.
If so, what is the best input method for Metal?

Hi there! Unfortunately, that is correct. You cannot implement a gaze-based hover effect with the Metal rendering path. You only receive gaze information at the moment the user pinches their fingers, so you can't use it to cast rays ahead of time to show where the user is looking.
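For context, here is roughly how that pinch-time gaze ray surfaces in a Metal-based app through the Input System. The device and field names (VisionOSSpatialPointerDevice, startRayOrigin, startRayDirection, the phase enum) are my recollection of the com.unity.xr.visionos input API and may differ in your installed package version, so treat this as a hedged sketch rather than a verbatim reference:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.XR.VisionOS.InputDevices; // assumed namespace from com.unity.xr.visionos

public class PinchGazeRay : MonoBehaviour
{
    [SerializeField]
    InputActionReference m_PointerAction; // bound to <VisionOSSpatialPointerDevice>/primarySpatialPointer

    void Update()
    {
        // The gaze ray is only populated once a pinch begins;
        // there is no per-frame hover data before that event.
        var pointer = m_PointerAction.action.ReadValue<VisionOSSpatialPointerState>();
        if (pointer.phase == VisionOSSpatialPointerPhase.Began)
        {
            var ray = new Ray(pointer.startRayOrigin, pointer.startRayDirection);
            if (Physics.Raycast(ray, out var hit, 10f))
                Debug.Log($"Pinched while looking at {hit.collider.name}");
        }
    }
}
```

Because the ray only exists at pinch time, the best you can do with it is select, not hover.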

Personally, I have found that this can work as long as there are only a few options and the buttons are quite large. Check out the sample scene included in com.unity.xr.visionos. It has a relatively detailed UI and interactive objects that I am able to use reasonably well without hover effects.

Without using gaze and pinch, you could try to implement a “laser pointer” input mechanism based on the ARKit skeletal hand tracking data. This might work similarly to the “aim pose” provided on Meta Quest. Otherwise, that’s pretty much all you can do, aside from more creative solutions like speech-to-text or hand gestures. Check out our visionOS samples in com.unity.xr.interaction for more ideas.
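To make the laser-pointer idea concrete, here is a minimal sketch built on the com.unity.xr.hands API (XRHandSubsystem, XRHandJointID). The choice of joints, the wrist-to-knuckle aim heuristic, and the 10 m ray length are my own assumptions for illustration, not an official aim pose:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandRayPointer : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Lazily grab the running hand subsystem (backed by ARKit hand tracking on visionOS).
        if (m_Subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;

            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Approximate an "aim pose": origin at the index knuckle,
        // direction from the wrist through the knuckle.
        if (hand.GetJoint(XRHandJointID.Wrist).TryGetPose(out var wrist) &&
            hand.GetJoint(XRHandJointID.IndexProximal).TryGetPose(out var knuckle))
        {
            var direction = (knuckle.position - wrist.position).normalized;
            if (Physics.Raycast(knuckle.position, direction, out var hit, 10f))
            {
                // hit.collider is the object under the "laser"; drive your own
                // hover highlight or selection logic from here.
            }
        }
    }
}
```

Note that XR Hands reports joint poses in XR origin space, so you may need to transform them into world space via your XROrigin before raycasting.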

I realize it's challenging to figure out alternatives to the standard input method provided by visionOS. The limitation on hover effects comes from the design of Apple's SpatialEventGesture API. Apple has cited privacy concerns as the reason developers cannot access gaze information before an input event is triggered. Please submit feedback to Apple (using Feedback Assistant) if you would like them to consider changing this design. There isn't anything we can do at Unity to expose gaze for hover events.

Is it at least possible to have classic hand interactions with UI, like on HoloLens 2/Magic Leap?
That is, can you touch UI elements directly, so that, for example, touching a button with a finger triggers it?