UI Button click event not working

I am converting my existing Oculus VR app to Vision Pro. After clearing all the platform dependency issues, I can run the app in the Vision Pro simulator, but the UI Button click event is not working. PolySpatialHoverEffect is also not working.
Can you provide a demo sample app for fully immersive mode with UI interaction?

PolySpatialHoverEffect is only usable in MR apps, as it maps to RealityKit’s HoverEffectComponent. VR (immersive) apps do not use RealityKit/PolySpatial.

Then what script should be used in VR apps to detect the UI button click event?

Hey there! I’m looking at this today. It should just be a matter of updating the action map for your UI Input Module. I could have sworn we had an example for this but it’s only for mixed reality. More to come…

Hey! I was able to get this working relatively easily. We’ll update our com.unity.xr.visionos package samples to show this off, but since we just shipped the last package version for 2023, you’ll have to wait a bit for that. In the meantime, here’s what I did to enable UGUI input for VR builds:

  • Use the Input System UI Input Module (from com.unity.inputsystem) on my EventSystem (there’s a button on the legacy input module to upgrade to this one)
  • Duplicate the Default Input Actions asset from the Input System package into my project’s Assets folder and modify it to add the following extra bindings (screenshot below)
    • Click → <VisionOSSpatialPointerDevice>/isTracked
    • TrackedDevicePosition → <VisionOSSpatialPointerDevice>/primarySpatialPointer/startRayOrigin
    • TrackedDeviceRotation → <VisionOSSpatialPointerDevice>/primarySpatialPointer/startRayRotation
  • Set the Actions Asset property on the input module to this modified actions asset.
  • Add a TrackedDeviceRaycaster to your canvas. Note: this should be the one in com.unity.inputsystem, not TrackedDeviceGraphicRaycaster or TrackedDevicePhysicsRaycaster from com.unity.ugui.

With all that set up, you should be able to click buttons. Click-and-drag doesn’t work under this configuration because the interaction ray (gaze vector) does not update as you move your hand around. I’m going to keep digging to find a way to enable click-and-drag on the latest version, but it might require changes to package code in order to get it to work. Hopefully this is enough to unblock you for now!
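For reference, the editor steps above can also be expressed in code. This is just a sketch, assuming your duplicated actions asset uses the standard UI action map names (`UI/Click`, etc.); the component name and serialized field are hypothetical:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Hypothetical helper: adds the visionOS bindings listed above to a
// duplicated copy of the Default Input Actions asset at startup.
public class VisionOSUIBindings : MonoBehaviour
{
    // Assign your duplicated actions asset (the one referenced by the
    // Input System UI Input Module) in the Inspector.
    [SerializeField] InputActionAsset uiActions;

    void Awake()
    {
        uiActions.FindAction("UI/Click")
            .AddBinding("<VisionOSSpatialPointerDevice>/isTracked");
        uiActions.FindAction("UI/TrackedDevicePosition")
            .AddBinding("<VisionOSSpatialPointerDevice>/primarySpatialPointer/startRayOrigin");
        uiActions.FindAction("UI/TrackedDeviceRotation")
            .AddBinding("<VisionOSSpatialPointerDevice>/primarySpatialPointer/startRayRotation");
    }
}
```

Editing the asset in the Inspector (as described above) is the normal route; this is only equivalent code for readers who prefer to see the bindings spelled out.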

Here’s that screenshot of the action map, for reference:


It worked for me. Thank you.

Did you ever have any luck with click and drag with gaze? We have some scrollable UI elements and are hoping to avoid having to add buttons to scroll to all of them.

I was considering an approach where we keep using the gaze rays on the RayInteractor, but use another custom input device to convert the spatial pointer position changes into something we can pass into the UI Scroll action. I’m not sure how well that would work, or whether there is something simpler.
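A rough sketch of that idea: the component below forwards frame-to-frame changes of a pointer-position action into a ScrollRect. Everything here is an assumption, not package API: the action bindings, the scale factor, and the axis/sign choices all depend on your setup and package version.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.UI;

// Hypothetical sketch: drive a ScrollRect from spatial pointer movement.
public class GazeScrollDriver : MonoBehaviour
{
    [SerializeField] ScrollRect scrollRect;                // scrollable UI to drive
    [SerializeField] InputActionReference pointerPosition; // bind to a spatial pointer position control (assumed)
    [SerializeField] InputActionReference pinch;           // button-style action gating the drag (assumed)
    [SerializeField] float scrollScale = 1f;               // hypothetical meters-to-scroll scale

    Vector3 lastPosition;
    bool dragging;

    void OnEnable()
    {
        pointerPosition.action.Enable();
        pinch.action.Enable();
    }

    void Update()
    {
        if (!pinch.action.IsPressed())
        {
            dragging = false;
            return;
        }

        var pos = pointerPosition.action.ReadValue<Vector3>();
        if (dragging)
        {
            // Map vertical hand movement onto the normalized scroll position.
            var delta = pos - lastPosition;
            scrollRect.verticalNormalizedPosition = Mathf.Clamp01(
                scrollRect.verticalNormalizedPosition + delta.y * scrollScale);
        }
        dragging = true;
        lastPosition = pos;
    }
}
```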

We were able to get the above solution working for regular scroll, although I’m not sure if it’s the smartest way to tackle the problem. It’s also not working on a carousel in our UI, still looking into that.

Has there been an update to PolySpatial now that shows off using InputActions?
I’ve been trying to use Input Actions to make MRTK work with PolySpatial but without luck so far.

An update for devs working on UI Canvas integration in VR: v1.0.3 of the com.unity.xr.visionos package was just released today, and it includes updated sample scenes that each contain a canvas UI demo. After you install the package in the Unity Package Manager, click the “Samples” tab on the package’s page, and it will show the option to install the VR samples. The “Main” scene provides an example of using a UI Canvas with the XR Interaction Toolkit (you need to install the XRIT package separately for it to work), and the “InputSystem” scene provides an example of using a UI Canvas with InputSystemUIInputModule. In both scenes, dragging on UI elements now works. Here’s a link to the changelog for the new v1.0.3 release.

Even in v1.0.3, I am still struggling.
I just can’t trigger the click event in the simulator.

VR (immersive) or MR (bounded/unbounded)? If it’s MR, make sure that you’re using a world space canvas and that your EventSystem uses InputSystemUIInputModule (if you’re not, the inspector will show an alert asking you to switch). If that doesn’t fix it, feel free to submit a bug report with a repro case and let us know the incident number (IN-#####).
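If you want a quick runtime sanity check for that EventSystem configuration, something like this (a diagnostic sketch, not part of any package) will log a warning when the wrong input module is in place:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem.UI;

// Hypothetical diagnostic: verify the scene's EventSystem is set up for
// the Input System before chasing click issues elsewhere.
public class UIModuleCheck : MonoBehaviour
{
    void Start()
    {
        var es = EventSystem.current;
        if (es == null)
            Debug.LogError("No EventSystem in the scene; UI events will never fire.");
        else if (es.GetComponent<InputSystemUIInputModule>() == null)
            Debug.LogWarning(
                "EventSystem has no InputSystemUIInputModule; " +
                "UI clicks driven by the Input System will not work.");
    }
}
```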

Same issue here; I use an InputSystemUIInputModule. It works well in the simulator, where I can click on my UIs, but not in a build on device.

I see buttons highlight with eye tracking but I can’t click on them :cry: