Is it possible to use a Unity UI Button instead of a SpatialUI button?
I attached an XR UI Input Module to the EventSystem and a Tracked Device Graphic Raycaster to the world-space Canvas; it works in the Editor but not in the Simulator.
Am I forgetting any other settings?
Hey there,
you should be able to use Unity GUI for the UI. Check UI support | Input System | 1.0.2 and make sure the left-click and tracked-position inputs go through the spatial pointer device values by using an ActionMap with SpatialPointerDevice bindings for the various input actions.
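To illustrate the idea, here is a minimal sketch of routing the UI module's point/click actions through a SpatialPointerDevice at runtime instead of editing the asset by hand. The control paths on the device are assumptions (check the actual layout in Window > Analysis > Input Debugger); `InputSystemUIInputModule`, `InputActionMap.AddAction`, and `InputActionReference.Create` are standard Input System APIs.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.UI;

// Sketch only: rebinds the UI input module's Point and Click actions to the
// PolySpatial spatial pointer. The "<SpatialPointerDevice>/..." control
// paths below are assumptions; verify them in the Input Debugger.
public class SpatialPointerUISetup : MonoBehaviour
{
    void Awake()
    {
        var map = new InputActionMap("SpatialUI");

        // Drive the UI pointer position from the spatial pointer (assumed path).
        var point = map.AddAction("Point", InputActionType.PassThrough,
            "<SpatialPointerDevice>/interactionPosition");

        // Treat the pinch/press phase as the left click (assumed path).
        var click = map.AddAction("Click", InputActionType.PassThrough,
            "<SpatialPointerDevice>/isTracked");

        var module = EventSystem.current.GetComponent<InputSystemUIInputModule>();
        module.point = InputActionReference.Create(point);
        module.leftClick = InputActionReference.Create(click);

        map.Enable();
    }
}
```

In practice you would usually set these bindings up once in the Action Asset in the Inspector rather than in code; the script just makes the wiring explicit.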
Hey, I think I followed your instructions, and it is still not working.
Here is a screenshot of my InputSystemUIInputModule and my InputSystemPackageSettings. Is there something else that I am missing related to configuring the ActionMap to work with Canvas UI?
There is a known issue we are working on that causes this to not work in Play Mode. Unity UI DOES work in the simulator (and should also work on device), but for now it's not working in the Editor.
Has this issue been resolved?
I’m in the same situation: when playing unbounded volume scenes in Unity, Unity UI (sliders, for example) doesn’t work. Fortunately, I found that the slider works in the Vision Pro simulator and on the Vision Pro device.
I’m hoping that Unity UI (like sliders) will also work in the Unity Editor’s Play Mode for quick feature experiments. If you have any news, please let me know!
Oops, fixed the problem.
Instead of adding the XRUIInputModule to the EventSystem, I added the Input System UI Input Module and then assigned DefaultInputActions as its Action Asset.
Slider UI now works in both Unity’s Play Mode and Vision Pro’s simulator and devices!
I had initially assigned PolyspatialInputActions as the Action Asset, and that was the problem…!
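The working setup above can be sketched in code as well; this is just an illustration of the component wiring, not a required script. `InputSystemUIInputModule`, its `actionsAsset` property, and the `DefaultInputActions` wrapper are all part of the Input System package; the component name `UIInputBootstrap` is made up for this example.

```csharp
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.UI;

// Sketch: an EventSystem with InputSystemUIInputModule (not XRUIInputModule)
// whose Actions Asset is the package's DefaultInputActions.
public class UIInputBootstrap : MonoBehaviour
{
    void Awake()
    {
        gameObject.AddComponent<EventSystem>();
        var module = gameObject.AddComponent<InputSystemUIInputModule>();

        // DefaultInputActions ships with the Input System package and already
        // contains a "UI" action map (Point, Click, ScrollWheel, ...).
        module.actionsAsset = new DefaultInputActions().asset;
    }
}
```

Normally you would just drop the components on a GameObject and pick DefaultInputActions in the Inspector; the script shows the same configuration explicitly.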