Near-Far and Ray Interactors don't work on XR Device Simulator and Apple Vision Pro

Hi!

I’m having trouble getting interactors to work on Apple Vision Pro, and also in the XR Device Simulator. Interestingly, everything works fine on Meta Quest 3.

The issue is that the interactors appear, but they always point straight ahead and don’t interact with the UI at all.

I’m using Unity 6000.0.22f1, XR Interaction Toolkit 3.0.5, and XR Hands 1.5.0. For Vision Pro: version 2.0.4 of the Apple visionOS Plugin, PolySpatial, PolySpatial visionOS, and PolySpatial XR packages.

Here are the videos demonstrating the problem:

I’d appreciate any help with this. Thanks!

Hey there! Sorry to hear you’re having trouble.

Can you share a repro project that demonstrates the issue? This probably comes down to how the input mappings are set up, but it’s not clear from your videos exactly what’s going wrong. I’m surprised to see the interactors moving at all on Vision Pro, since the VisionOSSpatialPointer input device doesn’t get any events until you actually pinch your fingers. I’m guessing you may be using XR Hands joint poses to drive it? If so, you’ll need to create your own pinch detection to drive interactions.
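For reference, here’s a rough sketch of what that custom pinch detection could look like, based on thumb-tip/index-tip distance from XR Hands joint data. The thresholds are illustrative assumptions (with hysteresis so the state doesn’t flicker), not official values, and the component names are hypothetical:

```csharp
// Hypothetical pinch detector built on com.unity.xr.hands joint poses.
// Thresholds are assumptions; tune them for your app.
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class PinchDetector : MonoBehaviour
{
    [SerializeField] float m_PinchStartThreshold = 0.02f; // meters; pinch begins below this
    [SerializeField] float m_PinchEndThreshold = 0.04f;   // pinch ends above this (hysteresis)

    XRHandSubsystem m_Subsystem;

    public bool IsPinching { get; private set; }

    void Update()
    {
        // Lazily grab the running hand subsystem.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;

            m_Subsystem = subsystems[0];
        }

        var hand = m_Subsystem.rightHand;
        if (!hand.isTracked)
            return;

        if (hand.GetJoint(XRHandJointID.ThumbTip).TryGetPose(out var thumbPose) &&
            hand.GetJoint(XRHandJointID.IndexTip).TryGetPose(out var indexPose))
        {
            var distance = Vector3.Distance(thumbPose.position, indexPose.position);
            if (!IsPinching && distance < m_PinchStartThreshold)
                IsPinching = true;
            else if (IsPinching && distance > m_PinchEndThreshold)
                IsPinching = false;
        }
    }
}
```

You’d then poll `IsPinching` (or turn its transitions into events) to trigger select on your interactor.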

I’m not actually up to speed yet on the latest ways to do this in XRI 3.0, but I’ve forwarded this along to the team to get some advice. Again, it will help for us to see what you already have set up to make the best recommendation for how to solve this issue. If you aren’t able to share your whole project, please create a small repro project or explain in detail how to set up the rig you’re working with.

Thanks for reaching out. I’m confident we can get this sorted. :slight_smile:


I suppose I should also ask: Have you checked out the visionOS XR Plugin package samples? The sample scene includes a GrabInteractor setup that may help point you in the right direction. It doesn’t include an “aim ray” so there’s no feedback on what you are going to interact with, but you could probably hook that up to the XR Hands data. However, it still relies on the gaze-and-pinch model of interacting, where it seems like you’re looking for a hand-pose-based aim ray. Anyway, it should provide a working baseline for XRI setup that you can use to validate your setup.
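If you do go the route of deriving an aim ray from XR Hands data, a minimal sketch might look like the following. The joint choice (wrist toward the index proximal joint) is an assumption on my part; other setups use the palm pose or a stabilized filter on top:

```csharp
// Illustrative helper: derive an "aim ray" from XR Hands joint poses,
// since visionOS doesn't provide a hand-based aim pose before a pinch.
// Joint selection here is an assumption; adjust for your comfort model.
using UnityEngine;
using UnityEngine.XR.Hands;

public static class HandAimRay
{
    public static bool TryGetAimRay(XRHand hand, out Ray ray)
    {
        ray = default;
        if (!hand.isTracked)
            return false;

        if (hand.GetJoint(XRHandJointID.Wrist).TryGetPose(out var wristPose) &&
            hand.GetJoint(XRHandJointID.IndexProximal).TryGetPose(out var knucklePose))
        {
            // Point from the wrist through the index knuckle.
            var direction = (knucklePose.position - wristPose.position).normalized;
            ray = new Ray(knucklePose.position, direction);
            return true;
        }

        return false;
    }
}
```

Note that raw joint data is fairly jittery, so you’d likely want to smooth the resulting ray before driving a visual.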


Thank you very much for replying! I’ll be unusually busy for the next couple of days but I will get back to the issue at hand late next week and will respond properly :wink:

Hey I just found your post.
The out-of-the-box XRI rig isn’t set up to work on visionOS, unfortunately, due to differences in input and visionOS not supporting OpenXR.

To get around this, I created a visionOS sample in the XRI package that uses the gesture data directly to handle this. It works in PolySpatial, but not in the full VR Metal rendering mode. For that, follow @mtschoen’s guide above.

The device simulator unfortunately won’t work with the gesture setup either.

I’d recommend reading the sample docs to understand how it works; that should help you work around your issues.


Beautiful, thank you!

Hi @Imm0rt4l_PL, did you manage to use XR Interactors?

I’m trying to do the same thing but can’t get any results (the visionOS XR sample works, but that’s it).

Nope, not yet :slightly_frowning_face:

Thanks. Are you interested in me posting a solution if I find one?


Of course, thanks!


Finally, I’ve found some examples that actually work out of the box, and they show how to use the Spatial Pointer for UI interactions. Here they are: GitHub - XRealityZone/visionOS_Workshop_101_Unity

Thanks. I’ve crafted my own way into this as well, but haven’t had time to write it up here yet :sweat_smile: