Using XR Interaction Toolkit standard UI components?

Even though the documentation says that one should use the XR Interaction Toolkit (XRI), the PolySpatial samples seem to use an AVP-specific way of interacting with UI, somewhat similar to iOS, with touches and touch phases.
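To illustrate what I mean by the touch-phase style, here is a rough sketch of the pattern the samples appear to follow. Caveat: this is from memory, and the type names (`EnhancedSpatialPointerSupport`, `SpatialPointerState`) come from the `com.unity.polyspatial` package as I recall it from the samples, so verify them against your installed version.

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using Unity.PolySpatial.InputDevices;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

// Sketch of the sample-style input loop: poll active touches each frame
// and resolve them to spatial pointer state instead of using XRI interactors.
public class SpatialTouchExample : MonoBehaviour
{
    void OnEnable()
    {
        // PolySpatial input arrives through the Input System's EnhancedTouch API.
        EnhancedTouchSupport.Enable();
    }

    void Update()
    {
        foreach (var touch in Touch.activeTouches)
        {
            // Each touch maps to visionOS gaze/pinch data wrapped in a
            // SpatialPointerState (assumed API from the PolySpatial samples).
            SpatialPointerState state =
                EnhancedSpatialPointerSupport.GetPointerState(touch);

            if (touch.phase == TouchPhase.Began && state.targetObject != null)
            {
                // The sample scripts branch on the hit object here to start
                // a drag or trigger a button press.
                Debug.Log($"Pinch began on {state.targetObject.name}");
            }
        }
    }
}
```

Nothing in this loop goes through XRI's interactor/interactable abstraction, which is exactly why it doesn't transfer to other platforms.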

So how can I use XRI with PolySpatial, so that the standard sliders and buttons from XRI work in PolySpatial? In the current PolySpatial examples, those are completely different from the sliders and buttons in XRI!

In XRI, they are 2D Canvas-based elements, while in PolySpatial, they are meshes with colliders, etc. Completely different!

I want to make an app that is as device-independent as possible, so I'm planning to use XRI and reuse it for a Quest port.

5 Likes

I'm still interested in learning the right way to create simple interactions.

In the docs, XRI is mentioned. In the PolySpatial samples, however, there is a custom script that handles all dragging of objects and pushing of buttons. Why is that? Why isn't it using XRI? Using XRI would also make it easier to port a shared-space app to a fully immersive app.

2 Likes