Does XRI 3.0 Support Grabbable Objects in Mixed Reality (AR)?

We’ve set up the XRI 3.0 Hand Interaction Demo Scene and imported PolySpatial and Apple Vision OS plugins.

Direct Poke and Ray interactions work in the Hands Interaction Demo. Awesome!
However, the grabbable objects (on the right) only highlight on gaze or hand intersection; they can't actually be grabbed with hands in Mixed Reality (AR).

To implement grab, is the best approach to set up custom grab interactions using the Near-Far Interactor and multi-casting in XRI 3.0, instead of the XR Direct Interactor and XR Ray Interactor? Or is the best practice something else?

XRI has a wonderful array of customizations that we’d like to take advantage of in Mixed Reality. Thank you!

Related docs we found:


Grab didn’t work with version 2.5 either.


Right now our support for XRI is limited to the Spatial Tap Gesture, which we enable with a unique interactor type (XR Touch Space Interactor).

This means existing XRI demo content isn’t fully supported yet. It’s something the team is working on.

For just grabbing things, the Touch Space Interactor should work fine (see the visionOS template's unbounded scene for how to set it up), but you’ll need to add it to any existing XRI sample scene yourself.


Appreciate the fast response @DanMillerU3D. We’ve been looking into alternative hand interactions that give a weighty feel. One option, the gaze-and-pinch example using “pieceSelectionBehavior.cs” in Samples/PolySpatial/MixedReality, feels familiar on visionOS (great sample, by the way!). However, gaze and pinch appears to turn off Rigidbody parameters like mass and axis constraints while an object is pinched. This means a gaze-pinched car (heavy item) moves exactly the same as a gaze-pinched piece of paper (light item).
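One workaround that might preserve the weighty feel: instead of disabling the Rigidbody and snapping the transform while pinched, drive the body toward the pinch point with physics forces, so mass and axis constraints stay in play. A minimal sketch, assuming you hook it up to your own selection events (the class and method names below are ours, not the PolySpatial API):

```csharp
using UnityEngine;

// Hypothetical sketch: a PD-style follow that pulls a Rigidbody toward the
// pinch point with AddForce instead of moving its transform directly.
// Because ForceMode.Force is divided by mass, a heavy car lags behind the
// hand while a light piece of paper keeps up, and constraints still apply.
[RequireComponent(typeof(Rigidbody))]
public class WeightyPinchFollow : MonoBehaviour
{
    public float spring = 50f;   // pull strength toward the pinch point
    public float damping = 10f;  // resists overshoot and oscillation

    Rigidbody body;
    Vector3? pinchTarget;        // null when not pinched

    void Awake() => body = GetComponent<Rigidbody>();

    // Call these from your gaze-and-pinch selection logic
    // (e.g. where the sample currently moves the transform).
    public void BeginPinch(Vector3 target) => pinchTarget = target;
    public void MovePinch(Vector3 target)  => pinchTarget = target;
    public void EndPinch()                 => pinchTarget = null;

    void FixedUpdate()
    {
        if (pinchTarget == null) return;

        Vector3 toTarget = pinchTarget.Value - body.position;

        // Spring toward the target, damp the current velocity; the same
        // force accelerates a heavier body less, so mass shapes the motion.
        body.AddForce(toTarget * spring - body.velocity * damping,
                      ForceMode.Force);
    }
}
```

Tuning `spring` and `damping` per object (or scaling them from mass) would let light props snap to the hand while heavy ones trail behind.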

Doing more research on ways to make things feel more grabbable. We appreciate the hard work everyone at Unity is putting in to give us tools to bring our ideas to Vision Pro!