Hi,
I’m aiming to build a PolySpatial project with the same features I had in an old Meta Quest project. Starting from the visionOS template, I understood how the selection manager works and how to modify it according to my needs (like locking model rotation on certain axes).
Regarding two-hand gestures: is there a way to use the same selection manager to implement the same functionality as the XR Grab Interactable + XR Two Hand transformer from the XR Interaction Toolkit?
Or can these XR Interaction Toolkit components be used out of the box?
Edit: I read up on and tried the XR Grab Interactable, and it works with a one-hand gesture (the object doesn’t rotate, but that’s a separate issue people are discussing in other threads).
However, it isn’t working with two-hand gestures (selection mode set to Multiple, XR General Grab Transformer, etc.).
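For reference, this is roughly the setup I’m describing, as a script instead of inspector settings. It’s only a sketch of the two-hand configuration I tried, not a fix; component and property names are from XRI 2.3+ (`XRGrabInteractable`, `InteractableSelectMode.Multiple`, `XRGeneralGrabTransformer`) and namespaces may differ in other XRI versions:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;
using UnityEngine.XR.Interaction.Toolkit.Transformers;

// Sketch: configure an XRGrabInteractable for two-handed grabbing.
// Equivalent to setting Select Mode = Multiple in the inspector and
// adding an XR General Grab Transformer to the same GameObject.
[RequireComponent(typeof(XRGrabInteractable))]
public class TwoHandGrabSetup : MonoBehaviour
{
    void Awake()
    {
        var grab = GetComponent<XRGrabInteractable>();

        // Allow two interactors (hands) to select the object at once.
        grab.selectMode = InteractableSelectMode.Multiple;

        // Use per-grab attach points so each hand gets its own anchor.
        grab.useDynamicAttach = true;

        // Let the general transformer drive one- and two-handed
        // move/rotate/scale instead of the default single-grab transformer.
        var transformer = gameObject.AddComponent<XRGeneralGrabTransformer>();
        grab.AddMultipleGrabTransformer(transformer);
    }
}
```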