Hand interactions for VR applications

I've learned that Unity XRI provides poke, pinch, and grab interactions. For other hand interactions (like drag, zoom, hand poses, etc.), can we do them with XRI or XR Hands on Vision Pro? Or could I do something with XR Hands in a Vision Pro VR app?

Yes, you can definitely handle more complex hand interactions like dragging, zooming, or detecting specific hand poses in Unity with the XR Hands package, including on Vision Pro. Unity's XRI toolkit covers the basics (poke, pinch, and grab) out of the box, but for other gestures you'll need to write some custom code on top of the hand-tracking data.

For actions like drag and zoom, you can track the movement of hands or fingers in 3D space and apply it to objects in your scene. Dragging can be as simple as moving an object along with a tracked hand's position, and zooming can scale an object based on the distance between two hands or fingers.
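As a concrete illustration, here's a minimal sketch of a two-hand zoom built on the XR Hands API: it scales a target object by the ratio of the current palm-to-palm distance to the distance captured when both hands first became tracked. The `target` field and the "start zooming as soon as both hands are tracked" trigger are assumptions for the example; a real app would gate this on an explicit gesture.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class TwoHandZoom : MonoBehaviour
{
    [SerializeField] Transform target;   // object to scale (assign in the Inspector)

    XRHandSubsystem handSubsystem;
    float startDistance;
    Vector3 startScale;
    bool zooming;

    void Start()
    {
        // Grab the first XRHandSubsystem created by the active XR loader.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
            handSubsystem = subsystems[0];
    }

    void Update()
    {
        if (target == null || handSubsystem == null ||
            !handSubsystem.leftHand.isTracked || !handSubsystem.rightHand.isTracked)
        {
            zooming = false;   // lose the baseline whenever a hand drops out
            return;
        }

        var leftJoint = handSubsystem.leftHand.GetJoint(XRHandJointID.Palm);
        var rightJoint = handSubsystem.rightHand.GetJoint(XRHandJointID.Palm);
        if (!leftJoint.TryGetPose(out Pose leftPose) || !rightJoint.TryGetPose(out Pose rightPose))
            return;

        // Distance is invariant to the tracking-space origin, so it's safe to use directly.
        float distance = Vector3.Distance(leftPose.position, rightPose.position);

        if (!zooming)
        {
            // Capture the baseline when both hands become tracked.
            startDistance = distance;
            startScale = target.localScale;
            zooming = true;
        }
        else if (startDistance > 0.001f)
        {
            // Scale proportionally to how far the hands moved apart or together.
            target.localScale = startScale * (distance / startDistance);
        }
    }
}
```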

For specific hand poses, you can read the per-joint tracking data your VR system exposes, test it against the pose you're looking for, and then trigger the desired action in your app.
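To make that concrete, here's one rough way you might classify a closed-fist pose from XR Hands joint data by checking fingertip-to-palm distances. The 6 cm threshold is a placeholder to tune on device; robust pose detection usually also compares joint rotations, and recent XR Hands versions ship a configurable hand-shape/gesture recognizer you may be able to use instead of hand-rolling this.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

public static class HandPoses
{
    static readonly XRHandJointID[] fingerTips =
    {
        XRHandJointID.IndexTip, XRHandJointID.MiddleTip,
        XRHandJointID.RingTip, XRHandJointID.LittleTip
    };

    // Returns true when all four fingertips are curled in close to the palm.
    public static bool IsFist(XRHand hand, float maxTipToPalm = 0.06f)
    {
        if (!hand.isTracked || !hand.GetJoint(XRHandJointID.Palm).TryGetPose(out Pose palm))
            return false;

        foreach (var tipId in fingerTips)
        {
            if (!hand.GetJoint(tipId).TryGetPose(out Pose tip))
                return false;
            // Any extended finger disqualifies the fist pose.
            if (Vector3.Distance(tip.position, palm.position) > maxTipToPalm)
                return false;
        }
        return true;
    }
}
```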

If Unity's standard XRI toolkit doesn't quite fit your needs, dive into custom scripting: you can consume the hand-tracking data directly and build your own interactions, tailoring the experience to exactly what your VR application needs.
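For example, here's a minimal sketch of hooking into the raw tracking stream by subscribing to the subsystem's `updatedHands` event rather than polling in `Update()`; the logging in the callback is just a stand-in for whatever gesture logic your app actually needs.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandDataListener : MonoBehaviour
{
    XRHandSubsystem handSubsystem;

    void OnEnable()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
        {
            handSubsystem = subsystems[0];
            // Fires whenever the provider delivers fresh joint data.
            handSubsystem.updatedHands += OnUpdatedHands;
        }
    }

    void OnDisable()
    {
        if (handSubsystem != null)
            handSubsystem.updatedHands -= OnUpdatedHands;
    }

    void OnUpdatedHands(XRHandSubsystem subsystem,
                        XRHandSubsystem.UpdateSuccessFlags flags,
                        XRHandSubsystem.UpdateType updateType)
    {
        // Placeholder: log the right index-tip position whenever it's available.
        var joint = subsystem.rightHand.GetJoint(XRHandJointID.IndexTip);
        if (joint.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```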

So while Unity and the Vision Pro give you a good foundation with XRI and XR Hands, expanding on those capabilities with your own scripts is usually necessary to get the full range of interactions you're looking for.