I’ve been experimenting with the pre-release XR Hands package (1.4.0-pre.1), and I’m wondering if the Gestures can be used as bindings for input actions.
I know it’s in pre-release and everything is still in the works, but it would make sense to have gestures integrate into bindings, or at least to allow the UnityEvent to trigger an input action as if it were a binding.
Alternatively, if there is no way or no plans to integrate XR Hand Gestures into Input Actions (for whatever technical reason), I will just script a parallel custom input system to be able to trigger game actions like grabbing and shooting.
…Unless someone else found a better way of translating hand gestures to input actions that I’m not considering?
Have you found a solution?
Nope. I just made this ridiculous script as a temporary workaround until Unity (surely) integrates hand gestures with input action bindings somehow.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

public class CustomGrabber : MonoBehaviour
{
    public XRDirectInteractor interactor;

    private IXRSelectInteractable _heldObject;

    // Call this method to grab an interactable object (e.g. from a gesture's UnityEvent)
    public void Grab()
    {
        if (interactor == null || _heldObject != null)
            return;

        // Take the interactable we've been hovering the longest, if it can actually be selected
        IXRHoverInteractable hovered = interactor.GetOldestInteractableHovered();
        if (hovered is IXRSelectInteractable interactableToGrab)
        {
            _heldObject = interactableToGrab;
            interactor.StartManualInteraction(_heldObject);
        }
        else
        {
            print("AHHHHHH");
        }
    }

    // Call this method to release the currently held object
    public void Release()
    {
        if (interactor == null || _heldObject == null)
            return;

        print("SUUUUPPP");
        interactor.EndManualInteraction();
        _heldObject = null;
    }
}
I also came up with something similar. It seems to work just fine, though.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach alongside an XRDirectInteractor and call Grab/Release from gesture events
public class ManualGestureGrabber : MonoBehaviour
{
    private XRDirectInteractor _directInteractor;

    private void Start()
    {
        _directInteractor = GetComponent<XRDirectInteractor>();
    }

    public void Grab()
    {
        if (!_directInteractor.allowSelect)
            return;

        if (_directInteractor.hasSelection)
            return;

        // Select the first hovered interactable, if it supports selection
        if (_directInteractor.hasHover &&
            _directInteractor.interactablesHovered[0] is IXRSelectInteractable interactable)
        {
            _directInteractor.StartManualInteraction(interactable);
        }
    }

    public void Release()
    {
        if (_directInteractor.isPerformingManualInteraction)
        {
            _directInteractor.EndManualInteraction();
        }
    }
}
Your code is much better. I asked ChatGPT to write mine so it threw in a bunch of unnecessary stuff. Haha
I’m curious whether you’ve also run into a different problem I have with XR Hands. Have you noticed that the hand gestures don’t register reliably at all on AVP?
Even with really high tolerances, I can’t get it to recognize simple poses unless you perform them very clearly in front of the AVP cameras. It’s so bad that I’m not even using the StaticHandGesture script anymore; I just wrote another custom script that reads the fingers’ “FullCurl” values directly to determine when a grab or release event should happen. That said, I wrote that script a few months ago, so I’m not sure whether the latest XR Hands package has fixed this.
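The core idea looks roughly like this (a loose sketch rather than my actual script; it assumes the XRFingerShapeMath / XRFingerShape APIs from UnityEngine.XR.Hands.Gestures, and the threshold values are made up):

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;
using UnityEngine.XR.Hands;
using UnityEngine.XR.Hands.Gestures;

public class FullCurlGrabDetector : MonoBehaviour
{
    public UnityEvent grabbed;
    public UnityEvent released;

    // Hysteresis: separate grab/release thresholds prevent flickering at the boundary
    [SerializeField] float grabThreshold = 0.7f;
    [SerializeField] float releaseThreshold = 0.4f;

    XRHandSubsystem _subsystem;
    bool _isGrabbing;

    void Update()
    {
        if (_subsystem == null)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count == 0)
                return;
            _subsystem = subsystems[0];
        }

        var hand = _subsystem.rightHand;
        if (!hand.isTracked)
            return;

        // Average the full-curl value of the four non-thumb fingers
        float sum = 0f;
        int count = 0;
        for (var finger = XRHandFingerID.Index; finger <= XRHandFingerID.Little; ++finger)
        {
            var shape = XRFingerShapeMath.CalculateFingerShape(hand, finger, XRFingerShapeTypes.FullCurl);
            if (shape.TryGetFullCurl(out var curl))
            {
                sum += curl;
                ++count;
            }
        }
        if (count == 0)
            return;

        var averageCurl = sum / count;
        if (!_isGrabbing && averageCurl > grabThreshold)
        {
            _isGrabbing = true;
            grabbed.Invoke();
        }
        else if (_isGrabbing && averageCurl < releaseThreshold)
        {
            _isGrabbing = false;
            released.Invoke();
        }
    }
}

The grabbed/released UnityEvents can then call the Grab/Release methods from the scripts above.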
Hey there! Indeed we did have some issues with gesture recognition on visionOS, but it should be resolved in version 1.2.3 of com.unity.xr.visionos. More info here.
Please let us know if you were able to update to this version of the package and resolve the issue. Thanks!
This worked for me! The hands are being read very smoothly.
I still don’t fully understand how to turn new gestures created with XR Hands into input action bindings, inside PolySpatialInputActions for instance. That would be very handy on AVP when creating new gestures, since currently there is only one: the pinch.
My general question is: how do you connect an XR Hands custom gesture to the new Input System?
The official best answer to this is to create a custom input device that wraps the state data of your custom gesture, and then write that state to the device on update.
If you’d like an example of making a custom input device like this, you can take a look at the meta aim extension logic that’s a part of the hands package. It uses a custom input device to expose data from that extension to the input system, and polls the state on update.
This has been a regular request and it’s something we’d eventually like to create an example for, but for now, this is the way to go.
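In very rough terms, the pattern looks like this (an untested sketch, not the actual Meta Aim code; GestureDeviceState, GestureDevice, and the single grabGesture button are placeholder names for the example):

using UnityEngine;
using UnityEngine.InputSystem;
using UnityEngine.InputSystem.Controls;
using UnityEngine.InputSystem.Layouts;
using UnityEngine.InputSystem.LowLevel;
using UnityEngine.InputSystem.Utilities;

// Memory layout of the device state: one button driven by a gesture
public struct GestureDeviceState : IInputStateTypeInfo
{
    public FourCC format => new FourCC('G', 'S', 'T', 'R');

    [InputControl(name = "grabGesture", layout = "Button")]
    public float grabGesture;
}

// Virtual input device that exposes the gesture state to the input system
[InputControlLayout(stateType = typeof(GestureDeviceState))]
public class GestureDevice : InputDevice
{
    public ButtonControl grabGesture { get; private set; }

    protected override void FinishSetup()
    {
        base.FinishSetup();
        grabGesture = GetChildControl<ButtonControl>("grabGesture");
    }
}

// Creates the device and writes gesture state into it
public class GestureDeviceDriver : MonoBehaviour
{
    GestureDevice _device;

    void OnEnable()
    {
        InputSystem.RegisterLayout<GestureDevice>();
        _device = InputSystem.AddDevice<GestureDevice>();
    }

    void OnDisable()
    {
        if (_device != null)
            InputSystem.RemoveDevice(_device);
    }

    // Wire these to the StaticHandGesture performed/ended UnityEvents
    public void OnGesturePerformed() => WriteGrab(1f);
    public void OnGestureEnded() => WriteGrab(0f);

    void WriteGrab(float value)
    {
        // Queue a state event so the value flows through the normal input pipeline
        using (StateEvent.From(_device, out var eventPtr))
        {
            _device.grabGesture.WriteValueIntoEvent(value, eventPtr);
            InputSystem.QueueEvent(eventPtr);
        }
    }
}

Once the device has been added, the gesture shows up as a bindable control, so an input action can bind to <GestureDevice>/grabGesture like any other button.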
That all being said, since you’re using XRI, you can actually bypass the input system altogether by using the input readers, as @guilherme-francisco-glartek has pointed out.
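For example, something along these lines (a sketch that assumes XRI 3.x, where input interactors expose an XRInputButtonReader via selectInput; if that API differs in your version, treat this as pseudocode):

using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit.Inputs.Readers;
using UnityEngine.XR.Interaction.Toolkit.Interactors;

public class GestureSelectInput : MonoBehaviour
{
    [SerializeField] XRDirectInteractor interactor;

    void OnEnable()
    {
        // Feed the select input manually instead of from an input action binding
        interactor.selectInput.inputSourceMode = XRInputButtonReader.InputSourceMode.ManualValue;
    }

    // Wire these to the StaticHandGesture performed/ended UnityEvents
    public void OnGesturePerformed() => interactor.selectInput.QueueManualState(true, 1f, true, false);

    public void OnGestureEnded() => interactor.selectInput.QueueManualState(false, 0f, false, true);
}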