XR Interaction Toolkit and Hand Tracking

Does the XR Interaction Toolkit support hand tracking with the Oculus Quest (or Magic Leap)? If so, how can I implement it? Is there an option in the XR Controller script?

I would also like the option to toggle between hand tracking and controllers at runtime. This might be a stretch, but I would also like to be able to use at least one controller and hand tracking simultaneously. Thanks in advance!

2 Likes

Not at the moment, but hands are high on our priority list :slight_smile:

6 Likes

Ayyyyyyyyyyy :smile:

Just wondering, what does this refer to?

we have basic hand tracking in the engine; you can retrieve bone data from the feature API on those devices that support it (e.g. Magic Leap)
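
For anyone who lands here later: the "feature API" mentioned above appears to be the UnityEngine.XR input feature usages. A rough sketch of pulling bone data that way, assuming a runtime that actually populates CommonUsages.handData (e.g. Magic Leap), would look something like this:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class HandBoneReader : MonoBehaviour
{
    readonly List<InputDevice> devices = new List<InputDevice>();
    readonly List<Bone> indexBones = new List<Bone>();

    void Update()
    {
        // Find any tracked device that exposes hand-tracking data.
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HandTracking, devices);

        foreach (var device in devices)
        {
            // CommonUsages.handData is only populated on runtimes that support it.
            if (!device.TryGetFeatureValue(CommonUsages.handData, out Hand hand))
                continue;

            if (hand.TryGetRootBone(out Bone root) && root.TryGetPosition(out Vector3 rootPos))
                Debug.Log($"{device.name}: root bone at {rootPos}");

            // Bones for a single finger, e.g. the index finger.
            if (hand.TryGetFingerBones(HandFinger.Index, indexBones))
            {
                foreach (var bone in indexBones)
                {
                    if (bone.TryGetPosition(out Vector3 p))
                        Debug.Log($"Index bone at {p}");
                }
            }
        }
    }
}
```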

2 Likes

Is there any update on this?

“Does the XR Interaction Toolkit support hand tracking with Oculus Quest?”

6 Likes

And what about just the normal fingers? The Oculus Quest Touch controllers have great support for finger detection, but I can't find any way to make that work on the Quest with the XR Interaction Toolkit. My hands are static.
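
One workaround (not an official XRIT path) is to drive a hand pose from the controller's plain feature usages — the trigger and grip values plus the capacitive touch states. This is only a rough sketch; the animator parameter names here are made up and would need to match your own hand rig:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class ControllerFingerPose : MonoBehaviour
{
    // Hypothetical animator with "IndexCurl" / "GripCurl" floats and a "ThumbUp" bool.
    public Animator handAnimator;

    InputDevice controller;

    void OnEnable()
    {
        var devices = new List<InputDevice>();
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.Controller | InputDeviceCharacteristics.Right,
            devices);
        if (devices.Count > 0)
            controller = devices[0];
    }

    void Update()
    {
        if (!controller.isValid)
            return;

        // Trigger drives the index curl, grip drives the remaining fingers.
        if (controller.TryGetFeatureValue(CommonUsages.trigger, out float trigger))
            handAnimator.SetFloat("IndexCurl", trigger);
        if (controller.TryGetFeatureValue(CommonUsages.grip, out float grip))
            handAnimator.SetFloat("GripCurl", grip);

        // Capacitive touch on the thumbstick tells you whether the thumb is resting on it.
        if (controller.TryGetFeatureValue(CommonUsages.primary2DAxisTouch, out bool thumbTouching))
            handAnimator.SetBool("ThumbUp", !thumbTouching);
    }
}
```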

The implementation is in there under OculusUsages, but I can't get any of them to work.

I think what we are missing is the equivalent of OculusProjectConfig, where you can specify HandTrackingSupport. For instance, I set this to HandsOnly when I build hand experiences with the Oculus Integration. What is the equivalent for the XR Interaction Toolkit?

I have the feature-usage code that pulls the bone info out, but it currently comes back null because the app does not have the proper permissions to use hand tracking.
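
In case it helps anyone debugging the same thing: before reading bones, it's worth checking whether any hand-tracking device is being reported at all, which at least separates "hand tracking not enabled for this build (permissions/manifest)" from "API returning bad data". As far as I understand, the Oculus Integration's HandTrackingSupport setting is what injects the hand-tracking permission into the Android manifest at build time, which is presumably the missing piece without it. A rough sketch:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

public class HandTrackingAvailabilityCheck : MonoBehaviour
{
    readonly List<InputDevice> hands = new List<InputDevice>();

    void Update()
    {
        InputDevices.GetDevicesWithCharacteristics(
            InputDeviceCharacteristics.HandTracking, hands);

        if (hands.Count == 0)
        {
            // Nothing reported at all: hand tracking is likely not enabled for this
            // build (missing manifest permission) or the controllers are still active.
            return;
        }

        foreach (var device in hands)
        {
            bool hasData = device.TryGetFeatureValue(CommonUsages.handData, out Hand hand)
                           && hand.TryGetRootBone(out Bone _);
            Debug.Log($"{device.name}: hand data available = {hasData}");
        }
    }
}
```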

3 Likes

@Matt_D_work any insight on my question above?

@Matt_D_work I'm chiming in here. The XR Interaction Toolkit looks very promising, but I don't get the feeling that it is being worked on continuously.
Is there a roadmap or any set plan for features like hand tracking, or for XR HMD support in general (such as HoloLens (2) and Magic Leap)?
It would be really nice to have one unified toolset for developing applications for all or most HMDs, and it's a charming idea that it could be Unity's "own" maintained solution… but for that, it would need some sort of reliable development lifecycle.

1 Like

It seems like hand tracking would first be offered at a lower-level implementation than XRIT.

Right now, perhaps the bigger issue is that you can't use Oculus hand tracking with the XR Management version of the plugin, only with the deprecated older plugin. I think we probably need to see it supported in the XR Management plugin before we can hope that it will find its way into XRIT.

Are there two things with that name? (I just logged a bug against all of OculusUsages being broken in the current XRIT, and… there's nothing in the OculusUsages class about hand tracking.)

The person I was replying to was talking about the Touch controllers' capacitive buttons.

@Matt_D_work Bumping this again because it's been a few more months and we still don't have Quest hand tracking through XR Management. Could you share some plans around this?

8 Likes

Bumping for the same reason: I'm setting up XR and want to add hand tracking.

3 Likes

Bumping this again, as hand tracking only becomes more important as time passes, and we still have zero feedback.

3 Likes

Bumping, as well. It would be nice to hear an update on hand tracking in the XR Interaction Toolkit.

2 Likes

When hand tracking comes to the XR Interaction Toolkit, I would love to see it also supported through iOS/AR Foundation, since iOS 14 now has it.

3 Likes

Anything new on this?

3 Likes