Hand tracking not working in builds!

Hello everyone,
I'm using the XR Interaction Toolkit hand tracking feature in my project, and I noticed that hand tracking does not work when I build the project. I read somewhere that I need to change the OpenXR backend type to legacy as a solution, but I can't find where to change this value. Where can I find it? Or, if there's any other way to make hand tracking work in a build, I'll take that too.
Regards.

I'm not sure if you are encountering the same thing as me, but here's the post I made.

Thank you @jimmyd14_1, but I don't think this is the same problem. The main issue I'm facing is that when I build my project and run it on an Oculus Quest 2, hand tracking does not work. This has happened twice: first when I was using the Oculus SDK for hand tracking, and now with the new XR Interaction Toolkit hand tracking. In both cases hand tracking does not work in a build (I'm building for the Windows platform). I saw in a thread somewhere that I need to change the OpenXR backend type to legacy, but I couldn't find where to change that value.

Hand tracking on PCVR with Oculus only works if developer mode is enabled in the Oculus software. Then it also works in a build.

Thank you for replying, @Qleenie, but it doesn't seem to work in a build. I tried several ways to solve this, building from different Unity versions and on different PCs, but it didn't work even in a development build.

So first, you need to make sure you have enabled the Hand Tracking Subsystem and the Meta Hand Tracking Aim extension on the Android build target.

If you want to test in the editor, you need to enable developer mode on your Oculus account and enable the OpenXR developer features in the PC app.

I have an example project with everything configured that should help.
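If it helps to verify what a build actually picked up, a quick runtime sanity check is to list the hand subsystems and log whether one is running. This is only a sketch, assuming the com.unity.xr.hands package (`UnityEngine.XR.Hands`) is installed; the class name is mine, and you'd attach it to any GameObject in the scene:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandSubsystemCheck : MonoBehaviour
{
    void Start()
    {
        // Ask Unity for every XRHandSubsystem instance that was created.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);

        if (subsystems.Count == 0)
        {
            // No subsystem at all usually means the Hand Tracking Subsystem
            // feature is not enabled in the OpenXR settings for this
            // build target.
            Debug.LogWarning("No XRHandSubsystem found in this build.");
            return;
        }

        foreach (var subsystem in subsystems)
            Debug.Log($"XRHandSubsystem found, running: {subsystem.running}");
    }
}
```

If the warning shows up only in the standalone build but not in the editor, that points at the build target's OpenXR settings rather than the headset or the Oculus app.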


Thank you @ericprovencher, I will try it and compare the two projects, and I'll let you know if it solves the problem. But I'm sure the hand subsystem is enabled on the Android build target.


Did you enable hands in your Quest settings on the device?

@ericprovencher yes, and it's working in the editor but not when I build it for PC.

Oh, that's normal. Hand tracking only works in the editor and in Android builds, not in standalone (PC) builds.

That makes it clear, but can I ask why, and whether it will be available in the future? I was using Oculus SDK hand tracking when I first faced this issue; now I'm using the XR Interaction Toolkit and the problem persists.

This is a question to direct to the Oculus forums. They support hand tracking in the editor to enable faster iteration, but supporting it in PC apps has not been a priority for them.

Funnily enough, though, if you compile your app for OpenXR on PC, you should be able to emulate hands with the OpenXR Toolkit. It modifies the runtime to surface features that engine integrations don't normally expose.

Thank you @ericprovencher, I managed to make it work. I think the problem was in the Oculus desktop app, where some features were not enabled. Now it works fine. Thanks a lot.


Is standalone hand tracking something that is going to be implemented in the future?

Sorry for not giving the last update on this thread. @bmdenny, I solved it as Eric said: I set up the developer options again, as in this screenshot, and everything works fine. Enabling "Developer Runtime Features" seems to have solved it. Hope this helps.


I'm facing the same problem using a Quest 3, but the desktop app shows only the first three settings, up to "Demo Mode". Is there another way to activate "Developer Runtime Features"?

Found the solution: a developer account is needed.


I’m running into the same problem as everyone else. Oculus hand tracking no longer works in the Unity Editor, which makes development of hand-tracking apps basically impossible.

In the Quest settings, I've got "Hand and body tracking" enabled, as well as "Auto Switch from Controllers to Hands". Hand tracking works on the Quest: I can put down the controllers and interact using my hands.

In the Oculus app on the PC side I’m using the Public Test Channel, and I’ve enabled “Developer runtime features”.

In Unity, my build target is Android, and I'm using the latest available packages, as shown in the screenshot below. I have the Oculus OpenXR runtime selected, and I've got the Hand Tracking Subsystem and Meta Hand Tracking Aim feature groups enabled, also shown in the screenshot below.

(the Android tab shows the same, and I’ve also got Meta Quest Support enabled there)

The hands are not being tracked. I do see the two menu icons following my hands, but I suspect those are being drawn by the headset. However, they do demonstrate that hand tracking is enabled on the Quest.

I've opened up the HandVisualizer script and added some debugging statements. The OnTrackingAcquired method is never called, and in the OnUpdatedHands method (which is called), the subsystem.leftHand.isTracked and subsystem.rightHand.isTracked values are always false.

Have I missed anything, or is this a bug in Unity or on the Oculus end?
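For anyone who wants to reproduce this kind of debugging without editing HandVisualizer, here is a rough sketch of a standalone component that does the same checks. It assumes the com.unity.xr.hands XRHandSubsystem API (its `trackingAcquired` and `updatedHands` events); the class and method names here are mine, not part of any package:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandTrackingDebug : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        // Already hooked up to a running subsystem? Nothing to do.
        if (m_Subsystem != null && m_Subsystem.running)
            return;

        // Find a running hand subsystem and subscribe to its events once.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
        {
            if (!subsystem.running)
                continue;
            m_Subsystem = subsystem;
            m_Subsystem.trackingAcquired += OnTrackingAcquired;
            m_Subsystem.updatedHands += OnUpdatedHands;
            break;
        }
    }

    void OnTrackingAcquired(XRHand hand) =>
        Debug.Log($"Tracking acquired: {hand.handedness}");

    void OnUpdatedHands(XRHandSubsystem subsystem,
                        XRHandSubsystem.UpdateSuccessFlags flags,
                        XRHandSubsystem.UpdateType updateType)
    {
        // If these stay false while the headset clearly tracks your hands,
        // the runtime is not delivering hand data to the app, e.g. because
        // "Developer Runtime Features" is disabled in the Oculus PC app.
        Debug.Log($"left tracked: {subsystem.leftHand.isTracked}, " +
                  $"right tracked: {subsystem.rightHand.isTracked}");
    }
}
```

If `updatedHands` fires but both `isTracked` flags stay false, that matches the symptom described above and points at the runtime configuration rather than the scene setup.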

When you're working in the editor, the OpenXR settings that are used come from the Standalone build target. Be sure to enable the Hand Tracking Subsystem and Meta Hand Tracking Aim extensions there as well.

When you say Standalone build target, do you mean the settings shown in the Android tab of OpenXR? If so, I have the feature groups enabled there as well; see the screenshot below.