HoloLens2: XR Interaction Toolkit - Hands Interaction Demo only shows controllers

I’m currently developing a HoloLens 2 application on Windows, using:
Unity 2022.3.19f1
OpenXR Plugin 1.9.1
Mixed Reality OpenXR Plugin 1.10.1
XR Hands 1.4.1
XR Interaction Toolkit 3.0.4

When deployed to the HoloLens, the app tracks my hands but only shows controller visuals. I don’t have any controllers for the HoloLens and can’t get it to show hands. Poking still mostly works through the controller visuals, but the Ray Interactor expects a controller button press, and hand gestures don’t work. Is there a way to force hands only, since I’ll never use controllers on the HoloLens?
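For context, the closest I’ve gotten is manually disabling the controller objects in my rig at startup, along these lines (a rough sketch only; the GameObject references are whatever controller/hand visuals exist in your own XR Origin, assigned in the Inspector, not specific objects from the demo):

```csharp
using UnityEngine;

// Rough workaround sketch: force a hands-only rig by disabling the
// controller GameObjects and keeping only the hand GameObjects enabled.
// The fields below are assigned in the Inspector to whatever controller
// and hand objects your XR Origin actually contains.
public class ForceHandsOnly : MonoBehaviour
{
    [SerializeField] GameObject[] controllerObjects; // e.g. left/right controller visuals
    [SerializeField] GameObject[] handObjects;       // e.g. left/right hand visuals

    void Start()
    {
        foreach (var go in controllerObjects)
            if (go != null) go.SetActive(false);

        foreach (var go in handObjects)
            if (go != null) go.SetActive(true);
    }
}
```

That feels like a hack, though, so I’d prefer a supported way to do it.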

Thanks!


Can you send some more details about your project settings? Specifically, XR Plug-in Management > OpenXR > UWP > Interaction Profiles and the enabled feature sets. For HoloLens, you should have the Microsoft Hand Interaction Profile active as well as the Hand Tracking checkbox under the OpenXR features.

We did introduce some new hand/controller switching with the Hand Modality Manager in XRI 3.0.4, so if this previously worked (in 3.0.3, for example), you may have encountered an edge case.
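In the meantime, one quick way to narrow it down is to check at runtime whether the XR Hands subsystem is actually delivering tracking data on the device; if it is, the problem is more likely on the modality-switching side than on the hand-tracking side. A minimal diagnostic sketch (this assumes the XR Hands package from your version list, and is just a probe, not part of any fix):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Minimal diagnostic: finds a running XRHandSubsystem and logs whenever
// the tracked state of either hand changes. Attach to any GameObject in
// the scene and watch the player log on device.
public class HandTrackingProbe : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;
    bool m_LastLeft;
    bool m_LastRight;

    void Update()
    {
        // Lazily find a running hand subsystem.
        if (m_Subsystem == null || !m_Subsystem.running)
        {
            var subsystems = new List<XRHandSubsystem>();
            SubsystemManager.GetSubsystems(subsystems);
            if (subsystems.Count > 0)
                m_Subsystem = subsystems[0];
            return;
        }

        bool left = m_Subsystem.leftHand.isTracked;
        bool right = m_Subsystem.rightHand.isTracked;
        if (left != m_LastLeft || right != m_LastRight)
        {
            Debug.Log($"Hand tracking changed - left: {left}, right: {right}");
            m_LastLeft = left;
            m_LastRight = right;
        }
    }
}
```

If the log never reports a tracked hand, the issue is in the OpenXR/feature setup rather than in XRI.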

Thank you for your reply!

I currently have the following enabled:
Microsoft Hand Interaction Profile
Hand Interaction Profile (added because of: Hand rays in the Hand Interaction Demo not working correctly)

I currently have the following checkboxes active:
Hand Tracking
Hand Interaction Poses
Hand Tracking Subsystem

Hello @dklee7, did you ever fix this issue, please?