How are we supposed to access hand input in a fully immersive (VR) visionOS build? It seems like we're required to use the OpenXR XR plug-in in order to use the XR Hands package, but aren't we required to use the Apple visionOS XR plug-in for this platform instead? Or are we supposed to use OpenXR rather than the visionOS plug-in for VR builds targeting visionOS?
The XR Hands package was originally designed for OpenXR, but we have added support for visionOS as well, so you can keep using the Apple visionOS XR plug-in. (We still need to update the XR Hands package docs to reflect this.)
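For reference, once the XR Hands package is installed, hand data is read through `XRHandSubsystem` the same way regardless of which provider plug-in (OpenXR or Apple visionOS) supplies it. A minimal sketch, assuming the XR Hands 1.x API; the class name and the joint chosen here are just illustrative:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

public class HandJointReader : MonoBehaviour
{
    XRHandSubsystem m_Subsystem;

    void Update()
    {
        if (m_Subsystem != null && m_Subsystem.running)
            return;

        // Find whichever hand subsystem is running; on visionOS this is
        // provided by the Apple visionOS XR plug-in rather than OpenXR.
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        foreach (var subsystem in subsystems)
        {
            if (subsystem.running)
            {
                m_Subsystem = subsystem;
                m_Subsystem.updatedHands += OnUpdatedHands;
                break;
            }
        }
    }

    void OnUpdatedHands(
        XRHandSubsystem subsystem,
        XRHandSubsystem.UpdateSuccessFlags successFlags,
        XRHandSubsystem.UpdateType updateType)
    {
        // The event fires twice per frame; react only to the Dynamic
        // update to avoid doing the same work in BeforeRender as well.
        if (updateType != XRHandSubsystem.UpdateType.Dynamic)
            return;

        // Read one joint pose as an example (right index fingertip).
        var indexTip = subsystem.rightHand.GetJoint(XRHandJointID.IndexTip);
        if (indexTip.TryGetPose(out Pose pose))
            Debug.Log($"Right index tip at {pose.position}");
    }
}
```

The same component should work in an OpenXR build unchanged, since only the subsystem provider differs between platforms.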