Hello, I am using the Building Blocks provided with the Meta XR Interaction SDK as an easy solution for hand tracking on my Quest 3. In particular, I am using the experimental “Virtual Hands” block (it may be experimental, but it works a lot better for me than the standard Hands Building Block).
Basic hand tracking works very well, but I am now trying to implement teleportation and rotation (again, using the Building Blocks), and I am running into inconsistent pose recognition, particularly for the teleportation pose (a finger pistol pointed sideways).
The pose often fails to be recognized right after the project initializes, and also after the hand making the pose has left the headset's view and then returned. I can understand why those situations might be tricky, but I am also frequently seeing the pose drop after a few teleports, even when my hand stays in nearly the same spot in front of the camera. On top of that, when I try to re-form the pose so the teleport interactor reactivates, it usually takes several attempts, even though I am making the pose correctly and it is clearly visible to the headset.
If this is just a limitation of where the technology currently is, I can at least be satisfied that there is nothing more I can do. However, if there are settings I can adjust to improve pose recognition, I am all ears.
By the way, I am in a well-lit room and have a fair amount of space around me. Any and all help or direction is greatly appreciated. Thank you for your patience.