Hand Tracking Jitter

We’re using the XR Hands package (and XRController to some extent) to let the user manipulate things with their hands. On Oculus Quest, even at the default settings, the hands are smooth and the user can do delicate manipulations successfully. But on a Vision Pro, the hands are very jittery, especially with respect to rotation. Our interactions are pretty unusable on visionOS at the moment because of the noisiness of the hand tracking.

Is this a limitation of the platform? Or is Unity doing something differently on this platform vs. Quest? Do we have any good ways to work around this?

Thanks!


AFAIK, we just pass along the data we receive from Apple’s APIs. Typically, if you need smoother motion, one option is to average the last N frames of input (start with N=2, then try larger values). This will increase smoothness, but add a degree of lag. If, as I suspect, this is not something unique to Unity, you might bring the issue up with Apple via their Feedback Assistant (perhaps requesting some degree of control over the trade-off between jitteriness and lag).
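Something along these lines is one way to do that averaging; this is just a rough sketch (the `JointSmoother` class, the `frameWindow` parameter, and how you feed it raw joint poses are illustrative, not anything built into XR Hands):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of "average the last N frames of input" for a single tracked joint.
// Larger frameWindow = smoother output, but roughly N-1 frames of extra lag.
public class JointSmoother
{
    private readonly int frameWindow;                          // N frames to average
    private readonly Queue<Vector3> positions = new Queue<Vector3>();
    private Quaternion smoothedRotation = Quaternion.identity;
    private bool hasRotation;

    public JointSmoother(int frameWindow = 2)
    {
        this.frameWindow = Mathf.Max(1, frameWindow);
    }

    // Call once per frame with the raw pose reported by hand tracking.
    public Pose Smooth(Vector3 rawPosition, Quaternion rawRotation)
    {
        positions.Enqueue(rawPosition);
        if (positions.Count > frameWindow)
            positions.Dequeue();

        // Arithmetic mean of the buffered positions.
        Vector3 averaged = Vector3.zero;
        foreach (var p in positions)
            averaged += p;
        averaged /= positions.Count;

        // Averaging quaternions directly is awkward, so blend incrementally:
        // slerp the running rotation toward the newest sample.
        if (!hasRotation)
        {
            smoothedRotation = rawRotation;
            hasRotation = true;
        }
        else
        {
            smoothedRotation = Quaternion.Slerp(smoothedRotation, rawRotation, 1f / frameWindow);
        }

        return new Pose(averaged, smoothedRotation);
    }
}
```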


Ya, I’m surprised too. I would think Apple would have better hand tracking than Meta, but Apple’s is both jittery and laggy. I left this feedback for Apple. (I didn’t address the lag because I think that’s just an inherent problem we gotta live with.)…
"
Problem: The hand tracking is jittery.

Context: I’m making a game in Unity for the Vision Pro. Unity says they just pass along the hand tracking data from Apple’s API.

Question: Is this jitteriness expected to improve or should I attempt to smooth out the jitteriness myself in Unity with custom scripts?

I included a video recording. There are issues with the finger tracking too, but for this feedback, I’m just focused on the jitteriness when my hand is still.
"

And here’s a Dropbox link to that video recording I sent them:

Very, very, very jittery, and I’m not sure how to remedy this short of higher-frequency hand tracking. I created some follow scripts that average frames and also tried the XR Transform Stabilizer from the XRI toolkit, but I can’t get an acceptable result (even with all the lag those attempts introduce). The hands, and thus direct hand interactions, are a jittery mess. I’ll leave feedback with Apple as well, but I wanted to express my frustration here: for a hands-only platform, this seems to have been overlooked by all parties.
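For reference, this is roughly the kind of follow script being described: an object trails a jittery source transform (e.g. one driven by a hand joint) using exponential smoothing. It’s a minimal sketch, not the XRI stabilizer; `source` and `smoothingHalfLife` are illustrative names I’ve made up for this example.

```csharp
using UnityEngine;

// Trails a jittery source transform with frame-rate-independent exponential smoothing.
// Larger smoothingHalfLife = smoother motion, but more perceived lag.
public class SmoothedFollower : MonoBehaviour
{
    [Tooltip("The raw, jittery transform to follow (e.g. a hand joint).")]
    public Transform source;

    [Tooltip("Seconds to close half the remaining gap to the source.")]
    public float smoothingHalfLife = 0.05f;

    void LateUpdate()
    {
        if (source == null) return;

        // t = 1 - 0.5^(dt / halfLife): half-life based smoothing factor.
        float t = 1f - Mathf.Exp(-Mathf.Log(2f) * Time.deltaTime / Mathf.Max(smoothingHalfLife, 1e-4f));

        transform.position = Vector3.Lerp(transform.position, source.position, t);
        transform.rotation = Quaternion.Slerp(transform.rotation, source.rotation, t);
    }
}
```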

Seems like hands are more stable in native Swift? I have seen some devs mention this, as well as some tests on video.

P.S.: Even when averaging lots of frames, the interactions still don’t feel smooth. I suspect two things are playing a role here: first, the Built-in Render Pipeline is simply not smooth on visionOS (I’m still trying to understand and find a solution to this, since it happens everywhere, even in empty sample scenes); second, the low-frequency hand tracking.