Hand Tracking Jitter

We’re using the XR Hands package (and XRController to some extent) to let the user manipulate things with their hands. On Oculus Quest, even at the default settings, the hands are smooth and the user can successfully do delicate manipulations. But on a Vision Pro, the hands are very jittery, especially with respect to rotation. Our interactions are pretty unusable on visionOS at the moment because of the noisiness of the hand tracking.

Is this a limitation of the platform? Or is Unity doing something differently on this platform vs. Quest? Do we have any good ways to work around this?

Thanks!

2 Likes

AFAIK, we just pass along the data we receive from Apple’s APIs. Typically, if you need smoother motion, one option is to average the last N frames of input (start with N=2, then try larger values). This will increase smoothness, but add a degree of lag. If, as I suspect, this is not something unique to Unity, you might bring the issue up with Apple via their Feedback Assistant (perhaps requesting some degree of control over the trade-off between jitteriness and lag).
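To make that concrete, here’s a minimal sketch of N-frame averaging for a single pose. The class name and the way you feed samples in are made up for illustration; this isn’t part of XR Hands.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal N-frame moving average for a tracked pose.
// Feed it the raw joint pose each frame and read back the smoothed result.
public class PoseMovingAverage
{
    readonly int windowSize;
    readonly Queue<Vector3> positions = new Queue<Vector3>();
    readonly Queue<Quaternion> rotations = new Queue<Quaternion>();

    public PoseMovingAverage(int windowSize = 2)
    {
        this.windowSize = Mathf.Max(1, windowSize);
    }

    public void AddSample(Vector3 position, Quaternion rotation)
    {
        positions.Enqueue(position);
        rotations.Enqueue(rotation);
        while (positions.Count > windowSize)
        {
            positions.Dequeue();
            rotations.Dequeue();
        }
    }

    public Vector3 SmoothedPosition
    {
        get
        {
            if (positions.Count == 0)
                return Vector3.zero;

            var sum = Vector3.zero;
            foreach (var p in positions)
                sum += p;
            return sum / positions.Count;
        }
    }

    // Rotations can't simply be summed; incrementally Slerp toward each
    // sample instead, which approximates an average for nearby orientations.
    public Quaternion SmoothedRotation
    {
        get
        {
            var result = Quaternion.identity;
            int i = 0;
            foreach (var r in rotations)
            {
                i++;
                result = (i == 1) ? r : Quaternion.Slerp(result, r, 1f / i);
            }
            return result;
        }
    }
}
```

Larger window sizes smooth more but lag more, which is exactly the trade-off mentioned above.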

3 Likes

Yeah, I’m surprised too. I would have thought Apple would have better hand tracking than Meta, but Apple’s is both jittery and laggy. I left this feedback for Apple. (I didn’t address the lag because I think that’s just an inherent problem we have to live with.)…
"
Problem: The hand tracking is jittery.

Context: I’m making a game in Unity for the Vision Pro. Unity says they just pass along hand tracking data from Apple’s API.

Question: Is this jitteriness expected to improve or should I attempt to smooth out the jitteriness myself in Unity with custom scripts?

I included a video recording. There are issues with the finger tracking too, but for this feedback, I’m just focused on the jitteriness when my hand is still.
"

And here’s a Dropbox link to that video recording I sent them:

Very, very jittery here too, and I’m not sure how to remedy it short of higher-frequency hand tracking. I created some follow scripts that average frames and also tried the XR Transform Stabilizer from the XRI toolkit, but I haven’t been able to get an acceptable result (even with all the lag those attempts introduce). The hands, and therefore hand direct interactions, are a jittery mess. I’ll leave feedback with Apple as well, but I wanted to express my frustration here: for a hands-only platform, this seems to have been overlooked by all parties.

It seems like hands are more stable in native Swift? I’ve seen some devs mention this, as well as some tests on video.

P.S.: Even when averaging lots of frames, the interactions still don’t feel smooth. I suspect two things are at play here: first, the Built-in Render Pipeline simply isn’t smooth on visionOS (I’m still trying to understand and solve this, since it happens even in empty sample scenes); second, the low-frequency hand tracking.

There’s visible jitter in the sample project (if you check the box in TrackedHandManager to show fingertips). The sphere following the fingertip seems to flicker through 3 distinct positions. To me it looks like the poses returned by the hand subsystem cycle through the last 3 poses instead of just returning the latest pose.

2 Likes

I’ll mention that in the latest visionOS update we can hide the real passthrough hands, which lets me see the jitter more clearly. Moving my fingers (the joints) isn’t bad; it’s when I move my whole hand/arm around that the translation of the hand is jittery. I’m curious why that is.

2 Likes

Apple updates hand tracking at 30 FPS, which is obviously slower than the rendered frame rate. I suppose they expect you to write your own script that interpolates between hand tracking updates if you want it to look smooth. At least, that’s what I did… well, ChatGPT did most of the work. Haha (A rough sketch of the interpolation idea is below.)
There is still the issue of hand tracking not being accurate enough to shoot a gun in VR with better than stormtrooper accuracy. But I suppose we’ll have to wait for the Apple Vision Pro 2 for hand tracking to be more accurate.
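In case it helps anyone, here’s roughly the shape of that interpolation script. This is only a sketch: the component name is made up, the 30 Hz interval is an assumption, and `PushSample` is just wherever you would feed in the latest joint pose you read from XR Hands.

```csharp
using UnityEngine;

// Interpolates a visual transform between discrete hand-tracking samples
// (assumed here to arrive at roughly 30 Hz) so it moves every rendered frame.
public class JointInterpolator : MonoBehaviour
{
    [Tooltip("Expected interval between tracking samples, in seconds (~1/30).")]
    public float sampleInterval = 1f / 30f;

    Vector3 fromPos, toPos;
    Quaternion fromRot = Quaternion.identity, toRot = Quaternion.identity;
    float sampleTime;
    bool hasSample;

    // Call this whenever you read a fresh pose from the hand subsystem.
    public void PushSample(Vector3 position, Quaternion rotation)
    {
        // Start the next blend from wherever the visual currently is,
        // so an early or late sample doesn't cause a pop.
        fromPos = hasSample ? transform.position : position;
        fromRot = hasSample ? transform.rotation : rotation;
        toPos = position;
        toRot = rotation;
        sampleTime = Time.time;
        hasSample = true;
    }

    void Update()
    {
        if (!hasSample)
            return;

        float t = Mathf.Clamp01((Time.time - sampleTime) / sampleInterval);
        transform.position = Vector3.Lerp(fromPos, toPos, t);
        transform.rotation = Quaternion.Slerp(fromRot, toRot, t);
    }
}
```

Note that blending toward the latest sample like this trades roughly one sample (~33 ms) of extra latency for the smoothness.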

I understand that it only updates at 30 fps, but that doesn’t fully explain what I’m seeing. I would expect it to look “choppy”, but instead it seems to be flickering in 3 places at the same time. In the template visionOS project, the (fully opaque) sphere that follows the fingertip looks something like this as I wave my hand around:
[image: the fingertip sphere rendered in three overlapping positions]

2 Likes

I see what you mean. Hmm, not sure why it does that.

Yeah, agreed. It’s jumpy and looks like that.

Thanks for the discussion; I’m seeing something similar. I don’t have experience with Quest, but I suspect the jitter problem on Vision Pro is due to tracking loss (e.g. TryGetPose probably returns false for those glitchy frames, then true again once tracking succeeds).

I believe the raw data is probably similar on Quest, but they likely do some kind of guessing based on a model (or maybe just simple extrapolation from the last velocity and the frame time), so that even when tracking is lost, it still tries to predict the joint locations.

But this is just my total guess.

Hi there! I see there continues to be some healthy discussion around this issue. :slight_smile:

I think I’ve shared this elsewhere on this forum, but I encourage folks to try out a sample app I created using Swift and RealityKit to visualize ARKit skeletal hand tracking. Based on what I see in Unity, the jitter/flicker/lag issues all appear identical to what I’m seeing in Swift, without Unity in the loop. This is unfortunate, but it’s not something that we can (or even probably should!) fix in our XR plugin or integration packages like XR Hands. Any smoothing or filtering we apply will inevitably come with trade-offs, and that’s not a decision we can make for all customers in every situation.

There are various techniques that you could try, like using a simple Vector3.Lerp to smooth out position updates, all the way to more advanced approaches like tracking the velocity of each joint and “predicting” their future locations for frames where there is no new tracking data. I would encourage folks to share any solutions you come up with in order to save others from duplicating your efforts. Please also share this feedback with Apple to boost the signal that they need to improve the quality of ARKit hand tracking.
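As a starting point only, here is a sketch of what a per-joint smoother with simple velocity prediction might look like. This is not an official sample or an XR Hands API; the class name, the `smoothing` parameter, and the idea of passing `null` on frames with no new tracking data are all assumptions for illustration.

```csharp
using UnityEngine;

// Sketch of per-joint smoothing plus simple velocity-based prediction:
// when a fresh sample arrives, blend toward it with a Lerp; when no new
// tracking data arrives this frame, extrapolate along the last velocity.
public class JointSmoother
{
    Vector3 position;
    Vector3 velocity;
    bool initialized;

    // newSample: the raw joint position, or null if tracking produced nothing this frame.
    public Vector3 Update(Vector3? newSample, float deltaTime, float smoothing = 0.5f)
    {
        if (newSample.HasValue)
        {
            if (!initialized)
            {
                position = newSample.Value;
                initialized = true;
            }
            else
            {
                var previous = position;
                // Blend toward the fresh sample; lower 'smoothing' = less lag, more jitter.
                position = Vector3.Lerp(newSample.Value, position, smoothing);
                velocity = deltaTime > 0f ? (position - previous) / deltaTime : Vector3.zero;
            }
        }
        else if (initialized)
        {
            // No fresh data this frame: predict forward along the last observed velocity.
            position += velocity * deltaTime;
        }
        return position;
    }
}
```

The smoothing strength and the rule for deciding when a frame counts as “no new data” are exactly the kinds of trade-offs that need tuning per use case, which is why we hesitate to bake one answer into the packages.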

I’m hoping that we’ll see improvements come in future versions of visionOS. It may also be possible for us at Unity to provide a sample for how to smooth out these issues. We have to weigh this type of work against other priorities, but feel free to submit it as an idea on the visionOS roadmap so that other users can vote on it and we can consider it alongside other requested features.

1 Like

Thank you so much, @mtschoen!

I think all we want is for the Unity XR Hands package to implement various built-in smoothing techniques like lerp or prediction, so we developers just need to check/uncheck each box and see which combination gives the best result, haha :grin:

That being said, from the discussion above it sounds like Quest 3 has overall better hand tracking. But do you know whether that’s really because the raw sensor data from the Vision Pro is lower quality than Quest 3’s, or because Quest 3 already has a built-in layer that pre-smooths the tracking data? I’d be a bit surprised if the Vision Pro’s raw data were worse quality, given the much higher price and the many more sensors it has!

Noted… :wink:

As I said, any smoothing would come with tradeoffs, and rather than building it into the XR Hands package (whose responsibility is to surface raw platform data), I think it would make more sense to build it into the visualizer/consumer layer, where you can start to make domain-specific assumptions. For example, driving a skinned “glove” mesh will have very different requirements from a “magic spell” particle effect or something more abstract that tracks a user’s hands.

To be sure, our HandVisualizer and HandTrackingManager scripts aren’t a comprehensive solution. I think there are some folks working on adding these kinds of filters and utilities to XRI, or maybe there already is code available that you could use to help, and I’m just not aware of it. We’ve been discussing the issue internally and I’ll report back when we have more guidance on smoothing.

To be honest, I have no idea why there is such a big difference in quality. All I can do is speculate, like everybody else. :upside_down_face:

1 Like

Actually, @mtschoen, sorry to bother you, but would you mind sharing a similarly simple hand tracking project in Unity?

Because I tried your sample app, and I do notice some differences, especially for the ring fingers! So I’m wondering if I’m doing something wrong or if there actually is a difference.

@mtschoen, I actually tried the HandVisualizer (again).

And I can confirm there is a difference between the RealityKit app and the Unity HandVisualizer project (especially for the ring fingers, as I mentioned earlier)!

Let me prepare and post some videos to show the comparisons.

1 Like

Thanks for sharing that app, @mtschoen. The hand tracking jitter is horrendous on device; I’m very surprised this is what’s available by default from Apple. Can you submit this app to the Apple team for review? This is obviously an issue.

Also, I want to point out something I noticed even in Unity: if you keep your arm still and only wiggle your fingers, the tracking isn’t that awful. The problem happens when translating your whole hand through space.

Yeah I’m pretty sure they’ve seen it, and they’re aware of the issue. The more people who submit feedback through their Feedback Assistant (only once per person, please, and be nice :slight_smile:), the stronger a signal they’ll get about this.

2 Likes

I’m using the XR Transform Stabilizer from XRIT, and it almost eliminates the jittering, at the cost of some lag, of course.

2 Likes