{BLOCKER} [IN-65556] Apple Vision Pro Hand Tracking Not Working Correctly

We have found that when using the XRHands (1.3.0) and VisionOS (0.7.1) packages in a project, the position of the hands during movement is almost a full hand’s width off from where the actual hand is located, and the rendering of the hand vacillates between different positions each frame (almost as if the Update position and pre-render position are completely different), drawing ghost hands. On top of that, joint position reporting randomly breaks entirely, with joints just appearing in mid-air. I have included two videos showing our results from testing a standard Unity 3D Core Template project re-configured for VR with the XRHands and VisionOS packages (no custom code; only Unity packages). Unfortunately, you cannot see the multiple render positions of the hand in the videos because the Apple Vision Pro does not capture at the frame rate of the game (you need to run on device to see that), but you can see joint position rendering breaking for the left hand in the second video.

Unity 2022.3.16f1
VisionOS Package 0.7.1
XRHands Package 1.3.0
Apple Vision Pro Device Seed 7
Bug Report IN-65556

XR Hands Test 1
XR Hands Test 2
AVP XR Hands Test Project


Hey there! Happy new year :slight_smile:

Unfortunately, this is an issue on the platform side. If you try this without Unity in the loop, you’ll see the same lag/offset. As far as I can tell, hand tracking is updated at a slower rate than rendering. I was able to do a quick test of this with Apple’s Happy Beam sample project. You can enable debug visuals on RealityKit entities using this debugger menu:

In this case I just used Axes to show axes on the RealityKit entities this sample puts on your fingertips. If you wave your hands around, you can see these entities lagging behind just as the joints do in Unity. This isn’t the most satisfying example, so I’ve been meaning to whip up a Swift app that actually visualizes all the joints like our HandVisualizer does. I’ll share that here when I get a chance.

This is a limitation of our HandVisualizer script. It should deactivate joint visuals whenever a joint reports that it is not tracked, but it does not. You should be able to check (joint.trackingState & XRHandJointTrackingState.Pose) != 0 to ensure the joint’s pose is tracked. If that condition is not met, deactivate the joint GameObject. I’ll get our sample updated to do this for the next release.
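A minimal sketch of that check might look like the following. The method and parameter names here are hypothetical; jointVisual stands in for whatever GameObject your visualizer draws for a given joint.

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

static class JointVisualUtil
{
    // Show the joint's visual only while its pose is actually tracked.
    public static void UpdateJointVisual(XRHandJoint joint, GameObject jointVisual)
    {
        // The Pose flag is set only when this joint reports a tracked pose.
        bool poseTracked = (joint.trackingState & XRHandJointTrackingState.Pose) != 0;
        jointVisual.SetActive(poseTracked);
    }
}
```

Calling this per joint each frame hides the floating "joints in mid-air" instead of rendering stale poses.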


Here’s a more definitive example of hand tracking using only Swift/RealityKit. The cubes are in the raw hand joint positions that ARKit provides, the same values that are read by Unity. The OS seems to apply some smoothing to the hand data that an application cannot turn off. You should share your feedback with Apple that this isn’t acceptable for your use case. Unfortunately there’s nothing we can do on the Unity side.


@mtschoen Do you happen to have any guesses as to why the rendered position of the hand (or the boxes in the native sample) jitters back and forth as the hand moves?

I think it’s because they are updated at a lower rate than the display. You could try adding Vector3.Lerp to the code updating transform positions, but unfortunately that will add even more lag/delay.
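As a rough sketch of what that Lerp smoothing looks like (using System.Numerics.Vector3 so it runs outside Unity; inside Unity you would call UnityEngine.Vector3.Lerp on the visual’s transform.position instead — all names and values here are illustrative):

```csharp
using System;
using System.Numerics;

public static class SmoothingDemo
{
    // Ease the rendered position toward the latest reported position.
    // t near 1 tracks tightly (less lag, more visible jitter); t near 0
    // smooths heavily (less jitter, more lag).
    public static Vector3 Smooth(Vector3 rendered, Vector3 reported, float t)
        => Vector3.Lerp(rendered, reported, t);

    public static void Main()
    {
        var rendered = Vector3.Zero;
        var reported = new Vector3(1f, 0f, 0f); // latest joint pose from the provider

        // Three display frames between hand-tracking updates: the visual
        // eases toward the stale pose instead of snapping to it.
        for (int frame = 0; frame < 3; frame++)
        {
            rendered = Smooth(rendered, reported, 0.5f);
            Console.WriteLine(rendered.X); // 0.5, 0.75, 0.875
        }
    }
}
```

The trade-off is exactly the one described above: the eased position always trails the true hand, so heavier smoothing means heavier lag.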


Yeah, I experimented with the smoothing in HandProcessor.cs of the XRHands package’s Hand Visualizer sample, and it makes the lag much worse without eliminating the jitter at all. The jitter during movement exists on the native side as well, so I am a bit confused about it.

I also attempted to find where the jitter is occurring but cannot seem to locate the difference through log output on the Unity side; I’m not familiar enough with Swift to poke around there too much, but it seems I should probably start learning. I added logging in the hand-update logic of the VisionOS hand provider on the Unity side, and it only gets new data from ARKit every third frame, returning false (no new data?) when polled on all other frames. :frowning:


Yeah, this is just a limitation of the platform. The hand anchor pose updates slower than frame rate and there’s no way to ask for faster pose updates. To avoid the flickering, you’ll need to do some sort of interpolation. It may be possible to “predict” intermediate poses based on velocity, but I’ve never had much luck with this kind of thing, personally.
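One way to sketch that velocity-based prediction (again with System.Numerics.Vector3 so it runs standalone; every name and number here is illustrative, and this is not a method either package provides):

```csharp
using System;
using System.Numerics;

public static class PredictionDemo
{
    // Estimate where a joint should be rendered now, given the last two
    // tracking samples and how long ago the latest sample arrived.
    public static Vector3 Predict(Vector3 previous, Vector3 latest,
                                  float sampleInterval, float timeSinceSample)
    {
        var velocity = (latest - previous) / sampleInterval;
        return latest + velocity * timeSinceSample;
    }

    public static void Main()
    {
        var previous = Vector3.Zero;
        var latest = new Vector3(0.03f, 0f, 0f); // hand moved ~3 cm between samples

        // One display frame (~11 ms at 90 Hz) after a ~33 ms sample interval:
        // the rendered position is pushed slightly ahead of the stale sample.
        var predicted = Predict(previous, latest, 0.033f, 0.011f);
        Console.WriteLine(predicted.X);
    }
}
```

The catch, consistent with the caveat above, is that linear extrapolation overshoots whenever the hand changes direction, which can read as a different kind of jitter.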


So I finally got a device and have been testing, and the hand tracking is quite jittery; the regular hand interactions we have been using on all other platforms are essentially unusable here. Any ideas or possible solutions we could try?


I have been trying to port the XRI Hands Interaction Demo v1.3.0 to run on the Vision Pro. The sample does not run out of the box but I have been able to get some things running on device.

Has anyone else made any progress porting this sample? My findings are below.


  1. The hands are rendered in the air about a full body height above where my hands actually are. I assume this is an issue with the XRRig or VolumeCamera positioning, but everything else in the scene is rendered in the proper location. When running in the Editor with the XR Simulator, the controllers are rendered in the proper position.

  2. As mentioned above, the hands are super jittery, and the default pose of my hands renders the fingers spread apart in a sort of grotesque way. In addition, I sometimes see strange joint positions cause fingers to be overly stretched out.

  3. The hand materials are opaque black, likely due to an incompatible shader.

Modifications Required To Work:

  1. Anchoring a VolumeCharacter to the CameraOffset underneath the XRRig Game Object.

  2. Offsetting the position of the table by 1.5 units or so to lift it off the floor and position it approximately where my real hands are.

  3. Converting the scene to the Universal Render Pipeline and converting all materials to work with URP.

  4. Converting all Text components to TMP components.


Here is link to a sample video taken from my port of the XRI Hands Interaction Demo on the Vision Pro. It shows the following problems:

XRI Hands Interaction Demo on VisionOS - Hand Tracking Issues

  1. Floating hands above the position of my actual hands
  2. Jittery positional tracking
  3. Incorrect joint positions causing distorted finger rendering

We fixed the “incorrect joint positions” temporarily by doing what Matt suggested earlier in this thread.

Hi there! I’d like to make sure we’ve addressed everything in this thread, as it’s been a little while and we’ve had the chance to ship some new packages. The issue with using XRI gestures should be solved in the 1.2.3 version of com.unity.xr.visionos. More info here.

I believe there have been some updates to XRI and the visionOS samples for XRI but I’m not entirely sure on that front. It’s definitely worth updating to the latest versions of things to see if you’re having the same issues.

For the issue of tracking latency/responsiveness, Apple introduced a new API for querying hand tracking with prediction which improves this situation. Unfortunately, it is only available on the visionOS 2.0 beta, and I don’t believe it will be backported, so users will need to be on visionOS 2.0 to take advantage of it. On the Unity side, this also means a minimum requirement of Unity 6. You can read more about visionOS 2 support here. Note that our package release for visionOS 2 support is not yet available, but it uses the new API and I can confirm that tracking is much better. I’ll ping this thread again when the packages are available for users to try in Unity 6 Preview.

Please let us know if you were able to update and resolve the issues you’re seeing. If not, since there are a number of different issues described in this thread, please make sure to identify which issue you’re talking about. Since the original post was quite a while ago, you may want to just start a new thread. Thanks!
