We have found that when using the XRHands (1.3.0) and VisionOS (0.7.1) packages in a project, the position of the hands during movement is almost a full hand's width off from where the actual hand is located, and the rendering of the hand each frame quickly vacillates between different positions (almost as if the Update position and Pre-Render position are completely different), drawing ghost hands. On top of that, joint position reporting randomly breaks completely, with joints just appearing in mid-air. I have included two videos showing our results from testing a standard Unity 3D Core Template project re-configured for VR with the XRHands and VisionOS packages (no custom code; only Unity packages). Unfortunately, you cannot see the multiple render positions of the hand in the videos because the Apple Vision Pro does not capture at the frame rate of the game (you need to run on device to see that), but you can see the joint position rendering break for the left hand in the second video.
VisionOS Package 0.7.1
XRHands Package 1.3.0
Apple Vision Pro Device Seed 7
Bug Report IN-65556
Unfortunately, this is an issue on the platform side. If you try this without Unity in the loop, you’ll see the same lag/offset. As far as I can tell, hand tracking is updated at a slower rate than rendering. I was able to do a quick test of this with Apple’s Happy Beam sample project. You can enable debug visuals on RealityKit entities using this debugger menu:
In this case I just used Axes to show axes on the RealityKit entities this sample puts on your fingertips. If you wave your hands around, you can see these entities lagging behind just as the joints do in Unity. This isn't the most satisfying example, so I've been meaning to whip up a Swift app that actually visualizes all the joints, like our HandVisualizer does. I'll share that here when I get a chance.
This is a limitation of our HandVisualizer script. It should deactivate joint visuals whenever the joint reports that it is not tracked, but it does not. You should be able to check (joint.trackingState & XRHandJointTrackingState.Pose) != 0 to ensure the joint is tracked. If that condition is not met, deactivate the joint GameObject. I'll get our sample updated to do this for the next release.
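For anyone patching this locally before the sample is updated, here is a minimal sketch of that check, assuming you already have a per-joint visual GameObject (the helper name and wiring are illustrative, not the actual sample code):

```csharp
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative helper: hide a joint's visual GameObject when the joint's
// pose is not tracked, and show/reposition it when it is.
static void UpdateJointVisual(XRHandJoint joint, GameObject jointVisual)
{
    // The Pose flag is only set when the joint's pose data is valid.
    bool poseTracked =
        (joint.trackingState & XRHandJointTrackingState.Pose) != 0;

    if (poseTracked && joint.TryGetPose(out Pose pose))
    {
        jointVisual.SetActive(true);
        jointVisual.transform.SetPositionAndRotation(pose.position, pose.rotation);
    }
    else
    {
        // Deactivate rather than leaving the visual floating at a stale position.
        jointVisual.SetActive(false);
    }
}
```

Note that TryGetPose reports the pose in the subsystem's session space, so if your visuals live under an XR Origin you would apply the usual origin transform before setting the position.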
Here’s a more definitive example of hand tracking using only Swift/RealityKit. The cubes are in the raw hand joint positions that ARKit provides, the same values that are read by Unity. The OS seems to apply some smoothing to the hand data that an application cannot turn off. You should share your feedback with Apple that this isn’t acceptable for your use case. Unfortunately there’s nothing we can do on the Unity side.
I think it’s because they are updating at a lower frame rate than the display. You could try adding Vector3.Lerp to the code updating transform positions, but unfortunately that will add even more lag/delay.
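For reference, here is a minimal sketch of what that Lerp-based smoothing might look like, assuming a per-joint component that something else feeds the latest reported pose (the class and member names are illustrative):

```csharp
using UnityEngine;

// Illustrative smoothing: ease each joint visual toward the latest
// reported pose instead of snapping to it. This hides frame-to-frame
// jumps, but as noted above it trades jitter for added latency.
public class SmoothedJointVisual : MonoBehaviour
{
    [Tooltip("Higher values track the target faster but smooth less.")]
    public float smoothingSpeed = 20f;

    Vector3 targetPosition;
    Quaternion targetRotation = Quaternion.identity;

    // Called by whatever code polls XR Hands each frame.
    public void SetTargetPose(Pose pose)
    {
        targetPosition = pose.position;
        targetRotation = pose.rotation;
    }

    void Update()
    {
        // Frame-rate-independent exponential smoothing factor.
        float t = 1f - Mathf.Exp(-smoothingSpeed * Time.deltaTime);
        transform.position = Vector3.Lerp(transform.position, targetPosition, t);
        transform.rotation = Quaternion.Slerp(transform.rotation, targetRotation, t);
    }
}
```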
Yeah, I experimented with the smoothing in the HandProcessor.cs of the XRHands package Hand Visualizer sample, and it makes the lag much worse while not eliminating the jitter at all. The jitter during movement exists on the native side as well, so I am a bit confused by it.
I also attempted to find where the jitter is occurring, but cannot seem to locate the difference through log output on the Unity side; I'm not familiar enough with Swift to poke around there too much, but it seems I should probably start learning. I added logging in the hands update logic of the VisionOS hand provider on the Unity side, and it is only getting new data from ARKit every third frame, returning false (no new data?) when polled on all other frames.
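For anyone who wants to reproduce that measurement without touching the provider, here is a hedged sketch of equivalent logging using the public XRHands 1.3.0 API (the component name is made up; the subsystem lookup and event signature are as I understand them from that package):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.Hands;

// Illustrative: count how often the hand subsystem actually delivers
// new data relative to rendered frames.
public class HandUpdateRateLogger : MonoBehaviour
{
    XRHandSubsystem subsystem;
    int frames;
    int framesWithNewData;

    void Start()
    {
        var subsystems = new List<XRHandSubsystem>();
        SubsystemManager.GetSubsystems(subsystems);
        if (subsystems.Count > 0)
        {
            subsystem = subsystems[0];
            subsystem.updatedHands += OnUpdatedHands;
        }
    }

    void OnUpdatedHands(XRHandSubsystem s,
                        XRHandSubsystem.UpdateSuccessFlags flags,
                        XRHandSubsystem.UpdateType type)
    {
        // Any non-None flag means at least some new hand data arrived.
        if (flags != XRHandSubsystem.UpdateSuccessFlags.None)
            framesWithNewData++;
    }

    void Update()
    {
        frames++;
        if (frames % 300 == 0)
            Debug.Log($"Hand data updates: {framesWithNewData}/{frames} frames");
    }
}
```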
Yeah, this is just a limitation of the platform. The hand anchor pose updates slower than frame rate and there’s no way to ask for faster pose updates. To avoid the flickering, you’ll need to do some sort of interpolation. It may be possible to “predict” intermediate poses based on velocity, but I’ve never had much luck with this kind of thing, personally.
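One way to sketch the velocity-based prediction mentioned above (illustrative only; constant-velocity extrapolation smooths out the slow update rate but can overshoot on quick direction changes, which may be why it never works that well):

```csharp
using UnityEngine;

// Illustrative constant-velocity extrapolation for a joint pose that
// only updates every few frames. Between updates, the visual keeps
// moving along the last observed velocity instead of holding still.
public class ExtrapolatedJointVisual : MonoBehaviour
{
    Vector3 lastPosition;
    Vector3 velocity;
    float lastUpdateTime;
    bool hasSample;

    // Call this only on frames where the provider reports new data.
    public void OnNewJointPosition(Vector3 position)
    {
        float now = Time.time;
        if (hasSample && now > lastUpdateTime)
            velocity = (position - lastPosition) / (now - lastUpdateTime);

        lastPosition = position;
        lastUpdateTime = now;
        hasSample = true;
    }

    void Update()
    {
        if (!hasSample)
            return;

        // Predict where the joint should be by now, given its last velocity.
        float elapsed = Time.time - lastUpdateTime;
        transform.position = lastPosition + velocity * elapsed;
    }
}
```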
So I finally got a device and am testing, and the hand tracking is quite jittery; the standard hand interactions we have been using on all other platforms are essentially unusable here. Any ideas or possible solutions we could try?
I have been trying to port the XRI Hands Interaction Demo v1.3.0 to run on the Vision Pro. The sample does not run out of the box but I have been able to get some things running on device.
Has anyone else made any progress porting this sample? My findings are below.
The hands are being rendered in the air, about a full body height above where my hands actually are. I assume this is an issue with the XRRig or VolumeCamera positioning, but everything else in the scene is rendered in the proper location. When running in the Editor with the XR Simulator, the controllers are rendered in the proper position.
As mentioned above, the hands are super jittery, and the default pose of my hands renders the fingers spread apart in a somewhat grotesque way. In addition, I sometimes see strange joint positions that cause fingers to be overly stretched out.
The hand materials render as opaque black, likely due to an incompatible shader.
Modifications Required To Work:
Anchoring a VolumeCharacter to the CameraOffset underneath the XRRig GameObject.
Offsetting the position of the table upward by about 1.5 meters to lift it off the floor and place it approximately where my real hands are.
Converting the scene to the Universal Render Pipeline and converting all materials to work with URP.