Accessing raw ARKit 3D body tracking data in AR Foundation

Hi everyone,

I’ve been comparing body tracking accuracy between AR Foundation and native ARKit in Swift. Body tracking with native ARKit tends to be more accurate and responsive than with AR Foundation.

For my project, I am particularly interested in tracking the positions of the wrists. I was hoping to access the raw ARKit joint positions directly, as it seems like AR Foundation may be fitting the data to an internal model in a way that affects precision.

I found that AR Foundation allows access to raw 2D pose tracking data, as detailed in the documentation. However, I couldn’t find a similar method for accessing raw 3D pose tracking data.
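For reference, this is roughly how we read the 2D data today. It's only a sketch, assuming an ARHumanBodyManager in the scene with 2D pose estimation enabled:

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class Pose2DReader : MonoBehaviour
{
    [SerializeField] ARHumanBodyManager m_HumanBodyManager;

    void Update()
    {
        // Ask the manager for the 2D joints detected in the current frame.
        NativeArray<XRHumanBodyPose2DJoint> joints =
            m_HumanBodyManager.GetHumanBodyPose2DJoints(Allocator.Temp);

        if (!joints.IsCreated)
            return;

        foreach (var joint in joints)
        {
            if (joint.tracked)
                Debug.Log($"2D joint {joint.index}: {joint.position}");
        }

        joints.Dispose();
    }
}
```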

Is there a way to directly access the raw ARKit 3D joint data using AR Foundation, or would I need to explore alternative approaches to achieve this level of accuracy?

Thanks for any insights you might have!

Best regards,
Mentar

Did you look at GetHumanBody? You’ve got all the joints in there.

We don’t do any post-processing of the joint data by the way.
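Roughly, reading the per-joint 3D data looks like this. A minimal sketch, assuming an ARHumanBodyManager in your scene with 3D body tracking enabled; joint indices follow ARKit's skeleton definition, so you would look up the hand/wrist indices there (you can also fetch a specific body via ARHumanBodyManager.GetHumanBody):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class BodyJointReader : MonoBehaviour
{
    [SerializeField] ARHumanBodyManager m_HumanBodyManager;

    void OnEnable()  => m_HumanBodyManager.humanBodiesChanged += OnHumanBodiesChanged;
    void OnDisable() => m_HumanBodyManager.humanBodiesChanged -= OnHumanBodiesChanged;

    void OnHumanBodiesChanged(ARHumanBodiesChangedEventArgs args)
    {
        foreach (ARHumanBody body in args.updated)
        {
            NativeArray<XRHumanBodyJoint> joints = body.joints;
            if (!joints.IsCreated)
                continue;

            foreach (XRHumanBodyJoint joint in joints)
            {
                if (!joint.tracked)
                    continue;

                // anchorPose is relative to the body anchor; convert it to
                // world space through the ARHumanBody's transform.
                Vector3 worldPosition =
                    body.transform.TransformPoint(joint.anchorPose.position);
                Debug.Log($"Joint {joint.index}: {worldPosition}");
            }
        }
    }
}
```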

Yup, that’s what we are using already. We compared it side by side with the data coming from native ARKit, and they are different. Is there an internal skeletal model that AR Foundation is trying to fit the ARKit data to?

No. If you believe the joints are different, I would recommend filing a bug with your repro project. We have run this test ourselves a few times over the past couple of years, and your claim does not match our findings.

I think you may be right. It’s hard to do a true side-by-side comparison, since the data is either going into AR Foundation or being drawn in the native iOS app, but not both at once.

We did a side-by-side (subjective) comparison of the data from GetHumanBody and native ARKit. It is indeed ARKit itself tracking poorly and then passing that data on to AR Foundation.

Thanks for looking into it and helping us track down the issue.
