How to mix usage of AR Foundation and Meta XR SDKs?

I’m developing a new app that needs to run on Apple Vision Pro, Meta Quest Pro, and Meta Quest 3.

I’ve got the application running successfully on each of these platforms using AR Foundation APIs and the Unity OpenXR: Meta package.

Now I’m looking to incorporate the Quest Pro’s eye and face tracking capabilities. Is it possible to do that without migrating the existing features (head tracking, hand tracking, interactions) from OpenXR to their com.meta.xr.sdk equivalents?

I’m particularly confused because the official Oculus Unity-Movement sample repo says “Unity-Movement is a package that uses OpenXR’s tracking layer APIs to expose Meta Quest Pro’s Body Tracking (BT), Eye Tracking (ET), and Face Tracking (FT) capabilities.” That wording suggests I should be able to get the tracking data via OpenXR APIs, but it sounds like I can’t, and instead have to use the com.meta.xr.sdk.interaction APIs. Is that right?

What’s the right way to integrate Meta-specific SDKs on top of a common ARFoundation core?

Cross-posted to the Meta VR developer forums: https://communityforums.atmeta.com/t5/Unity-VR-Development/How-to-mix-usage-of-ARFoundation-and-Meta-XR-SDKs/m-p/1195892#M24359

Hi @lourd, this is a great question.

The short answer is that Unity would like to surface Quest eye and face tracking data through AR Foundation, but, as you know, our current Unity OpenXR: Meta package does not yet support face or eye tracking. You can upvote this feature by going to Unity’s XR Roadmap, selecting AR Foundation, and scrolling down to Quest Eye Tracking, which sits in the Under Consideration section at the time of this writing.

Our product team really does look at these votes and feature requests, so we appreciate any activity you’d like to give us over there. Until this is implemented, you are probably best served by using the Meta Interaction SDK APIs for these features.
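
For illustration, here is a minimal sketch of what that mixed setup might look like: Meta’s components supply the eye and face data while the rest of the app stays on AR Foundation and OpenXR. It assumes Meta’s Core SDK package (com.meta.xr.sdk.core, which ships the OVRFaceExpressions and OVREyeGaze components) is installed alongside your existing stack and that eye/face tracking permissions have been granted on device; treat it as a sketch, not a drop-in solution.

```csharp
using UnityEngine;

// Sketch: read Quest Pro eye/face data through Meta's OVR components while
// the rest of the app stays on AR Foundation / XR Interaction Toolkit.
public class MetaFaceEyeReader : MonoBehaviour
{
    [SerializeField] OVRFaceExpressions faceExpressions; // Meta XR Core SDK component providing blendshape weights
    [SerializeField] OVREyeGaze leftEyeGaze;             // Meta XR Core SDK component that rotates its transform

    void Update()
    {
        // Face tracking: per-expression weights in [0, 1].
        if (faceExpressions != null && faceExpressions.ValidExpressions &&
            faceExpressions.TryGetFaceExpressionWeight(
                OVRFaceExpressions.FaceExpression.JawDrop, out float jawDrop))
        {
            // Feed the weight into your own avatar/blendshape pipeline here.
            Debug.Log($"JawDrop weight: {jawDrop:F2}");
        }

        // Eye tracking: OVREyeGaze applies the tracked rotation to the
        // transform it is attached to, so read the gaze from that transform.
        if (leftEyeGaze != null)
        {
            Vector3 gazeDirection = leftEyeGaze.transform.forward;
            Debug.Log($"Left eye gaze direction: {gazeDirection}");
        }
    }
}
```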

Thanks for the response @andyb-unity, I really appreciate it! I submitted my vote and details on the roadmap.

I’m curious, does Unity own development of the “Unity OpenXR: Meta” package, or does Meta?

I’m still unclear on whether it’s possible to mix usage of AR Foundation and the Meta XR SDKs. Do you know if that’s possible? The most critical piece for me is the XR Interaction Toolkit. I’ll just try it and see, but if you or anyone else has guidance or tips, that would be really appreciated. Building an abstraction on top of both XRIT and the Meta XR Interaction SDK is a pretty tall order; I’d much prefer to stick with XRIT and use the Meta XR SDKs only for the additional data and functionality that AR Foundation doesn’t have yet (something like the seam sketched below).
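
One hypothetical way to keep that separation clean is a small interface seam, so that only a single adapter class references the Meta SDK and everything else stays on XRIT. The names IEyeGazeSource and MetaEyeGazeSource below are illustrative, not part of either SDK:

```csharp
using UnityEngine;

// Hypothetical seam: the rest of the app (XRIT interactors, gameplay code)
// depends on this interface, and only the adapter below touches the Meta SDK.
public interface IEyeGazeSource
{
    bool TryGetGazeRay(out Ray gazeRay);
}

public class MetaEyeGazeSource : MonoBehaviour, IEyeGazeSource
{
    [SerializeField] OVREyeGaze eyeGaze; // Meta XR Core SDK component

    public bool TryGetGazeRay(out Ray gazeRay)
    {
        // Confidence/ConfidenceThreshold are members of Meta's OVREyeGaze
        // component in recent SDK versions; verify against your installed version.
        if (eyeGaze != null && eyeGaze.Confidence >= eyeGaze.ConfidenceThreshold)
        {
            gazeRay = new Ray(eyeGaze.transform.position, eyeGaze.transform.forward);
            return true;
        }

        gazeRay = default;
        return false;
    }
}
```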

It’s our package at the end of the day, but we work closely with Meta on the product roadmap and engineering.

Unfortunately, you’re in fairly uncharted waters. I believe it’s possible, but we don’t have a sample app or anything of that magnitude to offer as a proof of concept. If you get a prototype running with AR Foundation and/or the XR Interaction Toolkit, we’d love to hear your feedback.

One thing we’re doing to make this easier is rolling out support for the nativePtr APIs on various trackables. For example, if you wanted to take an anchor you persisted with AR Foundation and share it through the Meta SDK’s implementation of anchor sharing, you could pass the native XrSpace from AR Foundation back down into your own plugin code (or similar code from Meta; I’m not sure what all is in their Unity SDK). These updates will be rolling out over the summer.
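
To make that hand-off concrete, here is a rough, hypothetical sketch. ARAnchor.nativePtr is a real AR Foundation property, but what it points to is provider-specific, and NativeShareAnchor is an invented entry point standing in for your own plugin code:

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class AnchorBridge : MonoBehaviour
{
    // Hypothetical entry point in your own native plugin; it would unwrap the
    // XrSpace handle from the provider-defined struct behind nativePtr and
    // call Meta's sharing extensions from there.
    [DllImport("MyNativePlugin")]
    static extern void NativeShareAnchor(IntPtr nativeAnchorPtr);

    public void Share(ARAnchor anchor)
    {
        // ARAnchor.nativePtr is a real AR Foundation property; the struct it
        // points to is defined by the active provider plugin, so check its docs.
        NativeShareAnchor(anchor.nativePtr);
    }
}
```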

Yeah, agreed. Again, I think this is possible, but I haven’t actually seen an app do it in practice.
