Introducing Meta Quest Support in AR Foundation


Introduction
AR Foundation enables you to create multi-platform augmented reality apps with Unity. In an AR Foundation project, you choose which features to enable by adding the corresponding components to your scene. AR Foundation enables these features using the platform’s native SDK, so you can create once and deploy to multiple platforms (mobile and XR headsets).
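
For example, a plane detection scene contains an AR Session and an XR Origin, and you opt into plane detection by adding an ARPlaneManager component. Here is a minimal sketch of reacting to detected planes against the AR Foundation 5.x API (the PlaneLogger class is illustrative, not part of the package):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Attach to any GameObject in a scene that already contains an
    // AR Session and an XR Origin with an ARPlaneManager component.
    public class PlaneLogger : MonoBehaviour
    {
        [SerializeField] ARPlaneManager m_PlaneManager;

        void OnEnable() => m_PlaneManager.planesChanged += OnPlanesChanged;
        void OnDisable() => m_PlaneManager.planesChanged -= OnPlanesChanged;

        void OnPlanesChanged(ARPlanesChangedEventArgs args)
        {
            // Each added plane is a trackable reported by the platform's native SDK.
            foreach (var plane in args.added)
                Debug.Log($"Plane added: {plane.trackableId}");
        }
    }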

We are introducing support for Meta Quest 3, Meta Quest 2, and Meta Quest Pro in AR Foundation through a preview of a new Meta OpenXR Feature package. This package is currently experimental and depends on the Unity AR Foundation and Unity OpenXR Plugin packages.

Feedback
We want to hear from you. We’re especially interested in:

  • Is the documentation helpful?
  • Are any workflows unclear?
  • Which features do you want to see supported next?

Please feel free to post your feedback in this thread or in this sub-forum.

Installation
The experimental Meta OpenXR package is currently available in the Unity Package Manager (UPM). Because it is experimental, it will not appear in UPM search results; you need to add it by typing its name directly into the Package Manager.

To download the Meta OpenXR package, open the Unity Package Manager from inside the Unity Editor, click the plus (+) symbol in the top left, select “Add package by name”, and enter com.unity.xr.meta-openxr. Installing it will automatically pull in the other required packages, such as the OpenXR Plugin and AR Foundation packages. For sample content, check out the Simple AR and Anchors samples on GitHub.
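
If you prefer editing files directly, you can instead declare the dependency in your project’s Packages/manifest.json (the version number below is only an example; your other dependencies stay alongside it):

    {
      "dependencies": {
        "com.unity.xr.meta-openxr": "0.1.2"
      }
    }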

Note: AR Foundation on Quest relies on Meta’s Scene feature for plane data. That means you must perform a Scene Capture via Room Setup on your Quest to see planes. See Room Setup instructions for details.
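
A practical consequence: without a Scene Capture, an ARPlaneManager never reports planes. Below is a minimal sketch of a runtime reminder for testers (the component name, five-second timeout, and message are illustrative):

    using System.Collections;
    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Illustrative sketch: warn testers when no Scene data arrives,
    // which usually means Room Setup has not been run on the headset.
    public class RoomSetupReminder : MonoBehaviour
    {
        [SerializeField] ARPlaneManager m_PlaneManager;

        IEnumerator Start()
        {
            // Give the session a few seconds to deliver pre-stored planes.
            yield return new WaitForSeconds(5f);

            if (m_PlaneManager.trackables.count == 0)
                Debug.LogWarning("No planes reported. Did you run Room Setup (Scene Capture) on this headset?");
        }
    }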

How to report bugs
Ideally, we’d like bugs reported through the built-in bug reporter tool, as it automatically provides us with relevant context.


Will Apple Vision Pro also be supported through AR Foundation?

Yes, you can expect features like AR Foundation and the XR Interaction Toolkit to be well integrated in Unity’s support for visionOS.


Does this mean that an AR Foundation (ARKit) app will work on visionOS without any changes?

Plane Detection, Face Tracking, Image Tracking

Bringing an app that uses AR Foundation / ARKit to visionOS will certainly require additional work – input and interaction is just one example area. But let’s keep this thread focused on the topic at hand. We will keep the community updated on visionOS support when we have more to share.


Wait a sec, I was under the impression that the Quest 2 and Quest Pro have no depth sensors and don’t expose the raw camera feed, so you could only use environment data that the user defines by moving the motion controller and manually tracing objects like desks and chairs in their room. Is the version of AR Foundation that runs on Quest 2/Pro somehow capable of automatic, algorithmic plane detection, without requiring the user to manually map planes out?

So I got everything set up using a Quest Pro and 2022.3.3f1 (also tested 2023.1.0b20) with Meta OpenXR 0.1.1 and AR Foundation 5.1.0-pre.6, and I also added hand tracking using XR Hands 1.2.1. But for the life of me I can’t get AR Plane Manager to visualize planes on the Quest Pro. Am I missing something?

Or does it not support AR plane visualization and detection like ARCore does?

This is as vanilla as it gets:


You will still need to manually map out planes.

for the life of me I can’t get AR Plane Manager to visualize planes on the Quest Pro. Am I missing something?

@WayneVenter great question. AR Foundation plane detection on Quest is powered by Meta’s new Scene feature: https://developer.oculus.com/documentation/unity/unity-scene-overview/

We are working on improving our documentation for this. On the most recent versions of the Meta Quest software, go to Settings > Experimental > Room Setup and select the Set up button. This takes you into Meta’s Scene Capture flow. If you add features to your room that include plane components (at the time of writing, Desk and Couch reliably produce planes, but Meta is iterating on their backend and this could change), AR Foundation will be able to access the pre-stored plane information from your Scene and use it as the backend of plane detection.
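
Once captured, those planes arrive through the normal AR Foundation plane APIs, so existing plane code keeps working. As a sketch against AR Foundation 5.x (the label-to-classification mapping is Meta’s and may change), you can inspect what Room Setup produced like this:

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Illustrative sketch: inspect the planes Room Setup produced.
    public class ScenePlaneInspector : MonoBehaviour
    {
        [SerializeField] ARPlaneManager m_PlaneManager;

        public void LogScenePlanes()
        {
            foreach (ARPlane plane in m_PlaneManager.trackables)
            {
                // The classification reflects the label chosen during Room Setup,
                // e.g. a Desk typically surfaces as Table and a Couch as Seat.
                Debug.Log($"{plane.trackableId}: {plane.classification}, " +
                          $"size {plane.size.x:F2} x {plane.size.y:F2} m");
            }
        }
    }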

To be clear, AR Foundation does not perform real-time plane detection on Quest the way you might expect from ARCore or ARKit; it only surfaces the planes captured during Room Setup.
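
You can still hit-test against those pre-stored planes once they are in your scene. Assuming your plane prefab includes a MeshCollider (the built-in AR Default Plane prefab does), an ordinary physics raycast works. A minimal sketch:

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Illustrative sketch: hit-test the pre-stored planes with a physics raycast.
    // Assumes the ARPlaneManager's plane prefab has a MeshCollider, as the
    // built-in "AR Default Plane" prefab does.
    public static class PlanePhysicsRaycaster
    {
        public static bool TryHitPlane(Ray ray, out Pose pose)
        {
            if (Physics.Raycast(ray, out RaycastHit hit) &&
                hit.collider.GetComponentInParent<ARPlane>() != null)
            {
                pose = new Pose(hit.point, Quaternion.LookRotation(hit.normal));
                return true;
            }

            pose = default;
            return false;
        }
    }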

For other project setup troubleshooting, I recommend taking a look at our Project Setup docs page: Project setup | Meta OpenXR Feature | 0.1.2. This page currently does not mention Room Setup, but we’ll add that in a future release.


Awesome, I thought it was something like that, i.e. from the Room Setup, but I thought you could do the setup from inside the Unity app. I was hoping for the ability to raycast to the floor, draw my own “room setup”, and store those planes/meshes for future persistent use. I guess the feature flip-flop between the Meta backend and AR Foundation is going to take some time to stabilize.

Now my next big ask: will I eventually be able to remote-play from the Editor to the device like with the Oculus SDK? This would speed up development time.

And lastly, assuming the Quest 3 has a depth sensor, are there plans for an ARCore-like SLAM/raycast hit implementation? Or, maybe answering my own question, will it still rely on Meta’s Scene feature, with AR Foundation in effect acting as a wrapper?

Thanks @KevinXR @mfuad @andyb-unity and team! I’m excited to try this out and it’s perfectly timed for a VR Jam that just started today too. I’ll be giving this a shot as part of that jam. Thank you again and cheers!

Hi, I’m very interested in the Meta OpenXR Plugin, but I have two questions:

  • Is there a size limit for the Scene Model feature? How can I create experiences in huge indoor environments (e.g., museums or palaces)?
  • When do you plan to make image tracking (or a QR code scanner) available?

Thank you!

The blog post states this:

But it seems to be false about Anchors when I look at the documentation:

So: do Anchors work on Quest?

What is the Scene Model feature? I don’t see anything on the documentation page about this feature.

So you are saying that Plane Detection on Quest does not actually detect planes?!
It only takes the planes from the Room Setup done by the user beforehand and makes them accessible to developers?
What a shame! (Not your fault – it’s on Meta for not being able to detect meshes.)

We’re looking forward to hearing about your experience!

We are actively working on Quest Link support to speed up development iteration time. Stay tuned, it’s coming soon!

We’re going to hold off on discussing what Quest 3 may or may not have and wait for Meta to officially announce more information.


Hey, I looked through Meta’s Scene documentation and I don’t see mention of a Scene Model size limit. To create an experience for a large indoor environment, you’d have to run Room Setup on your Quest in that environment to identify the walls, surfaces, and other planes.

We’re using this preview to help us understand which features people would like to see next. How would you like to use image tracking/QR scanning on Quest? What other features would you like to see next?


Anchors are supported! Thanks for catching that. We’ll update the documentation.
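
For anyone looking for a starting point, anchors follow the standard AR Foundation pattern: with an ARAnchorManager in the scene, adding an ARAnchor component to a GameObject at the desired pose creates the anchor. A minimal sketch (the AnchorPlacer helper is illustrative):

    using UnityEngine;
    using UnityEngine.XR.ARFoundation;

    // Illustrative sketch: create an anchor at a world-space pose. Requires
    // an ARAnchorManager in the scene (on the XR Origin).
    public static class AnchorPlacer
    {
        public static ARAnchor PlaceAnchor(Pose pose)
        {
            var go = new GameObject("Anchor");
            go.transform.SetPositionAndRotation(pose.position, pose.rotation);

            // Adding the component registers the anchor with the anchor manager.
            return go.AddComponent<ARAnchor>();
        }
    }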

Yes, AR Foundation on Quest takes the planes from the Room Setup done beforehand.

Hi @KevinXR Here’s a video capture of the Anchors scene. My planes successfully appear as yellow planes, but I’m unable to create any anchors no matter which buttons I press on my controllers. I did move the UI from screen space to world space so that I could see the logs. I was planning on further tweaking the anchor scene, but wanted to check here first to make sure I wasn’t doing something incorrectly.

[GIF: video capture of the Anchors sample scene]

@KevinXR I’m poking around and I can’t seem to find any documentation on how to save anchors using this package. Saving spatial anchors is fairly clear in the Oculus documentation (https://developer.oculus.com/documentation/unity/unity-spatial-anchors-overview/#save-anchors), but I can’t find anything for the Meta OpenXR Feature. Thanks in advance!