Unity OpenXR: Meta 2.0

What’s new in Unity OpenXR: Meta 2.0

Great news! The OpenXR: Meta provider for AR Foundation has been updated to Version 2.0, bringing exciting new features to your Meta Quest devices!

Feature Highlights

  • Persistent Anchors are now supported on Meta Quest. You can save anchors during a session and reload them in a subsequent session
  • The Bounding Box provider on Meta Quest allows you to detect and track 3D bounding boxes around real world objects in a physical environment
  • The Meshing provider for Meta Quest allows you to create a precise geometrical representation of the real world for your app

Change Highlights

  • The plane and bounding box subsystems no longer attempt to request the Android system permission com.oculus.permission.USE_SCENE on your app’s behalf. To use these features with OpenXR Plug-in version 1.11.0 or newer, your app must request this permission (see the sketch after this list). Refer to Spatial Data Permission (Meta Quest Developer Center) to learn how to request this permission
  • Changed the names of the Meta Quest features in the Editor to remove the “AR” prefix
  • Added support for the following plane classifications: Wall Art and Invisible Wall
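
For reference, here’s a minimal sketch of what that permission request might look like using Unity’s standard Android Permission API. The permission string comes from the change note above; requesting it in Start is an assumption you’d adapt to your app’s flow.

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class ScenePermissionRequester : MonoBehaviour
{
    // Android permission named in the change note above.
    const string k_UseScene = "com.oculus.permission.USE_SCENE";

    void Start()
    {
#if UNITY_ANDROID
        // Only prompt if the user hasn't already granted the permission.
        if (!Permission.HasUserAuthorizedPermission(k_UseScene))
            Permission.RequestUserPermission(k_UseScene);
#endif
    }
}
```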

For a full list of features, changes and fixes, see the OpenXR: Meta 2.0 What’s new page

Requirements

  • Unity 6 Preview or newer
  • AR Foundation 6.0.1 or newer
  • OpenXR Plug-in 1.10.0 or newer

Persistent Anchors

You can now save, load, and erase anchors on Meta Quest with the OpenXR: Meta provider and AR Foundation’s XRAnchorSubsystem.
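
As a rough illustration, here’s a sketch of saving and reloading an anchor through ARAnchorManager’s async persistence API (TrySaveAnchorAsync / TryLoadAnchorAsync). Exact signatures and return types may differ slightly between AR Foundation 6.x versions, so treat this as a starting point rather than a drop-in implementation.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class AnchorPersistenceExample : MonoBehaviour
{
    [SerializeField] ARAnchorManager m_AnchorManager;

    // Save an existing anchor so it can be reloaded in a later session.
    public async void SaveAnchor(ARAnchor anchor)
    {
        var result = await m_AnchorManager.TrySaveAnchorAsync(anchor);
        if (result.status.IsSuccess())
        {
            // Persist this GUID yourself (file, PlayerPrefs, etc.) to reload the anchor later.
            SerializableGuid savedAnchorGuid = result.value;
            Debug.Log($"Saved anchor as {savedAnchorGuid}");
        }
    }

    // Reload a previously saved anchor in a subsequent session.
    public async void LoadAnchor(SerializableGuid savedAnchorGuid)
    {
        var result = await m_AnchorManager.TryLoadAnchorAsync(savedAnchorGuid);
        if (result.status.IsSuccess())
            Debug.Log($"Reloaded anchor at {result.value.transform.position}");
    }
}
```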

Bounding Boxes

Bounding boxes on Meta Quest require you to first complete Space Setup before any 3D bounding box data can be used. OpenXR: Meta does not dynamically discover bounding boxes at runtime. Instead, this provider queries the device’s Space Setup data and returns all bounding box components that are stored in its Scene Model. Some entities in the Scene Model, such as Tables or Lamps, include bounding boxes, while others do not.
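
To give a sense of what consuming that Scene Model data looks like, here’s a sketch that listens to ARBoundingBoxManager’s trackablesChanged event (assuming AR Foundation 6’s unified trackables event); the logging is just placeholder behavior.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class BoundingBoxLogger : MonoBehaviour
{
    [SerializeField] ARBoundingBoxManager m_BoundingBoxManager;

    void OnEnable() => m_BoundingBoxManager.trackablesChanged.AddListener(OnTrackablesChanged);
    void OnDisable() => m_BoundingBoxManager.trackablesChanged.RemoveListener(OnTrackablesChanged);

    void OnTrackablesChanged(ARTrackablesChangedEventArgs<ARBoundingBox> changes)
    {
        // Bounding boxes come from the device's Space Setup data, so they are
        // typically all reported shortly after the subsystem starts.
        foreach (ARBoundingBox box in changes.added)
            Debug.Log($"Bounding box {box.trackableId} at {box.transform.position}");
    }
}
```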

Meshes

Meshing on Meta Quest devices requires that the user first completes Space Setup before any mesh data can be used.
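
As a sketch of how you might consume that mesh data, the example below listens to ARMeshManager’s meshesChanged event (the manager needs to be a child of your XR Origin and have a mesh prefab assigned); the vertex-count logging is only illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class MeshLogger : MonoBehaviour
{
    [SerializeField] ARMeshManager m_MeshManager;

    void OnEnable() => m_MeshManager.meshesChanged += OnMeshesChanged;
    void OnDisable() => m_MeshManager.meshesChanged -= OnMeshesChanged;

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        foreach (MeshFilter meshFilter in args.added)
            Debug.Log($"Mesh added with {meshFilter.sharedMesh.vertexCount} vertices");
    }
}
```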

While this is great, I’d really like it if these features could be exposed as general OpenXR features I could interface with, as I don’t particularly want to use AR Foundation for portability reasons.

Thanks for the feedback. Can you elaborate on those portability reasons? One of AR Foundation’s benefits is improving portability.

I would mainly like to do things at a lower level than AR Foundation’s abstractions, and my primary concern is an existing application that we don’t want to re-engineer to accommodate AR Foundation’s semantics. This app also launches into VR mode by default, so we don’t want to be running AR/MR-related modes all the time.

Is this VR app leveraging any world tracking features? Which features from ARF would you want exposed outside of ARF?

This isn’t about AR Foundation features specifically, just that Unity is wrapping perfectly good standalone OpenXR features so they can only be used through AR Foundation. I want API access to those features without having to deal with AR Foundation.

Couple notes on this.

OpenXR Features for Meta capabilities that don’t have an expression in AR Foundation do not depend on AR Foundation. So, for example, Display Utilities stands alone as its own C# API without any AR Foundation knowledge required. We have more features like this on the way.

(You still need the AR Foundation package in your project, but it doesn’t do anything and its assemblies will likely be stripped at build time.)

Note also: if your issue with AR Foundation is that you don’t want the MonoBehaviours in your scene (ARCameraManager, etc.), you don’t need to use them. Meta OpenXR’s integration is at the subsystem level, and you can manage the subsystems directly yourself if you don’t want to use AR Foundation GameObjects.

i.e., XRPlaneSubsystem
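
For anyone who wants to go that route, here’s a rough sketch of driving XRPlaneSubsystem directly through XR Management, with no AR Foundation components in the scene. The lifecycle handling is deliberately minimal and is an assumption about how you’d wire it into your own app.

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR.ARSubsystems;
using UnityEngine.XR.Management;

public class DirectPlaneAccess : MonoBehaviour
{
    XRPlaneSubsystem m_PlaneSubsystem;

    void OnEnable()
    {
        // Fetch the plane subsystem from the active loader instead of using ARPlaneManager.
        var loader = XRGeneralSettings.Instance?.Manager?.activeLoader;
        m_PlaneSubsystem = loader?.GetLoadedSubsystem<XRPlaneSubsystem>();
        m_PlaneSubsystem?.Start();
    }

    void Update()
    {
        if (m_PlaneSubsystem == null || !m_PlaneSubsystem.running)
            return;

        // Poll for plane changes since the last call.
        using var changes = m_PlaneSubsystem.GetChanges(Allocator.Temp);
        foreach (BoundedPlane plane in changes.added)
            Debug.Log($"Plane {plane.trackableId} added at {plane.pose.position}");
    }

    void OnDisable() => m_PlaneSubsystem?.Stop();
}
```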

At some point, though, you do hit the real constraint that our priority at Unity is to enable developers to build against cross-platform XR APIs with consistent behavior across a wide range of both OpenXR and non-OpenXR platforms. AR Foundation is our chosen vehicle for this work.

If your feedback is that the XRSubsystem APIs make too many assumptions and aren’t transparent enough to the underlying platform logic, I agree with you on this, and we are taking this into consideration in future planning. In Unity 6 we are quite limited by the restriction that we can’t make breaking changes.

Anyway, if you have more specific feedback, I’m happy to continue this conversation.