Implementing XR Multiplayer Co-location on Quest 3 using NGO, AR Anchors and Meta OpenXR

Hi everyone!

I’d like to discuss the possibility of implementing a co-location system for multiplayer mixed reality apps running on multiple Meta Quest 3 devices that share the same room.
I know Meta’s own SDK has proprietary features for this purpose, like Spatial Anchors, but since there is a lot of cloud infrastructure to set up, I am trying to move forward with a “custom” implementation that sticks to the XR Interaction Toolkit, AR Foundation, and Meta OpenXR, while managing multiplayer with Unity NGO.

I started from the VR Multiplayer Template to have a solid networking base for my app, and now I’m experimenting with strategies to synchronize players’ virtual positions with their physical positions. To do so, I need all users, host and clients alike, to share a point that is placed identically in both physical and virtual space (to be used as a sort of frame-of-reference center).

The only approach that has come to mind so far is to access the Meta Quest’s spatial information, retrieve ARPlanes, and make the host spawn a networked ARAnchor at the center of a reference plane (for example the floor or the ceiling).
A client that joins the host’s lobby would then compute the offset between its local reference plane center (e.g. its local floor center) and the ARAnchor position shared by the host. That offset is used to relocate the client’s XROrigin.

// Example snippet using the floor as a reference:
// offset between the local floor center and the shared anchor position
var localToSharedOffset = arFloorPlaneLocal.center - sharedAnchor.transform.position;
xrOrigin.transform.position += localToSharedOffset;

A more complex version could use more than one plane as a reference. For example, all walls and planes recognized by the ARPlaneManager that are classified as Floor, WallFace, or Ceiling could be used to compute a sort of room centroid, which the host would use to place the shared ARAnchor. The client would compute the same centroid from its local ARPlanes, evaluate the offset between the local centroid and the shared anchor, and then apply that offset to its XROrigin.
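A minimal sketch of that centroid idea, assuming AR Foundation 5’s `ARPlaneManager` and `PlaneClassification` enum (the class name `RoomCentroid` is hypothetical, and note that AR Foundation uses `Wall` rather than Meta’s `WallFace` label; names differ between package versions):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical helper: averages the centers of classified planes
// (floor, walls, ceiling) into a rough "room centroid".
public static class RoomCentroid
{
    public static Vector3? Compute(ARPlaneManager planeManager)
    {
        var points = new List<Vector3>();
        foreach (ARPlane plane in planeManager.trackables)
        {
            if (plane.classification == PlaneClassification.Floor ||
                plane.classification == PlaneClassification.Wall ||
                plane.classification == PlaneClassification.Ceiling)
            {
                points.Add(plane.center); // plane center in world space
            }
        }

        if (points.Count == 0)
            return null; // no classified planes tracked yet

        Vector3 sum = Vector3.zero;
        foreach (Vector3 p in points)
            sum += p;
        return sum / points.Count;
    }
}
```

One caveat worth noting: a single centroid point only pins down translation, not rotation. Both devices would still need to agree on a shared forward direction (for example, the normal of a particular wall) before their coordinate systems fully line up.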

Now I’m asking for some feedback, since I’m still trying to make this approach work. Do you have any advice? Is this logic sound, or am I missing something?
I figure one problem could be that this method assumes all headsets have registered very similar room data (wall, floor, and ceiling dimensions). Can you suggest other ways to get a shared frame of reference?
And finally, could this feature actually be implemented with this combination of tools, without Meta Spatial Anchors? Or are there limitations I’ve missed that prevent it?

Thanks for reading! I hope to open a nice discussion.

Support for shared persistent anchors is on our roadmap, so look out for it in the future.


Thanks for pointing this out! Can we have even an approximate ETA?
While work on the official release proceeds, do you have any suggestions for working around the problem and implementing this by hand?

We don’t have an ETA for shared anchors yet. Right now the focus is on launching Unity 6 on October 17 with all its XR packages, as announced at Unite last week. Shared anchors will not be part of this launch.

So what can you do right now? If you are a C++ developer, you can take the native pointer of the ARAnchor and write your own OpenXR code to implement anchor sharing on Meta Quest.
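For reference, the handle in question is exposed on the managed side via the anchor’s `nativePtr` property. What the pointer actually references is platform-defined, so treating it as a route to the underlying `XrSpace` is an assumption you would need to verify against the Meta OpenXR provider’s documentation; this is just a sketch of where the handoff to native code would happen:

```csharp
using System;
using UnityEngine.XR.ARFoundation;

// Sketch: expose an ARAnchor's native handle so a native plug-in can
// pass it to OpenXR extension calls (e.g. anchor-sharing extensions).
public static class AnchorInterop
{
    public static IntPtr GetNativeAnchorHandle(ARAnchor anchor)
    {
        // The pointed-to data is provider-specific; on the Meta OpenXR
        // plug-in it is expected to lead to the underlying XrSpace,
        // but check the provider docs before relying on the layout.
        return anchor.nativePtr;
    }
}
```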

Otherwise, if your app only plans to target Meta Quest, you will likely have a better experience using Meta’s SDK. The strength of Meta’s SDK is that it always implements the latest and greatest features for Meta Quest, like shared anchors. The strength of AR Foundation is that our APIs work across many XR platforms, but your app doesn’t benefit from this strength if you are only targeting one platform.


Hi, I have exactly the same issue and wanted to ask whether you made any progress on this, or what your final decision was. Did you switch everything over to the Meta SDK?

Hi! Unfortunately I don’t have any useful news on this topic. Our company decided to focus on other projects and left this one pending, waiting for future news. Until it becomes urgently needed, we prefer to avoid relying on the cloud-service-based architecture of the Meta SDK.

The update from us is that there is no update. We’re landing a lot of other work this quarter. We’re beginning some design work on shared anchors, but it’s too early to speculate about when this might ship.


Thanks to everyone for the replies 🙂 I will build a device-specific solution for now. However, I hope there will be a cross-compatible one at some point 🙂

Note that cross-platform shared anchors are currently impossible at the platform level. There is simply no way to share a Meta Quest anchor with an Apple Vision Pro device, for instance. What we are working on next year is integration with Meta’s newly released APIs that allow you to share anchors with other Meta Quest users.


If you are looking to implement cross-platform colocated multiplayer, this is typically homebrewed by using image tracking or some other known position to calibrate multiple devices to a shared coordinate system.
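As a rough illustration of the image-tracking variant, here is a sketch assuming AR Foundation 5’s `ARTrackedImageManager` (the event name changed in AR Foundation 6) and a printed marker that every device can see. The class name `MarkerAlignment` is hypothetical; the idea is to yaw and translate each device’s `XROrigin` so the marker lands at the world origin:

```csharp
using Unity.XR.CoreUtils;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: align each device's XROrigin so a shared printed marker
// sits at the world origin with a common heading.
public class MarkerAlignment : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager imageManager;
    [SerializeField] private XROrigin xrOrigin;

    private void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    private void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    private void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
            Align(image.transform);
    }

    private void Align(Transform marker)
    {
        // Rotate the rig so the marker's heading matches world +Z,
        // keeping only the yaw component to stay gravity-aligned...
        float yaw = marker.eulerAngles.y;
        xrOrigin.transform.rotation =
            Quaternion.Euler(0f, -yaw, 0f) * xrOrigin.transform.rotation;

        // ...then translate so the marker lands on the world origin.
        // (The marker is a child of the rig, so read its position
        // only after applying the rotation above.)
        xrOrigin.transform.position -= marker.position;
    }
}
```

Order matters here: the rotation has to be applied before the translation, because rotating the rig moves the tracked image’s world position along with it.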


If you don’t need precise accuracy, a simple solution is to have a spot on the floor (marked perhaps with a piece of tape or something). Place the headset there, and use a button press or other input to set that point as the origin. Repeat for each headset.

        private void ResetCoordinateSystem()
        {
            XROrigin origin = GetComponent<XROrigin>();
            origin.MoveCameraToWorldLocation(Vector3.zero); // put the camera at the world origin
            origin.transform.position = new Vector3(origin.transform.position.x, 0, origin.transform.position.z); // zero out the vertical position of the origin
            origin.MatchOriginUpCameraForward(Vector3.up, Vector3.forward); // align up with +Y and the camera's forward with +Z
        }

Note that the line setting origin.transform.position just zeroes out the vertical position of the origin.

Another approach would be to use a simple AR marker as a point of reference.

Hi,

Thanks so much for the solution! Do you have any experience with how accurate this is? Is the deviation around 1 cm, or more? Also, have you tried doing it with multiple points to increase the accuracy?

In case you didn’t notice, in Unity OpenXR: Meta 2.2 we now support both colocation discovery and shared anchors, so you can build colocated experiences on Meta Quest platforms.
