Unity MARS Companion app - Open Beta announcement

Unity MARS Companion app for mobile is now available in beta

The Unity MARS Companion app is available in beta for iOS and Android devices, and we’re looking for beta testers. Be one of the first to try out the new mobile app before it’s released.

What is the Unity MARS Companion app?

The Unity MARS Companion app is the newest component of Unity MARS and is now available on iOS and Android devices.

With the app, you can capture real-world data directly on your device and bring it into the Unity Editor, letting you quickly create and iterate on your AR experience, significantly decrease iteration time, and deliver an experience that runs accurately in its target environment.

The Unity MARS Companion gives users the power to perform two primary tasks: data capture and in-situ authoring.

  • Data capture: Using the companion app, you can capture room scans, take pictures, and record video with AR data. Once saved to the cloud, this data syncs directly to the Unity Editor, where you can open it in the Unity MARS authoring environment to create a simulated environment that mirrors where your AR experience is intended to run. This gives you a more true-to-reality development environment to author your AR content against.

  • In-situ authoring: The companion app also has authoring functionality that allows you to create content and lay out assets directly on your device. For example, if a bug occurs because of lighting conditions or a particular room setup, it's useful to have the actual device data on hand to reproduce and fix the issue.

Both of these tasks involve sending and receiving data to and from the Editor, which is done via cloud storage. This means that users all over the world can work on the same project together, and that the data persists between sessions and is shared between users.

The Unity MARS Companion app is available to Unity MARS users at no additional cost beyond their existing subscription, and includes 10GB of cloud storage per seat. Non-Unity MARS users can also take advantage of the app with limited functionality (see below).

Please note that the app is currently in beta: content captured or edited with the beta may not be compatible across beta versions or with the final release. We're very interested in hearing your feedback so we can refine and improve the product to fit your needs.

What can I do with the Unity MARS Companion app?
Create and Lay Out Objects in a Scene

Scan surfaces and author or preview proxies.

Create Environment

Map out an environment by placing corners.

Record Data

Record videos, surface data, and camera paths.

Create Markers

Capture a marker and add hotspots.

Unity MARS user vs. Non-Unity MARS user

Based on your subscription type, you might have different levels of functionality within the app. The table below explains these differences.

*All changes that have not synced to the cloud will be stored locally and can be re-synced once the user authenticates with a Unity MARS entitlement. If the user logs in with a different username/Unity ID, they will be able to make local changes, but will not be able to sync those changes with projects they do not have access to.

**Unity MARS projects can be synced from the cloud by entering a Project key or scanning a QR code.

Join the beta today and share your experience with us.

To try out the beta, open this link for iOS or Android* on your mobile device. The steps outlined in our documentation will help familiarize you with the workflows.
*By opting to download the Android or iOS version of the Unity MARS Companion app, you agree to our Privacy Statement.

To use the Unity MARS Companion app alongside the Unity MARS authoring environment, we recommend you use Unity MARS 1.2, as well as Unity version 2019.3.0f6 or newer.

As you try out the Unity MARS Companion app, we’d really like to hear about your experience. For product support, troubleshooting problems, sharing projects and feedback, and general discussion about the app, comment in this forum thread or share your feedback in this survey.



Installed it. Used it. Love it.
Needs a lot of work.
Right now the planes are waaaaaay too messy to be useful in anything but the most empty environments. But I can see that this tool will become an indispensable part of our tool kit for our MARS based work.




the planes are waaaaaay too messy to be useful

Is this on iOS or Android? We are (for now) just exposing all of the planes the platform is providing, but I agree that it can be difficult to pick between planes that are overlapping. We will be working on future improvements to filter out overlapping planes and/or help you pick between them when they overlap. If you have any ideas or suggestions, please let us know!


Sorry, should have said. This was on an Android, Galaxy Note 10+ with the depth camera.
I'm sure you've seen the 8-dimensional physics model that is created when you scan an area. Even a very plain one. I scanned an empty apartment room (5x5m, 3m ceiling) and even though it was an empty box (no furniture) the planes it detected were Lovecraftian in their non-Euclidean overlappyness.
I was musing on this at the time, and thought that what we had here was a really interesting use case for machine learning... to ID walls, corners, ceilings and so on. But I assume that's out of scope.

Probably a more helpful suggestion might be to allow the user to mark corners and then base / refine the detected planes from those?

I tried the 2nd function of defining the floorplan. It took a few tries to nurse a result without it freaking out and forgetting the orientation of the entire plan.
One method that worked well was to step close to any corners (convex or concave) and mark the corner from about a meter away. Then to step right back maybe 3 meters away to drag the blue pole along the wall, letting the camera see as much of the wall, floor, ceiling and any other features as possible. Swooping in and then out and back in again seemed to stop the app from losing tracking halfway through. Felt like dancing too. So that's a bonus.

I've not tried to import the result into Unity yet, but I will. I'm assuming I can place proxies in certain places in Editor, and then if I built a scene from this data it would place the content on the proxies in the right places... at least I hope that's the goal.

Bottom line, this app feels like it's a beta for an app my company will NEED to use daily in our work, and I really hope it evolves. Please keep up the good work.

Probably a more helpful suggestion might be to allow the user to mark corners and then base / refine the detected planes from those?

Interesting suggestion. For the Environment and Data Recording flows, the intent is to let you capture exactly what the device is going to give you, so in a sense the "Lovecraftian overlappyness" is something that we actually want to capture in order to replicate in the Editor. That way, you can make sure your proxies and C# code can handle bad tracking gracefully. One way or another, if the platform is giving you bad results in the companion app, you can expect your users to encounter the same bad results "in the wild" when they run your app.

To be honest, it sounds like your empty apartment is a difficult environment for ARCore to give you consistent camera tracking and high-quality surface extraction. Furniture and clutter actually help with camera tracking, because the SLAM algorithm needs recognizable visual details (feature points) to work properly. Do you get the same results in that space from a basic MARS or AR Foundation app? There could be a bug or performance issue in the MARS Companion app that we can fix to improve the results you are seeing, so it's good to know how other AR apps behave as a baseline. You may get better results on ARKit, if that's available to you. We also plan to improve the floor plan feature by letting you control where the base of the corner post sits on the screen. That should give you a bit more control over where it lands while keeping more of the room in view.

For the Proxy Scan Flow, we do actually want to sanitize things a bit more so that you can do your work, even if you're not getting ideal results from the platform. However, we don't want to lead you into a situation where you are authoring against data that won't exist later on when your users run the experience. If we let you tag a surface in the Proxy Scan Flow, but there's no way to do that in a build of your app, you'll end up creating experiences that only work in the companion app, which isn't what we want.

With all of that said, you may be working on a location-based experience, or we may one day get richer semantic information from the platforms (this already exists to some extent today), which would let you make assumptions about the space beyond the raw data you get from plane scanning. We will be working on ways to leverage this type of information in future versions of this feature. Currently, you can edit the Trait property of a SemanticTagCondition, which may match up to a semantic tag for data provided by Synthetic Objects or some future data provider that supports semantic tags.
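For anyone unfamiliar with how that looks in a project, here's a minimal sketch. It assumes Unity MARS 1.x, where Proxy and SemanticTagCondition are components in the Unity.MARS namespace that you attach to the same GameObject; the specific trait name ("floor") is just an illustration, and the trait itself is normally set via the Trait property in the Inspector rather than from code.

```csharp
// Sketch only: a proxy that matches surfaces carrying a semantic tag.
// Assumes Unity MARS 1.x (Proxy and SemanticTagCondition live in the
// Unity.MARS namespace and are added as components on one GameObject).
using UnityEngine;
using Unity.MARS;

public class FloorProxyExample : MonoBehaviour
{
    void Awake()
    {
        var proxyObject = new GameObject("Floor Proxy");
        proxyObject.AddComponent<Proxy>();

        // Matches a semantic trait (e.g. "floor") supplied by Synthetic
        // Objects or a future data provider that supports semantic tags.
        // The trait name is configured on this component in the Inspector.
        proxyObject.AddComponent<SemanticTagCondition>();
    }
}
```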

A simple step toward what you are describing would be to just let you hide planes you don't want to interact with. But even that gets tricky, because we still need to allow your content to match to those planes, otherwise we're letting you "cheat" during authoring in a way that will still be an issue when users run your app. Then it might be confusing if you start editing the Plane Size condition on your proxy and it jumps over to a plane that doesn't exist.

We're actively exploring these kinds of improvements, and the platforms are always improving their algorithms. Stay tuned, and thanks again for the feedback!

Do you get the same results in that space from a basic MARS or AR Foundation app?
We've built a MARS app we're about ready to launch, and yeah... it's got planes all over the place. So the companion app is certainly reflecting the real experience of the customer.
We have seen a lot of items in our app floating on planes that should just not exist. Can I assume that this is something undesirable that you folks are working to eliminate?

Can I assume that this is something undesirable that you folks are working to eliminate?

Yup! PlaneSizeCondition and ElevationCondition are a good start at coming up with "a good plane" but we want to go deeper. For the situation you're describing, we really need a "non-overlapping condition" but that's a trickier problem to solve efficiently on a mobile device. Putting content which you want to prevent from overlapping in a DistanceRelation (part of a Proxy Group) is a good way to prevent overlapping planes from turning into overlapping content. You might also see if Proxy Forces are a good solution to dealing with planes that don't behave as well as you'd hoped.
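As a rough sketch of how those pieces fit together on a single proxy (component names from Unity MARS 1.x; the actual thresholds, such as minimum plane size and elevation range, are normally configured on each condition component in the Inspector, so no property assignments are shown here):

```csharp
// Sketch only: a proxy that insists on "a good plane" by combining
// PlaneSizeCondition and ElevationCondition (Unity MARS 1.x).
// Size and elevation thresholds are configured in the Inspector.
using UnityEngine;
using Unity.MARS;

public class TableProxyExample : MonoBehaviour
{
    void Awake()
    {
        var proxyObject = new GameObject("Table Proxy");
        proxyObject.AddComponent<Proxy>();

        // Reject tiny spurious planes by requiring a minimum size.
        proxyObject.AddComponent<PlaneSizeCondition>();

        // Reject floor-level duplicates by requiring a table-like height.
        proxyObject.AddComponent<ElevationCondition>();
    }
}
```

To keep content from two overlapping planes from colliding, the proxies would then go inside a Proxy Group with a DistanceRelation between them, as described above.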

As always, we want to prioritize problems that our users are encountering in the real world, so thanks for the feedback!

Why is it not installing on Android?
I've tried many times, and it just shows a loading screen.

I'm sorry to hear that you're having trouble. What device and Android OS version are you using? Where do you see the loading screen? Is it in Google Play? Is it the "Made with Unity" splash screen?

I think more often the problem has to do with updating the planes. I have no idea if this is happening at the Unity or iOS level, but I've never seen a plane get deleted or updated in a subtractive way after it's been initially created. This is often an issue if it's detected wrong initially, or something in the scene has moved. Are Unity's planes getting subtracted or changed properly when they change on iOS? This is unrelated to this thread, I guess, but...

ooooh, really good point.
Once a plane exists, it stays there, even when the app should have detected that there's nothing there after all.

Is there any way to extend the companion app? Either practically or somehow 'faking' it -
I'd love to be able to combine this with ARCloud anchors for instance

@mtschoen I'm testing recording today for the first time and receive "Upload Failed" errors when uploading to the cloud. I've tried with and without video recordings, over both Wi-Fi and cellular.

Other details:
- Unity version 2020.2.2f1 Personal.
- Using the companion app on iOS 14.4, iPhone 11 Pro Max.
- Editor project is linked to Companion project.


Hi there @jasonwaltersxyz ! Sorry you're having trouble. Are you able to upload other types of resources? Is the project that you linked attached to an organization with at least one MARS license? You will not be able to upload to cloud storage if the project is not associated with a MARS entitlement.

In the short term, you may be able to work around this issue by unlinking the project, recording some videos, then re-linking it and trying to upload from the resource view. If you need to create a new project ID, you can always link the existing Companion App project to the new ID.

@mtschoen Thank you for the response! I have tried your suggestions:

  • Yes my project and MARS license is using the same org and user.
  • When my project is linked (initial link on creation or re-linked later), I'm unable to upload or save any of the AR Capture Modes.
  • When I create a new unlinked project in the Companion app, I can save stuff to local.
  • When I re-link my Editor project with the Companion app project and try to upload either a Data Recording (w/ no video) or a saved Marker, I see a new error: "Failed to upload Untitled Marker. Please check your network connection." I have tried Wi-Fi and cellular.

  • I'm using the 45-day trial of MARS. Would that cause this?

Thanks for the info. We are investigating the issue and will get back to you.


Hi @mtschoen ! I'm having the same issue that @jasonwaltersxyz is experiencing. It says "Upload failed". I have a similar setup, using Personal license, the free 45 days trial and iOS. Any progress on this?
We're doing a hackathon and creating an AR app made with MARS. Someone else on my team was able to upload a very short scan, but when they tried uploading a longer one, the app said the upload succeeded (even though the progress bar only reached 20%). Afterwards, the video is not there, while the point cloud and planes are. My guess is that the video upload fails, but the app doesn't detect that and doesn't try to upload it again.
Any help with this? :)


Hi there, @mariachiaramonti !

Sorry that you're having trouble with the app. How long are the videos that you are recording? Do you see a file size in the save dialog? If so, could you share the exact size of the video you are trying to upload? We have seen issues sometimes with extremely large video files. The issue that @jasonwaltersxyz was seeing had to do with our license check, which was resolved on the backend yesterday. If you are able to upload anything at all, then the license check is not the issue.

The app currently doesn't have the ability to resume or retry a failed upload, but one thing you could try is deleting the recording in the Companion Resource Manager in the Editor and re-uploading it from the device. If it fails every time you try to upload, this sounds like a bug that we need to fix, and I think the only workaround is to record shorter videos. Please let us know if this is the case, and we will try to replicate the issue on our end and come up with a solution.

Thanks for your interest in the MARS Companion Apps and for getting in touch about this. Hopefully we can work out a solution for you!

Are there any guides on how to capture video data like that shown in the YouTube clip embedded above? I have recorded some real-life environments with the companion app and imported them into the Editor, but it's unclear from the documentation how the result shown in the video is accomplished.

Hi there! I think you are talking about viewing a data recording? If you have made a data recording in the app, you can download it in the Editor and view it in the simulation view. You need to change the mode to Recorded and then select the name of the recording in the dropdown. You can read more about this in the documentation.