Using ARKit meshing with AR Foundation 4.0

The latest preview versions of the AR Foundation and ARKit XR Plugin 4.0 packages provide support for the scene reconstruction feature introduced in ARKit 3.5, which is enabled on the new iPad Pro with the LiDAR scanner. This functionality is available on Unity 2019.3 or later.

This new mesh functionality also requires Xcode 11.4 or later, and it only works on iOS devices with the LiDAR scanner, such as the new iPad Pro.

Using the LiDAR scanner, ARKit scene reconstruction scans the environment and creates mesh geometry representing the real world. Additionally, ARKit provides an optional classification of each triangle in the scanned mesh; this per-triangle classification identifies the type of surface at the triangle’s location in the real world.

AR Mesh Manager

To use ARKit meshing with AR Foundation, you need to add the [ARMeshManager](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html) component to your scene.
[Image: the ARMeshManager component in the Inspector]

Mesh Prefab

You need to set the [meshPrefab](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_meshPrefab) to a prefab that is instantiated for each scanned mesh. The [meshPrefab](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_meshPrefab) must contain at least a [MeshFilter](https://docs.unity3d.com/ScriptReference/MeshFilter.html) component.

If you want to render the scanned meshes, you will need to add a [MeshRenderer](https://docs.unity3d.com/ScriptReference/MeshRenderer.html) component with a [Material](https://docs.unity3d.com/ScriptReference/Material.html) assigned to the [meshPrefab](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_meshPrefab)'s GameObject.

If you want virtual content to interact physically with the scanned real-world meshes, you will need to add a [MeshCollider](https://docs.unity3d.com/ScriptReference/MeshCollider.html) component to the [meshPrefab](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_meshPrefab)'s GameObject.

A suitable mesh prefab is configured with the required [MeshFilter](https://docs.unity3d.com/ScriptReference/MeshFilter.html) component, an optional [MeshCollider](https://docs.unity3d.com/ScriptReference/MeshCollider.html) component to allow for physics interactions, and an optional [MeshRenderer](https://docs.unity3d.com/ScriptReference/MeshRenderer.html) component with a [Material](https://docs.unity3d.com/ScriptReference/Material.html) assigned to render the mesh.
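If you need to react to scanned meshes at runtime, for example to inspect them or adjust their renderers, you can subscribe to the ARMeshManager's meshesChanged event. Below is a minimal sketch; the ARMeshManager reference is assumed to be assigned in the Inspector, and the component name is just an example.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ScannedMeshWatcher : MonoBehaviour
{
    [SerializeField]
    ARMeshManager m_MeshManager;   // assign the ARMeshManager from the scene

    void OnEnable()  => m_MeshManager.meshesChanged += OnMeshesChanged;
    void OnDisable() => m_MeshManager.meshesChanged -= OnMeshesChanged;

    void OnMeshesChanged(ARMeshesChangedEventArgs args)
    {
        // Each entry is the MeshFilter of an instantiated meshPrefab.
        foreach (MeshFilter meshFilter in args.added)
            Debug.Log($"Mesh added: {meshFilter.name} ({meshFilter.sharedMesh.vertexCount} vertices)");

        Debug.Log($"Updated: {args.updated.Count}, removed: {args.removed.Count}");
    }
}
```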

Normals

As ARKit constructs the mesh geometry, it calculates vertex normals for the mesh. If you do not need the vertex normals, disable [normals](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_normals) to save memory and CPU time.

Concurrent Queue Size

To avoid blocking the main thread, the tasks of converting the ARKit mesh into a Unity mesh and creating the physics collision mesh (if the [meshPrefab](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_meshPrefab)'s game object contains a [MeshCollider](https://docs.unity3d.com/ScriptReference/MeshCollider.html) component) are moved into a job queue processed on a background thread. [concurrentQueueSize](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARMeshManager.html#UnityEngine_XR_ARFoundation_ARMeshManager_concurrentQueueSize) specifies the number of meshes to be processed concurrently.
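These settings can also be configured from script before meshing begins. A minimal sketch, assuming the references are assigned in the Inspector and using placeholder values:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class MeshManagerSetup : MonoBehaviour
{
    [SerializeField]
    ARMeshManager m_MeshManager;   // the ARMeshManager in the scene

    [SerializeField]
    MeshFilter m_MeshPrefab;       // prefab containing at least a MeshFilter

    void Awake()
    {
        m_MeshManager.meshPrefab = m_MeshPrefab;   // instantiated once per scanned mesh
        m_MeshManager.normals = false;             // skip vertex normals to save memory and CPU time
        m_MeshManager.concurrentQueueSize = 4;     // number of meshes converted on the background thread at once
    }
}
```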

Other ARMeshManager settings

For the ARKit implementation, only the three settings above (Mesh Prefab, Normals, and Concurrent Queue Size) affect the performance and output of ARKit meshing; the remaining ARMeshManager settings are ignored.

Meshing behaviors

Note that it typically takes about four seconds after the meshing subsystem starts for scanned meshes to begin to appear.

Additionally, the LiDAR scanner alone may produce a slightly uneven mesh on a real-world surface. If you add an [ARPlaneManager](https://docs.unity3d.com/Packages/com.unity.xr.arfoundation@4.0/api/UnityEngine.XR.ARFoundation.ARPlaneManager.html) to your scene and enable it, ARKit takes the detected plane information into account when constructing the mesh and smooths out the mesh where it detects a plane on that surface.

How do you get the different classification types to show up as different material colors?

EDIT: I found the answer in this sample scene from the arfoundation-samples project on GitHub:
https://github.com/Unity-Technologies/arfoundation-samples/blob/master/Assets/Scenes/Meshing/ClassificationMeshes.unity
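For anyone else looking: the sample enables ARKit's per-face classification via the ARKit-specific extension methods on the mesh subsystem (SetClassificationEnabled / GetFaceClassifications in UnityEngine.XR.ARKit) and then colors each face by its classification. Roughly, as an untested paraphrase of the sample with the TrackableId parsing stubbed out:

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARKit;        // ARKit-specific meshing extensions
using UnityEngine.XR.ARSubsystems;

public class MeshClassificationExample : MonoBehaviour
{
    [SerializeField]
    ARMeshManager m_MeshManager;

    void Start()
    {
        // Per-face classification is off by default and only exists on ARKit.
        XRMeshSubsystem meshSubsystem = m_MeshManager.subsystem;
        if (meshSubsystem != null)
            meshSubsystem.SetClassificationEnabled(true);
    }

    // Query the per-triangle classification for one scanned mesh.
    void LogClassifications(MeshFilter meshFilter)
    {
        // The sample recovers the mesh's TrackableId from the instantiated
        // GameObject's name; that parsing is stubbed out here.
        TrackableId meshId = ExtractTrackableId(meshFilter.name);

        using (NativeArray<ARMeshClassification> faces =
               m_MeshManager.subsystem.GetFaceClassifications(meshId, Allocator.Temp))
        {
            // One entry per triangle, e.g. Wall, Floor, Ceiling, Table...
            for (int i = 0; i < faces.Length; ++i)
                Debug.Log($"Triangle {i}: {faces[i]}");
        }
    }

    TrackableId ExtractTrackableId(string meshName)
    {
        // See MeshClassificationFracking.cs in arfoundation-samples for the real parsing.
        return TrackableId.invalidId;
    }
}
```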

It looks like Cloud Build fails because of missing Xcode 11.4 support, which is needed for ARKit 3.5?

My understanding is that the Cloud Build team is working on upgrading Cloud Build's Xcode to 11.4.

Hello, I wanted to try ARKit 3.5 support, so I tried building with AR Foundation 4.0 Preview 3 + AR Subsystems 4.0 Preview 3 + ARKit XR Plugin 4.0 Preview 3 on Unity 2019.3.14f1, compiled for iOS using Xcode 11.2.1.

The app builds fine, but AR did not work. The camera permission prompt did not pop up, and when I tried to set it manually, the application was not listed in Settings > Privacy > Camera. I tried forcing the camera permission with Application.RequestUserAuthorization(UserAuthorization.WebCam), but that did not work either.

Then I rolled back to 3.1.3 and it worked fine; however, I understand that would not be using ARKit 3.5 and would have no occlusion support.

Any recommendations?
Thanks!

Hey guys, great work on the version 4 preview.

I’m trying to use my own material to render the mesh. However, if I use any kind of texture with my material, the mesh renders as solid black.

This occurs with or without the Texture Coordinates checkbox set on the ARMeshManager.

Is there something specific I need to do within my shader to make this work?

I’m having the exact same issue; I made a post about it here: https://forum.unity.com/threads/transparent-shader-not-working-on-ar-mesh-ipadpro.899066/

I believe it’s due to the UVs on any AR mesh not being generated correctly, even when the UV option is enabled on the ARMeshManager. I’ve submitted a bug report, but the reply was unhelpful. I wish there were a better way to get things fixed quickly in Unity! Forums and bug reports are both too slow!

That’s what I figured. Have you tried saving some of the mesh data to a file, then debugging it in the Editor?
That was going to be my next step.

No, I haven’t; that’s a good idea though.
The next solution I was going to try was making a shader that uses world-space UVs instead of the real ones, but certain effects, such as faded edges, are still not possible until the UVs are generated properly.

What I (and probably 95% of AR devs) want from Unity by default is one shader for AR planes and meshes that can do all of the following at the same time, with each feature toggleable on and off at any time via shader parameters:

  1. render shadows without rendering the surface
  2. block shadows without casting shadows (stop shadows from passing through stacked surfaces, e.g. a table over the floor)
  3. occlude itself
  4. render a repeating texture with soft faded edges near the boundary (a previous shader in the ARCore package did this)

The issue is that a lot of AR devs out there don’t have the advanced shader-writing skills to make one shader that can do all of this performantly. Why make everyone repeat that work, or force them to hire a shader programmer, when they could just be scripting their app?

I’ve also run into this issue. Enabling Texture Coordinates (or the rest of the properties) in ARMeshManager doesn’t work for me either. Is there any news on this?

As I mentioned in the original post, only the three following ARMeshManager settings affect the ARKit implementation:

  • MeshPrefab
  • Normals
  • ConcurrentQueueSize

ARKit does not support any of the other possible features exposed by ARMeshManager. ARKit does not provide Texture Coordinates, for example.

Wow, OK, I missed that detail. In that case, it would be nice to have a little note in the Inspector next to each feature that says something like “currently Android/Magic Leap only”.

Does anyone know whether, if we want to generate polygon meshes like in Apple’s example (Visualizing and Interacting with a Reconstructed Scene | Apple Developer Documentation), we only need to change the MeshRenderer and Material?

Hello, can we save any scan data or mesh object via ARFoundation?
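Something like this is what I have in mind, as a rough and untested sketch; it assumes ARMeshManager.meshes gives the currently tracked MeshFilters and that a plain OBJ dump is enough:

```csharp
using System.IO;
using System.Text;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class MeshSaver : MonoBehaviour
{
    [SerializeField]
    ARMeshManager m_MeshManager;

    // Write every mesh the ARMeshManager currently tracks to a single OBJ file.
    public void SaveMeshes()
    {
        var sb = new StringBuilder();
        int vertexOffset = 0;

        foreach (MeshFilter meshFilter in m_MeshManager.meshes)
        {
            Mesh mesh = meshFilter.sharedMesh;
            Vector3[] vertices = mesh.vertices;
            int[] triangles = mesh.triangles;

            // Vertices are in the mesh's local space; convert to world space.
            foreach (Vector3 v in vertices)
            {
                Vector3 w = meshFilter.transform.TransformPoint(v);
                sb.AppendLine($"v {w.x} {w.y} {w.z}");
            }

            // OBJ indices are 1-based and global across the file.
            for (int i = 0; i < triangles.Length; i += 3)
                sb.AppendLine($"f {triangles[i] + 1 + vertexOffset} {triangles[i + 1] + 1 + vertexOffset} {triangles[i + 2] + 1 + vertexOffset}");

            vertexOffset += vertices.Length;
        }

        string path = Path.Combine(Application.persistentDataPath, "scan.obj");
        File.WriteAllText(path, sb.ToString());
        Debug.Log($"Saved {m_MeshManager.meshes.Count} meshes to {path}");
    }
}
```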

Thank you in advance!

@todds_unity – I’ve been testing ARKit / ARMeshManager uptime and found some issues worth noting:

  1. Meshing has a memory growth rate of at least 5 MB/minute (typically 10-20 MB/minute), even if the device is held in a fixed position and orientation viewing an unchanging physical location.
  2. In the same fixed position as (1), the generated meshes tend to grow incrementally larger over time in terms of vertex count, approximately 1-20 vertices per mesh per minute.
  3. ARMeshManager.DestroyAllMeshes does not reclaim any memory from (1) or reduce the vertex count seen in (2).
  4. Despite the memory growth and over 4.3 GB of RAM consumed, the process memory limit on an iPad Pro 4th Gen has never been hit in testing. After 15-30 minutes, the app will always hang in Unity’s main thread. The Xcode debugger call stack almost always points to a semaphore wait, often within retrieval of ARSession currentFrame (example call stack below); other times it is in a platform-agnostic Unity API wrapping a semaphore “acquire” implementation.
  5. Items (1), (2), and (4) do not occur if ARMeshManager is disabled (no memory growth/leak, no inevitable hang).

Setup:

  • Unity 2019.4.7f1
  • ARFoundation 4.1.0-preview.6
  • ARKit XR Plugin 4.1.0-preview.6
  • Xcode 12 beta 3
  • iPadOS 14 beta 3
  • iPad Pro 12.9" 4th Generation

I will test again with the latest iPadOS 14 and Xcode 12 beta release (beta 5 at time of post) but wanted to provide some comprehensive initial feedback.
Sample native Xcode callstack:

Thread 1 Queue : com.apple.main-thread (serial)
#0    0x000000019839eecc in semaphore_wait_trap ()
#1    0x0000000198267454 in _dispatch_sema4_wait ()
#2    0x0000000198267aec in _dispatch_semaphore_wait_slow ()
#3    0x00000001d8851cb0 in -[ARSession currentFrame] ()
#4    0x000000010945ba20 in -[SessionProvider afterUpdate] at /Users/todd.stinson/Work/arfoundation/com.unity.xr.arkit/Source~/UnityARKit/SessionProvider.m:291
#5    0x000000010a50dca0 in VirtActionInvoker2<XRSessionUpdateParams_tAA765EB179BD3BAB22FA143AF178D328B30EAD16, Configuration_t47C9C15657F4C18BE99ACC9F222F85EB9E72BF43>::Invoke(unsigned int, Il2CppObject*, XRSessionUpdateParams_tAA765EB179BD3BAB22FA143AF178D328B30EAD16, Configuration_t47C9C15657F4C18BE99ACC9F222F85EB9E72BF43) [inlined] at <redacted>/Classes/Native/Unity.XR.ARSubsystems1.cpp:116
#6    0x000000010a50dc80 in ::XRSessionSubsystem_Update_m40F8405ECB47FDC56B0B203F09655E3E5F637EFB(XRSessionSubsystem_t9B9C16B4BDB611559FB6FA728BE399001E47EFF0 *, XRSessionUpdateParams_tAA765EB179BD3BAB22FA143AF178D328B30EAD16, const RuntimeMethod *) at <redacted>/Classes/Native/Unity.XR.ARSubsystems1.cpp:13669
#7    0x000000010a4ec3e4 in ::ARSession_Update_m5F845E6E9DACEF91167155BA894CBA73AFB5BED6(ARSession_tFD6F1BD76D4C003B8141D9B6255B904D8C5036AB *, const RuntimeMethod *) at <redacted>/Classes/Native/Unity.XR.ARFoundation1.cpp:26165
#8    0x00000001080bfd60 in RuntimeInvoker_TrueVoid_t22962CB4C05B1D89B55A6E1139F0E87A90987017(void (*)(), MethodInfo const*, void*, void**) at <redacted>/Classes/Native/Il2CppInvokerTable.cpp:55531
#9    0x00000001094024e8 in il2cpp::vm::Runtime::Invoke(MethodInfo const*, void*, void**, Il2CppException**) at /Users/builduser/buildslave/unity/build/External/il2cpp/il2cpp/libil2cpp/vm/Runtime.cpp:545
#10    0x0000000108b51ba0 in ::scripting_method_invoke() at /Users/builduser/buildslave/unity/build/Runtime/ScriptingBackend/Il2Cpp/ScriptingApi_Il2Cpp.cpp:285
#11    0x0000000108b5fcf4 in ::Invoke() at /Users/builduser/buildslave/unity/build/Runtime/Scripting/ScriptingInvocation.cpp:273
#12    0x0000000108b6dba4 in Invoke [inlined] at /Users/builduser/buildslave/unity/build/Runtime/Scripting/ScriptingInvocation.h:68
#13    0x0000000108b6db90 in CallMethodIfAvailable [inlined] at /Users/builduser/buildslave/unity/build/Runtime/Mono/MonoBehaviour.cpp:424
#14    0x0000000108b6db60 in ::CallUpdateMethod() at /Users/builduser/buildslave/unity/build/Runtime/Mono/MonoBehaviour.cpp:528
#15    0x00000001086f6f18 in UpdateBehaviour [inlined] at /Users/builduser/buildslave/unity/build/Runtime/GameCode/Behaviour.cpp:178
#16    0x00000001086f6f0c in ::CommonUpdate<BehaviourManager>() at /Users/builduser/buildslave/unity/build/Runtime/GameCode/Behaviour.cpp:156
#17    0x00000001086f6de8 in ::Update() at /Users/builduser/buildslave/unity/build/Runtime/GameCode/Behaviour.cpp:173
#18    0x00000001088e34c8 in ::Forward() at /Users/builduser/buildslave/unity/build/Runtime/Misc/Player.cpp:1461
#19    0x00000001088da4e0 in ::ExecutePlayerLoop() at /Users/builduser/buildslave/unity/build/Runtime/Misc/PlayerLoop

^ This still occurs with iPadOS 14 beta 5 + Xcode 12 beta 5.

The 4.1.0-preview.7 AR Foundation packages are now available and fix the meshing memory leak issue.

Hello, I’m testing the AR mesh samples and want to change the material of the mesh. When I changed the Material of RenderedMeshPrefab to Unity’s built-in Default-Material, the mesh rendered black. Is there anything I’m missing?
Thanks in advance.

@VictorChow_K Hey, thanks a lot for such a detailed bug report. I’ve been having this issue for quite some time, and in my experience it still persists in the .7 preview. Have you tried testing with the latest AR Foundation version? If so, and it was successful, can you please share which version of Unity you used?

Hi - I still saw the hang in the .7 preview.
