Human occlusion has been added to ARFoundation / ARKit

The latest AR Foundation 3.1.0-preview.1 release introduces a new component, AROcclusionManager. Adding this component to the AR camera in your scene supplies depth data to the ARCameraBackground component, so that any available depth information is passed to the camera background shader.

In this initial implementation, the people occlusion functionality of ARKit 3 is used to generate depth information about people detected in the real world. The camera background rendering then uses this depth information, allowing people to occlude any virtual content that appears behind them in the scene.

At this time, only iOS devices that support the ARKit 3 people occlusion functionality will produce occlusion effects with the new AROcclusionManager. These are devices with an A12, A12X, or A13 chip running iOS 13 (or later).
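
If you want to verify support at runtime, something along these lines should work. This is a minimal sketch: it assumes the occlusion subsystem descriptor exposes supportsHumanSegmentationStencilImage and supportsHumanSegmentationDepthImage flags, so treat those exact property names as an assumption rather than the confirmed preview API.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: log whether the current device can produce human occlusion data.
// The supportsHumanSegmentation* property names are an assumption here.
public class HumanOcclusionSupportCheck : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        var descriptor = occlusionManager.descriptor;
        bool supported = descriptor != null
            && descriptor.supportsHumanSegmentationStencilImage
            && descriptor.supportsHumanSegmentationDepthImage;

        Debug.Log(supported
            ? "Human occlusion is supported on this device."
            : "Human occlusion is not supported on this device.");
    }
}
```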

Future devices will be supported when depth map functionality is added to the respective SDKs.

Simply adding the new AROcclusionManager component to the AR camera (alongside the ARCameraManager and ARCameraBackground components) enables automatic human occlusion on supported devices.
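
For reference, here is a minimal sketch of that setup done from script; in practice you would normally just add the components in the Inspector. The component names come from the text above, while the class and method names are illustrative.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: give the AR camera the three components described above.
// HumanOcclusionSetup and EnableHumanOcclusion are illustrative names.
public static class HumanOcclusionSetup
{
    public static void EnableHumanOcclusion(Camera arCamera)
    {
        var go = arCamera.gameObject;

        // ARCameraManager and ARCameraBackground are typically already present
        // on the AR camera in an AR Foundation scene.
        if (go.GetComponent<ARCameraManager>() == null)
            go.AddComponent<ARCameraManager>();
        if (go.GetComponent<ARCameraBackground>() == null)
            go.AddComponent<ARCameraBackground>();

        // Adding AROcclusionManager is what enables automatic human occlusion
        // on supported devices.
        if (go.GetComponent<AROcclusionManager>() == null)
            go.AddComponent<AROcclusionManager>();
    }
}
```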

The new AROcclusionManager has two parameters: HumanSegmentationStencilMode and HumanSegmentationDepthMode. These two settings let you balance the quality of the depth information from ARKit against the performance cost of rendering the occlusion pass.

HumanSegmentationStencilMode has four possible values:

  • Disabled - No human stencil image is produced, and automatic human occlusion is disabled.
  • Fastest - A human stencil image with dimensions 256x192 is produced.
  • Medium - A human stencil image with dimensions 960x720 is produced.
  • Best - A human stencil image with dimensions 1920x1440 is produced.

HumanSegmentationDepthMode has three possible values:

  • Disabled - No human depth image is produced, and automatic human occlusion is disabled.
  • Fastest - A human depth image with dimensions 256x192 is produced.
  • Best - A filtering pass is applied to enhance the 256x192 human depth image.

Note that the dimensions and behaviors above come from the ARKit 3 implementation and are subject to change on future devices and/or in future ARKit SDK versions.

Modify the HumanSegmentationStencilMode value to alter the boundaries of the human segmentation. Modify the HumanSegmentationDepthMode value to alter how the real-world depth is measured. Disabling either setting disables automatic human occlusion.
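
For example, both modes can be set from a script along these lines. This is a sketch that assumes the component exposes them as humanSegmentationStencilMode and humanSegmentationDepthMode properties with matching enum types; those property names are an assumption, and in the Inspector you can simply pick the values from the two dropdowns instead.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch: balance occlusion quality against rendering cost.
// The humanSegmentation*Mode property names are an assumption for this preview.
public class HumanOcclusionQuality : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;

    void Start()
    {
        // 960x720 stencil: mid-quality segmentation boundaries.
        occlusionManager.humanSegmentationStencilMode = HumanSegmentationStencilMode.Medium;

        // 256x192 depth without the extra filtering pass: cheapest depth.
        occlusionManager.humanSegmentationDepthMode = HumanSegmentationDepthMode.Fastest;

        // Setting either mode to Disabled turns automatic human occlusion off.
    }
}
```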

Looks awesome. Any idea if the Android side is working on something similar, so that down the line (hopefully soonish) we get the same across AR Foundation for both iOS and Android?

I am using stenciling for “portals” (materials have a stencil ref and compare func, and a mask is in front of the camera). Will this collide with that functionality? Also, does it play well with LWRP/URP?

I am using URP with an iPhone 11 Pro, and I get a black screen with the latest preview versions of ARFoundation and ARKit.

Does this work alongside human body tracking?

Aside from fixing the black screen issue, I have a question: I am using SSAO as a render feature with URP. Is it possible to render the depth stencil on top of post-processing effects?

Just upgraded my project to 3.1.0-preview.1, and it is still functioning properly (showing camera data as the background). I am using the ARPipelineAsset from a branch of the ARFoundation-samples repo, not the default SRPAsset.

Hmm, not sure if I’m doing something wrong, because I basically just upgraded an existing URP project that worked on version 3.0.0. I then created a new project, set everything up to work with URP, and still got a black screen. Note that I’m using Unity 2020.

Ah, I am on 2019.2.11. I cannot speak to 2020.

For me, version 3.1.0-preview.1 works with LWRP but not URP. I have tested both 2019.3 and 2020 with URP; 2019.2 with LWRP works.

ARKit people occlusion and ARKit human body tracking cannot be used simultaneously. This is an ARKit restriction.

I would not expect any functionality collision.

The automated occlusion has been tested with LWRP 6.9.2 in Unity 2019.2.12f1 and with URP 7.1.5 in Unity 2019.3.0b12.

The AROcclusionManager provides two texture properties, humanStencilTexture and humanDepthTexture, which can be used in your custom render passes.
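
For example, a script along these lines could forward those textures to a material used by a custom pass each frame. This is a sketch; the "_HumanStencil" and "_HumanDepth" shader property names are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: forward the occlusion textures to a custom-pass material.
// "_HumanStencil" and "_HumanDepth" are hypothetical shader property names.
public class OcclusionTextureBinder : MonoBehaviour
{
    [SerializeField] AROcclusionManager occlusionManager;
    [SerializeField] Material customPassMaterial;

    void Update()
    {
        // Either texture may be null until the subsystem has produced a frame.
        var stencil = occlusionManager.humanStencilTexture;
        var depth = occlusionManager.humanDepthTexture;

        if (stencil != null)
            customPassMaterial.SetTexture("_HumanStencil", stencil);
        if (depth != null)
            customPassMaterial.SetTexture("_HumanDepth", depth);
    }
}
```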

Thank you for the feedback. I’m unable to get the latest preview version of ARFoundation working with URP, and I’ve tried both 2019.3 and 2020. I made a barebones project, just setting up the pipeline and forward renderer with the ARBackground feature and linking the pipeline in the build settings. The build gives me a black screen. I am using an iPhone 11 Pro Max.

Update: it works on 2020.1.0a14.

The black screen when using ARFoundation/ARKit 3.1.0-preview.1 with URP is reproducible by enabling post-processing on the camera.

The forum thread titled “Configuring AR Foundation 3.0.0-preview to work with LWRP or URP” contains a new post regarding URP.

https://discussions.unity.com/t/756403/page-2#post-5292990

Todd

Thank you for the update.

There is no update for Android yet.

AR Foundation human occlusion works for me! Can I expect it to work for buildings and other physical objects as well?