Using AROcclusionManager and scaled ARSessionOrigin occludes everything

Hello,

I made a post about this on Friday, but it seems to have disappeared… not sure what happened there!

I’m very excited to use the new AROcclusionManager and ARCore Depth features; however, I’m experiencing an issue whenever my ARSessionOrigin is set to a localScale other than 1.

We have some large objects and a play area tens of metres across, so we scale the ARSessionOrigin up to make things appear smaller to the camera (localScale = 50, 50, 50, for example). However, this has the effect of occluding everything, as if the depth map isn’t correctly incorporating the scale inherited by the AR Camera, and the game objects all fail the z-test. When I set my ARSessionOrigin to a localScale of 1, my objects do appear and are occluded correctly (but are massive).
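For reference, a minimal sketch of that setup, assuming AR Foundation’s ARSessionOrigin component (the script and field names here are just illustrative):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class PlayAreaScaler : MonoBehaviour
{
    [SerializeField] ARSessionOrigin sessionOrigin;

    void Start()
    {
        // Scaling the session origin up makes the AR content
        // appear smaller relative to the real world.
        sessionOrigin.transform.localScale = new Vector3(50f, 50f, 50f);
    }
}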

Is this a known bug, and if so, will it be fixed soon? Is there a workaround (use a custom material for the background, and hack the shader)? Or have I made some other mistake?

Thanks!

I have managed to make it work using a custom material for the background. The workaround steps are:

  • Find your ARCameraBackground component, and tell it to use a custom material.
  • Dig up the ARCoreCameraBackground.shader from your ARCore package and make a copy in your project. (Renaming it is a good idea, so you can tell them apart.) In the function ConvertDistanceToDepth(), apply a scale factor equal to the localScale of the ARSessionOrigin. I’ve also added a small bias to avoid z-fighting:

// _DepthScale and _DepthBias are globals, set from script via Shader.SetGlobalFloat().
float _DepthScale;
float _DepthBias;

float ConvertDistanceToDepth(float d)
{
    // Reconstruct the z-buffer parameters from Unity's built-in _ProjectionParams
    // (y = near plane, z = far plane, w = 1 / far plane).
    float zBufferParamsW = 1.0 / _ProjectionParams.y;
    float zBufferParamsY = _ProjectionParams.z * zBufferParamsW;
    float zBufferParamsX = 1.0 - zBufferParamsY;
    float zBufferParamsZ = zBufferParamsX * _ProjectionParams.w;

    // Compensate for the camera scaling inherited from the ARSessionOrigin,
    // plus a small bias to avoid z-fighting.
    d = (d * _DepthScale) + _DepthBias;

    // Clip any distances smaller than the near clip plane, and compute the depth value from the distance.
    return (d < _ProjectionParams.y) ? 1.0f : ((1.0 / zBufferParamsZ) * ((1.0 / d) - zBufferParamsW));
}

  • Create a material that uses this shader, then set it as the custom material in the ARCameraBackground.
  • Send your _DepthScale and _DepthBias values through to the shader at runtime using Shader.SetGlobalFloat(), as shown in the sketch below.
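
A minimal sketch of that last step, using a simple MonoBehaviour that pushes the values every frame (the class and field names are my own; only Shader.SetGlobalFloat() and the _DepthScale/_DepthBias property names come from the steps above):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class DepthScaleFeeder : MonoBehaviour
{
    [SerializeField] ARSessionOrigin sessionOrigin;
    [SerializeField] float depthBias = 0.01f; // tune to taste

    static readonly int DepthScaleId = Shader.PropertyToID("_DepthScale");
    static readonly int DepthBiasId = Shader.PropertyToID("_DepthBias");

    void Update()
    {
        // Assumes uniform scaling (localScale.x == y == z).
        Shader.SetGlobalFloat(DepthScaleId, sessionOrigin.transform.localScale.x);
        Shader.SetGlobalFloat(DepthBiasId, depthBias);
    }
}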

There’s a very similar procedure for ARKit.

Hope this helps someone else, and even more I hope this workaround becomes redundant soon. 🙂

Same problem here, please fix it, thanks!

Is this working for you with URP?

Yep, we have it working with the URP (using the workaround).