Simple shader to write to z-buffer works on Windows but not Mac. Anyone know why?

I need a shader that writes a constant depth value to the z-buffer for each pixel belonging to a simple quad. The idea is that I render a 2D background image first to fill the screen, then cut out various square holes in it using this shader I’m attempting to write and then finally render some 3D objects into the holes. I don’t know much about shaders but I’ve cobbled together something that works fine in the Unity editor on my Windows machine but doesn’t seem to write to the z-buffer on my Mac. Does anybody know why or have some suggestions to try? Thanks!

Shader "Custom/Erase Z-Buffer"
{
    SubShader
    {
        Tags{ "Queue" = "Geometry-1" }

        ZWrite On    // write our constant depth into the z-buffer
        ColorMask 0  // don't write any color, depth only
        ZTest Always // always pass the depth test

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct v2f
            {
                float4 position : POSITION;
            };

            struct fragOut
            {
                float depth : DEPTH;
            };

            v2f vert(appdata_base v)
            {
                v2f o;
                o.position = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fragOut frag(in v2f i)
            {
                fragOut o;
                o.depth = -100000;
                return o;
            }
            ENDCG
        }
    }
}

Generally when you clear the depth you would use a value of 1. However, as you may have noticed, you needed a negative number for this to work on Windows. As for why it’s not working on Mac, the difference is really between DirectX and OpenGL. Unity is using a trick to increase depth accuracy, but because of differences between DirectX and OpenGL, the trick doesn’t work with OpenGL. The trick Unity is using is to reverse the depth so that near is 1 and far is 0, whereas normally near would be 0 and far would be 1. As for why you would do that, you can search for reversed depth buffer precision on your favorite internet search engine. They don’t do it for OpenGL because there the depth goes from -1 to 1, and reversing it to go from 1 to -1 wouldn’t do anything useful.

What that means is that for both DirectX and OpenGL the far depth is normally 1, but on DirectX Unity sometimes uses a far depth of 0 instead.
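
To put concrete numbers on that, here’s a little helper just to illustrate (you wouldn’t actually need a function like this, and the name is made up; UNITY_REVERSED_Z is the real define Unity provides):

// Depth-buffer values for the near and far clip planes, depending on
// whether Unity has reversed the depth range on this platform.
void GetClipPlaneDepths(out float nearDepth, out float farDepth)
{
#if defined(UNITY_REVERSED_Z)
    // Unity's reversed depth (DirectX-like platforms): near = 1, far = 0.
    nearDepth = 1.0;
    farDepth = 0.0;
#else
    // Conventional direction (OpenGL-like platforms): near = 0, far = 1.
    nearDepth = 0.0;
    farDepth = 1.0;
#endif
}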

Luckily, there’s already a define you can check to know whether or not the depth is reversed. So, the solution is quite simple:

#if defined(UNITY_REVERSED_Z)
o.depth = 0; // reversed depth: the far plane is at 0
#else
o.depth = 1; // conventional depth: the far plane is at 1
#endif
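
In case it helps to see it in context, your fragment function would end up looking roughly like this (the rest of the shader stays exactly as you have it):

fragOut frag(in v2f i)
{
    fragOut o;
    // Write the depth value of the far plane, which depends on whether
    // Unity has reversed the depth range on this platform.
#if defined(UNITY_REVERSED_Z)
    o.depth = 0;
#else
    o.depth = 1;
#endif
    return o;
}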

Now you might still be wondering why -100000 worked on Windows but did nothing at all on Mac, if the range is 1 to 0, or -1 to 1. I believe this is because on Windows -100000 is still in front of the camera and gets clamped to 0, whereas on Mac -100000 is behind the camera, and might thus be getting thrown away before it would have been clamped to -1. That’s just a guess on my part though.


That worked! Awesome, thanks a million!