Why does IN.screenPos in a surface shader have different Z values on Windows and Android?

I’m using a surface shader that basically does this:

sampler2D _CameraDepthTexture;	// declared so the camera depth texture can be sampled below

struct Input
{
	float2 uv_MainTex;
	float4 screenPos;
};

void surf (Input IN, inout SurfaceOutput o)
{
	// Perspective divide to get normalized screen coordinates
	float3 normScreenPos = IN.screenPos.xyz / IN.screenPos.w;

	// Scene depth read from the depth texture vs. this fragment's own interpolated depth
	float fragmentDepth = Linear01Depth(UNITY_SAMPLE_DEPTH(tex2D(_CameraDepthTexture, normScreenPos.xy)));
	float decalDepth = Linear01Depth(normScreenPos.z);

	...
}

This is useful for applying decals to surfaces that have already written to the depth buffer, and for stopping the decals from spilling out into empty space.

For a decal triangle that maps exactly onto a pre-existing surface, the fragment depth (read from the depth texture) should closely approximate the decal depth (interpolated from the triangle’s vertex positions).
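(The elided part of surf then just compares the two values and fades the decal where they diverge; something along these lines, where _DepthTolerance is only a placeholder name rather than my actual code:)

	// Illustrative sketch: fade the decal out where its depth diverges from the scene depth
	float delta = abs(fragmentDepth - decalDepth);
	o.Alpha = 1.0 - saturate(delta / _DepthTolerance);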

This does indeed work in the Unity editor on Windows.

On Android, however, IN.screenPos.z does not behave the same as it does on Windows; it appears to be scaled incorrectly. The xy values DO seem to be correct, though.

Applying a scaling factor, e.g.:

float decalDepth = Linear01Depth(normScreenPos.z) * 1.6f; 							

This partly fixes the problem, but you would also need an offset as well as a scale, and you’d have to determine both to the required precision.

I’m guessing this is a Unity shader bug, so has anyone seen this before, and are there any plans to fix it?

My theory is that this is because the Android device is running OpenGL ES while the Windows machine is running Direct3D. OpenGL and Direct3D do not map Z depths equally: the range of normalized device coordinates in OpenGL is (-1,-1,-1) to (1,1,1), while in Direct3D it is (-1,-1,0) to (1,1,1). In other words, the difference is the Z coordinate: -1 to 1 in OpenGL, 0 to 1 in Direct3D. That would also explain why a single scale factor only partly fixed it for you: converting between the two conventions takes an offset as well as a scale.

This discrepancy between the two systems is explained in NVIDIA’s Cg Tutorial, Chapter 4 on Transformations, Section 4.1.9.
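Concretely, if screenPos.z is coming through in the GL-style -1 to 1 range, then (assuming that really is what’s happening) the remap to the 0 to 1 range used by the depth texture would just be:

	float z01 = normScreenPos.z * 0.5 + 0.5;	// remap GL NDC z from [-1,1] to [0,1]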

I’m not an expert on compiler directives in Cg, but a quick glance inside UnityCG.cginc suggests that you might be able to detect which shader API is in use with these preprocessor directives:

#if (defined(SHADER_API_GLES) || defined(SHADER_API_GLES3)) && defined(SHADER_API_MOBILE)
// Mobile OpenGL ES path
#else
// Everything else (e.g. Direct3D) path
#endif

Maybe you can use those yourself to remap the Z coordinate based on the device’s shader API?

I also took a look at the Linear01Depth function while I had UnityCG.cginc open. Judging by the name, it looks like it’s designed to compensate for exactly this issue, but it isn’t immediately clear to me what it’s doing. It remaps Z based on a float4 called _ZBufferParams declared in UnityShaderVariables.cginc, but what the engine actually stores in that variable on the CPU side, I don’t know. :-/
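For reference, the definition itself is only a couple of lines; in the copy of UnityCG.cginc I have it looks roughly like this (quoting from my local file, so check your own version):

	// Converts a Z-buffer value to linear depth in the 0..1 range
	inline float Linear01Depth( float z )
	{
		return 1.0 / (_ZBufferParams.x * z + _ZBufferParams.y);
	}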

Sorry I can’t deliver a totally concrete answer; hopefully this at least contributes to your analysis of the problem.

Looks like you were spot on! The shader works on Android and Windows now with the following adjustment:

#if (defined(SHADER_API_GLES) || defined(SHADER_API_GLES3)) && defined(SHADER_API_MOBILE)
    // OpenGL ES: remap z from the -1..1 range to 0..1 before linearising
    float decalDepth = Linear01Depth((normScreenPos.z + 1.0f) * 0.5f);
#else
    float decalDepth = Linear01Depth(normScreenPos.z);
#endif

I now think that Linear01Depth does give linear values on all platforms: I’m using the absolute difference of the two values to alpha fragments out, and that works consistently for decals throughout the scene depth, which I wouldn’t expect if the values were non-linear. This surprises me, since I’ve just read this old thread: http://forum.unity3d.com/threads/39332-_ZBufferParams-values, but I might have misunderstood it.

Thanks for your help!