Simple vertex shader broken on Metal, works on OpenGL and DirectX

Hello guys,
I've been struggling with a very strange issue for a couple of days now. I hope someone can help me.

I wrote a vertex shader which distorts models in a spherical fashion; the result is later rendered into a texture. It works on OpenGL without any issue. At first I had some trouble getting it to work on DirectX, but after a little research I found the problem: DirectX has a reversed Z depth. I inverted the z position of my vertex with the good old (1 - z) trick and everything works like a charm.

I need this shader to work on Metal too, and here comes the trouble. I expected the DirectX version to work flawlessly on Metal as well, due to the same reversed Z depth, but it doesn't work at all.

A little note before getting to the shader code: _ModelViewProj is the camera's projection matrix. While I can transform it with the following line:
projMatrix = GL.GetGPUProjectionMatrix(projMatrix, true);
I found out that DirectX does not need it; the shader works whether that line is commented out or not.
The issue:
If I use the DirectX version (1 - vert.z) on Metal… it's a mess.
If I use the OpenGL version (assuming vert.z is correct) it kind of works, but the projection is inverted, running from the camera's far plane to the near plane.
The issue is also shown in an image: the fuchsia area is what the camera sees, and the purple arrow shows the z direction. Long story short, instead of seeing the interior of the house, on Metal I see the exterior.

Here is the relevant code.

#pragma fragmentoption ARB_precision_hint_fastest
#include "UnityCG.cginc"

float4x4 _ModelViewProj;

sampler2D _MainTex;

float4 _VertScale;
float _DistortionShiz;

struct appdata {
    float4 vertex : POSITION;
    half2 texcoord : TEXCOORD0;
    #ifdef USE_LIGHTMAP
        half2 texcoord1 : TEXCOORD1;
    #endif
    #ifdef USE_VERTEX_COLOR
        fixed3 color : COLOR;
    #endif
};

struct v2f {
    float4 vertex : POSITION;
    half2 uv_MainTex: TEXCOORD1;
    float3 screenPos : TEXCOORD0;
    #ifdef USE_LIGHTMAP
        half2 uv_LightMap : TEXCOORD2;
    #endif
    #ifdef USE_VERTEX_COLOR
        fixed3 color : TEXCOORD3;
    #endif
};

v2f vert(appdata v)
{
    v2f o;
    o.vertex = mul(_ModelViewProj, v.vertex);

    ///////////
    // On DirectX I use the following line...
    //o.screenPos.z = 1 - o.vertex.z;

    // ...on OpenGL I use this one instead.
    o.screenPos.z = o.vertex.z;

    // normalize screen space positions
    float3 vert = o.vertex.xyz;
    //if I am on directX, i reverse here  vert.z
    //vert.z=1-vert.z;


    float len = length(vert);
    vert /= len + _DistortionShiz;

    // distortion code
    float div = vert.z + 1;
    vert.x /= div;
    vert.y /= div;

    // is needed or not?
    vert.z = (len + (_ProjectionParams.y *vert.z)) / (_ProjectionParams.z - _ProjectionParams.y);
    o.vertex.xyz = vert;// * _VertScale.xyz;
    o.vertex.w = 1;

    o.screenPos.xy = o.vertex.xy;
    /////////////////
  
    #ifdef NO_DEPTH_SHADER
        o.vertex.z = 1;
    #endif

    o.uv_MainTex = v.texcoord;
    #ifdef USE_LIGHTMAP
        o.uv_LightMap = v.texcoord1.xy * unity_LightmapST.xy + unity_LightmapST.zw;
    #endif
    #ifdef USE_VERTEX_COLOR
        o.color = v.color;
    #endif

    return o;
}

fixed4 frag(v2f i) : COLOR
{

    clip(i.screenPos.z+2);


    clip(1- length(i.screenPos.xy));

    fixed4 color = tex2D(_MainTex, i.uv_MainTex);
    #ifdef USE_VERTEX_COLOR
        color.rgb *= i.color; // the v2f input is "i" here, not "v"
    #endif
    #ifdef USE_LIGHTMAP
        color.rgb *= DecodeLightmapFixed(UNITY_SAMPLE_TEX2D(unity_Lightmap, i.uv_LightMap));
    #endif

    return color;
}

Hope someone can pull me out of this mess. Thank you for reading 🙂


Edit, for all of those who encounter similar issues: it was enough for me to set ZWrite Off in the shader.
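
For anyone wondering where that goes, here is a minimal sketch of the SubShader setup (the shader name and pass layout are placeholders; adapt them to your own shader):

```
Shader "Custom/SphericalDistortion" // hypothetical name
{
    SubShader
    {
        Pass
        {
            ZWrite Off // the fix: stop writing to the depth buffer

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // ... the vert/frag functions from the post go here ...
            ENDCG
        }
    }
}
```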

How about this?

#if defined(UNITY_REVERSED_Z)
//code here
#endif
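
For example, a sketch of how that macro could wrap the per-platform z handling from the vertex shader above (assuming a Unity version recent enough to define UNITY_REVERSED_Z):

```
// Inside vert(), after the projection multiply.
// UNITY_REVERSED_Z is defined on platforms where depth runs
// from 1 (near plane) to 0 (far plane), e.g. D3D11 and Metal.
float z = o.vertex.z;
#if defined(UNITY_REVERSED_Z)
    z = 1.0 - z; // the (1 - z) trick, applied only where needed
#endif
o.screenPos.z = z;
```

This way a single shader covers both conventions instead of keeping two commented-out variants.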
