Shader illumination doesn't change with object rotation if more than one object exists

I recently started writing my first shader ever.

It’s a 2D texture + normal map sprite shader. In a first pass it handles ambient lighting and the scene’s single directional light and afterwards it does an additional pass for each other light in the scene.

The additional passes function exactly as intended. The first pass also functions exactly as intended as long as only one object using the material exists in the scene.
However, if more than one object using the material exists, the illumination for the directional light doesn’t change with the object’s rotation anymore.
If you change the direction of the light, the object's illumination still reacts accordingly, but a sprite rotated by 90° has exactly the same pixels illuminated as one rotated by 0°, and I have absolutely no idea what might cause this.

This is the code for the shader’s first pass:

Pass
        {
            Tags{ "LightMode" = "ForwardBase" }
            CGPROGRAM

            #pragma vertex vert
            #pragma fragment frag
            #pragma target 2.0
            #include "UnityCG.cginc"

            uniform float4 _LightColor0;

            // User-specified properties
            uniform sampler2D _MainTex;
            uniform sampler2D _Normal;
            uniform float4 _CustomColor;
            uniform float _Shininess;
            uniform float4 _SpecColor;
   
            struct VertexInput
            {
                float4 vertex : POSITION;
                float4 vertexColor : COLOR;
                float4 uv : TEXCOORD0;
            };
   
            struct VertexOutput
            {
                float4 pos : POSITION;
                float4 vertexColor : COLOR;
                float2 uv : TEXCOORD0;
            };
   
            VertexOutput vert(VertexInput input)
                {
                VertexOutput output;
                output.pos = UnityObjectToClipPos(input.vertex);
                output.uv = input.uv;
                output.vertexColor = input.vertexColor;
                return output;
            }
   
            float4 frag(VertexOutput input) : COLOR
            {
                float4 diffuseColor = tex2D(_MainTex, input.uv);
               
                if (diffuseColor.r > 0.5 && diffuseColor.g <0.1 && diffuseColor.b <0.1)
                {
                    diffuseColor.xyz = input.vertexColor.xyz;
                }
                diffuseColor.a *= input.vertexColor.a;

                float3 ambientLighting = UNITY_LIGHTMODEL_AMBIENT.rgb;

                float3 lightDirection = normalize(_WorldSpaceLightPos0.xyz);

                float3 normalDirection = (tex2D(_Normal, input.uv).xyz - 0.5f) * 2.0f;
                normalDirection = mul(float4(normalDirection, 1.0f), unity_WorldToObject);
                normalDirection.z *= -1;
                normalDirection = normalize(normalDirection);

                float normalDotLight = dot(normalDirection, lightDirection);

                float specularLevel;
                float diffuseLevel;
                if (normalDotLight < 0.0f)
                {
                    specularLevel = 0.0f;
                    diffuseLevel = 0.0f;
                }
                else
                {
                    float3 viewDirection = float3(0.0f, 0.0f, -1.0f);
                    specularLevel = pow(max(0.0, dot(reflect(-lightDirection, normalDirection),viewDirection)), _Shininess);
                    diffuseLevel = normalDotLight;
                }

                float3 diffuseReflection = diffuseColor *_LightColor0 * diffuseLevel * diffuseColor.a + diffuseColor*ambientLighting;
                float3 specularReflection = _LightColor0 * _SpecColor * specularLevel * diffuseColor.a;

                return float4(diffuseReflection + specularReflection, diffuseColor.a);
            }
            ENDCG
        }

There are a couple of things wrong with the line that transforms the normal (and the lines around it in general).

You're correctly applying the inverse transpose of the object-to-world transform to the normal here, perhaps intentionally, perhaps by mistake. It's the inverse because you're using unity_WorldToObject, and the transpose because you're using mul(vector, matrix) instead of mul(matrix, vector). This is good.

What's bad is that the normal is a direction, not a position. The fourth component of the vector should be zero, not one, or you should be using a float3x3 matrix. That "1.0" is essentially saying "apply the translation from the matrix", which is something you never want for a direction.

```
float3 worldNormal = mul(float4(objectNormal.xyz, 0.0), unity_WorldToObject).xyz;
```

or

```
float3 worldNormal = mul(objectNormal.xyz, (float3x3)unity_WorldToObject);
```

Which you choose doesn't really matter. On some platforms one might be faster than the other, but most of the time the shader compiler is going to recognize this and compile to the "correct" one for that GPU regardless of which you choose.

But that's not the biggest problem. The biggest problem is a normal map is in _**tangent space**_, not object or world space. That seemingly arbitrary "z *= -1" is needed because these are different spaces, and Unity's transform coordinate space (be it object or world) is a different handedness than the OpenGL style tangent space Unity uses.

For a single planar object, like a sprite or quad mesh, you can get away with using the object's transform matrix and flipping the z like you're doing, as the x and y axes otherwise line up.

However, when you have two objects with the same material, Unity will attempt to batch them. Batching merges both meshes into a single mesh that is pre-transformed into world space, so a batched mesh's transform matrix is always an identity matrix (i.e. no rotation, uniform scale of 1, no translation). That means the hack of applying the object's transform to the normal stops working.

For what you're doing to work, you should properly use a tangent space rotation matrix to transform from tangent space to world space. When Unity batches meshes it also transforms the normal and tangent vectors, for this exact reason, to ensure they continue to work.

Search for "tangent space normal mapping" and you should find some examples to help you. Unity's vertex/fragment shader examples page has a shader which implements tangent space normal maps. Look for "Adding more textures"; the shaders just above and below it both have examples of calculating the tangent-to-world matrix and applying it (as three dot products). There the resulting normal is used for reflections rather than lighting, but the rest is the same.
https://docs.unity3d.com/Manual/SL-VertexFragmentShaderExamples.html
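Based on that example page, here's a minimal sketch of what the tangent-to-world transform could look like in your shader. The `tspace0`–`tspace2` interpolator names are just illustrative, and this assumes the sprite mesh provides normals and tangents (Unity's built-in sprite meshes do):

```
struct VertexInput
{
    float4 vertex  : POSITION;
    float4 color   : COLOR;
    float2 uv      : TEXCOORD0;
    float3 normal  : NORMAL;
    float4 tangent : TANGENT;
};

struct VertexOutput
{
    float4 pos     : SV_POSITION;
    float4 color   : COLOR;
    float2 uv      : TEXCOORD0;
    // one row of the tangent-to-world matrix per interpolator
    float3 tspace0 : TEXCOORD1;
    float3 tspace1 : TEXCOORD2;
    float3 tspace2 : TEXCOORD3;
};

VertexOutput vert(VertexInput input)
{
    VertexOutput output;
    output.pos = UnityObjectToClipPos(input.vertex);
    output.uv = input.uv;
    output.color = input.color;

    float3 worldNormal  = UnityObjectToWorldNormal(input.normal);
    float3 worldTangent = UnityObjectToWorldDir(input.tangent.xyz);
    // tangent.w holds the bitangent sign (handedness);
    // unity_WorldTransformParams.w accounts for negative scale
    float3 worldBitangent = cross(worldNormal, worldTangent) *
                            input.tangent.w * unity_WorldTransformParams.w;

    output.tspace0 = float3(worldTangent.x, worldBitangent.x, worldNormal.x);
    output.tspace1 = float3(worldTangent.y, worldBitangent.y, worldNormal.y);
    output.tspace2 = float3(worldTangent.z, worldBitangent.z, worldNormal.z);
    return output;
}

// then in the fragment shader, instead of the unity_WorldToObject hack:
// float3 tangentNormal = UnpackNormal(tex2D(_Normal, input.uv));
// float3 normalDirection = normalize(float3(
//     dot(input.tspace0, tangentNormal),
//     dot(input.tspace1, tangentNormal),
//     dot(input.tspace2, tangentNormal)));
```

Because the tangent-to-world matrix is built from the mesh's own normal and tangent vectors rather than the object's transform, it keeps working after Unity batches the meshes.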

Thank you, that helps me out a lot.

The normal transformation hack is something I saw in several tutorials for sprite shaders, and they usually mentioned that it's a shortcut that only works in 2D, but none of them said anything about the batching issue.

The only thing that still confuses me a bit is that I used the same hack for the second pass and it doesn’t seem to cause any issues there. Is this because Unity doesn’t batch objects together for additional passes?

Try using the frame debugger to step through and see what Unity is doing.