Is ShadeSH9 affected by mesh scale?

I’m working on a system that takes the interpolated light probe information at an object’s location and applies it to duplicates of that object drawn at different locations and orientations via Graphics.DrawMesh calls. This works fine as long as the duplicates keep the original’s scale, but the lighting on duplicates at other scales misbehaves. Below you can see that the smaller duplicates appear to have harder-edged lighting, while the larger duplicates appear to receive more ambient light.


Below is the shader I am using. It’s very bare-bones right now; I’m not really worried about appearances beyond correct behavior at this point. I pass in an interpolated SphericalHarmonicsL2 and a matrix that maps each pixel’s normal back to the equivalent normal on the original, then use ShadeSH9 to get the appropriate light. Like I said, this works perfectly for duplicates with the same scale as the original. I’m confused why scale is making a difference, because the normals are unchanged regardless of scale.

Shader "Custom/FarsideRealtimeShader"
{
    Properties
    {
        _ContextNum("Context Number", Int) = 0
    }
        SubShader
    {
        Tags { "RenderType" = "Opaque" "Queue" = "Geometry+2" "LightMode" = "ForwardBase"} //I'm not sure I still need forward base if I'm using a custom probe.
        LOD 100

        Stencil {
            Ref[_ContextNum]
            Comp equal
        }

        Pass
        {
            Cull Off //Cull is off because some of my meshes might have negative scales.
            CGPROGRAM
            #pragma target 3.0
            #pragma vertex vert
            #pragma fragment frag
            #pragma multi_compile_fog

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
                float3 normal : NORMAL;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
                float3 worldPos : TEXCOORD2;
                float3 worldNormal : TEXCOORD3;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;
            uniform float4x4 _Matrix; //Matrix is a targetSpace -> originalSpace conversion matrix. Used for mimicking the original orientation of the object.

            v2f vert(appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                UNITY_TRANSFER_FOG(o,o.vertex);
                float3 copyNormal = UnityObjectToWorldNormal(v.normal);
                o.worldNormal = mul((float3x3)_Matrix, copyNormal); //Orient the normal to the original's space. Cast to float3x3 to avoid implicit truncation of the float4x4.
                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                half3 light = ShadeSH9(half4(i.worldNormal, 1.0));
                col.xyz *= light; //rgb seems to work alright too. Need to research the difference.

                // apply fog
                UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }
            ENDCG
        }
    }
}

I’m fairly new to shaders, so any pointers are appreciated.

ShadeSH9 and the other spherical-harmonics functions all expect a normalized float3 as the xyz of the input; an unnormalized vector scales the resulting brightness by its magnitude. That’s almost certainly what you’re seeing: the normals are unchanged on the mesh, but any scale carried in your matrices, plus the interpolation between the vertex and fragment stages, changes the vector’s length by the time it reaches ShadeSH9.
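Concretely, in your fragment function the fix is a one-liner; a sketch against the code you posted:

            fixed4 frag(v2f i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv);
                // Renormalize: interpolation (and any scale in the matrices)
                // changes the vector's length between vertex and fragment stages.
                half3 light = ShadeSH9(half4(normalize(i.worldNormal), 1.0));
                col.xyz *= light;
                UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }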

Got it. That is good to know, and shouldn’t be a difficult fix.

Something I read suggested that the 1 in the fourth component was supposed to normalize the vector, but that doesn’t make much sense. What does the fourth component actually do? I’m trying to figure out whether I need to normalize the half3 or the half4.

The w of 1.0 has nothing to do with normalization here. For spherical harmonics it just pulls in the constant L0 band color; at no point does the w interact with the xyz of the input normal.
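You can see this in UnityCG.cginc (abridged here): the L0 term sits in the w of the unity_SHA* constants, so it falls out of the same dot product as the L1 terms.

            // From UnityCG.cginc (abridged). Linear (L1) + constant (L0) terms:
            half3 SHEvalLinearL0L1(half4 normal)
            {
                half3 x;
                // unity_SHAr/g/b.xyz hold the L1 coefficients and .w holds the
                // L0 term, so w = 1.0 adds the constant term at full strength.
                x.r = dot(unity_SHAr, normal);
                x.g = dot(unity_SHAg, normal);
                x.b = dot(unity_SHAb, normal);
                return x;
            }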

The ShadeSH9 function is ancient. Really there should be an overload that takes a float3 and auto-appends the 1.0; plenty of other built-in Unity functions work that way, or even outright ignore the w component when you do pass in a float4. You could argue that leaving w exposed is useful if for some reason you wanted to scale the brightness of the ambient light, but in practice you’d just multiply the returned ambient color instead. The function has existed mostly unchanged for the better part of a decade and wasn’t really written with ease of use in mind.
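Nothing stops you from adding that overload yourself, though. A hypothetical convenience wrapper (the name is mine, not a Unity built-in):

            // Hypothetical helper, not part of UnityCG.cginc: normalizes the
            // direction and appends the 1.0 that selects the L0 term.
            half3 ShadeSH9Dir(half3 dir)
            {
                return ShadeSH9(half4(normalize(dir), 1.0));
            }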

Normalize the xyz of the world normal, append a 1.0. Done.
