Surface Deferred Toon Shading

Hey everyone,
I am stuck with a lighting function problem. If anyone can point me to some documentation or has any tips I’d be super happy.

I wrote a post processing Camera Shader that compares depth values to generate outlines around the meshes.

The meshes use a surface Shader with a custom lighting function to have hard shadows and distance based lighting from point / spot lights:

inline float4 LightingToon (SurfaceOutputCustom s, fixed3 lightDir, half3 viewDir, fixed atten)
{
    half NdotL = dot(s.Normal, lightDir);
    half rim = dot(viewDir, s.Normal);

    // fold the diffuse term into the attenuation
    atten *= 2 * NdotL;

    half adjustedAtten = 0.0;
    #ifdef USING_DIRECTIONAL_LIGHT
        // hard two-tone cut for directional lights
        adjustedAtten = step(0.5, atten);
    #else
        // sharper, distance-based falloff for point / spot lights
        float sharpAtten = max(distance(_WorldSpaceLightPos0.xyz, s.WorldPos.xyz) / atten, atten * 4);
        adjustedAtten = saturate(lerp(sharpAtten, atten * 2, 0.1));
    #endif

    float4 baseCol = float4(s.Albedo * _LightColor0.rgb, 0);

    // apply the toon-adjusted attenuation
    float4 litSurface = lerp(float4(0,0,0,0), baseCol, adjustedAtten);
    litSurface = lerp(litSurface, _OutlineColor, (1 - rim) * _OutlineStrength); // rim

    return litSurface;
}

Here is where it gets tricky: the mesh shader also does some vertex displacement in my custom vertex:vert program. (Meshes are “melting”, in my case.) However, when using a custom lighting function AND a custom vertex program, the depth buffer does not seem to get updated, so the outlines from the post-processing shader still trace the “unmelted” shape of the mesh, while the mesh itself is “melted”.

So I dug deeper, had a look at Unity’s shader .cginc files, and found that this can apparently be fixed by defining a Lighting&lt;Name&gt;_Deferred function, in which I can call UnityStandardDataToGbuffer(...) to update the g-buffer (and thus the depth) for the post processing. But since the _Deferred variant takes different parameters, my lighting calculations (which use lightDir and atten) no longer work.
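For reference, this is roughly what the deferred variant looks like (a sketch from reading UnityGBuffer.cginc / UnityPBSLighting.cginc, so names and parameters may differ between Unity versions):

inline half4 LightingToon_Deferred (SurfaceOutputCustom s, half3 viewDir, UnityGI gi,
    out half4 outGBuffer0, out half4 outGBuffer1, out half4 outGBuffer2)
{
    // no lightDir / atten here -- the actual lighting happens later, in a separate pass
    UnityStandardData data;
    data.diffuseColor = s.Albedo;
    data.occlusion = 1;
    data.specularColor = 0;
    data.smoothness = 0;
    data.normalWorld = s.Normal;

    // packs the surface data into the g-buffers
    UnityStandardDataToGbuffer(data, outGBuffer0, outGBuffer1, outGBuffer2);

    return half4(s.Emission, 1);
}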

It might be I am missing something stupid here, but is there an easy way to:

  1. Have a custom lighting function and
  2. have a correctly updated g buffer for post processing
    at the same time?

It might also be that I have to rethink how to do the toon lighting inside a _Deferred lighting function? I am kinda lost…

Any help is great!

If you’re using a custom lighting function, you probably don’t want to be using deferred rendering. Unity’s deferred path only supports Unity’s Standard shading model, so there’s no way for your toon shader to use the _Deferred lighting function and retain its toon shading … short of replacing Unity’s deferred lighting system entirely (which is an option).

However, there is another, easier option. Objects that aren’t using deferred-compatible shaders are still rendered into the g-buffers, but they use their shadow caster pass to fill in the depth and albedo. By default, Surface Shaders don’t generate a custom shadow caster pass, so yours is using the one from the Fallback shader. Really the only thing you need to do to get your melting Surface Shader to show up properly in the depth is add addshadow to your #pragma surface line. As long as you don’t need the normals from the g-buffers, that’ll get you what you need.
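In other words, something like this (with Toon standing in for whatever your lighting function is actually named):

// addshadow makes the Surface Shader generate its own shadow caster pass
// using your vert function, so the depth sees the same deformation
#pragma surface surf Toon vertex:vert addshadow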

Otherwise, if you need the normals, you’ll want to disable deferred rendering entirely and modify Internal-DepthNormalsTexture.shader to add a custom RenderType that includes your custom vertex code. Or modify the deferred shading system to allow it to do toon shading.

Thanks for your reply, bgolus.

Interestingly, my shader already uses addshadow, and the shadows cast by the mesh are updated correctly with the vertex program, while the depth is only updated correctly where it exceeds the depth of the mesh before vertex deformation. Very strange. Maybe the problem is not the mesh shader, but the post-processing shader?

In the picture below, you can see how the mesh “melts”, and how the vertex deformed shape is perceived in the post processing outline Shader. Notice how the original outline is still visible, though it should not be:

Shader "Custom/Mesh/Melt"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Color Texture (RGB)", 2D) = "white" {}

        [Space(32)]

        _OilColor("Oil Color", Color) = (1,1,1,1)
        _OilTex ("Oil Texture (RGB)", 2D) = "white" {}
        _OilTransition("Oil Transition", Vector) = (0.1,0.3,0,0)

        _OilDispX("Oil Displacement X", Vector) = (0,0,0,0)
        _OilDispY("Oil Displacement Y", Vector) = (0,0,0,0)
        _OilDispZ("Oil Displacement Z", Vector) = (0,0,0,0)
        _OilFade("Oil Fade", Range(0,1)) = 0.5
        _OilSphereSize("Oil Sphere Size", Float) = 0.15

        [Space(32)]

        _OutlineStrength("Outline Strength", Range(0.0, 1.0)) = 0.0
        _OutlineColor("Outline Color", Color) = (0,0,0,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        #pragma surface surf Stylized addshadow vertex:vert
        #pragma target 4.0

        sampler2D _MainTex;
        sampler2D _OilTex;

        struct SurfaceOutputCustom
        {
            fixed3 Albedo;
            fixed3 Normal;
            fixed3 Emission;
            fixed Alpha;
            float3 WorldPos;
        };

        struct Input
        {
            float2 uv_MainTex;
            float3 worldPos;
            float3 vertex;
            fixed change;
        };

        fixed4 _Color;
        fixed4 _OilTransition;
        fixed4 _OilColor;
        fixed4 _OilDispX;
        fixed4 _OilDispY;
        fixed4 _OilDispZ;
        fixed _OilFade;
        half _OilSphereSize;
        half _OutlineStrength;
        half4 _OutlineColor;

        void vert (inout appdata_full v, out Input o) {
          
            fixed change = smoothstep(v.color.r - _OilTransition.x, v.color.r + _OilTransition.x, _OilFade * 2 - 0.5);

            // morph to sphere
            float3 toSphere = normalize(v.vertex.xyz) * _OilSphereSize;
            v.vertex.xyz = lerp( v.vertex.xyz, toSphere.xyz, change );

            // wobble noise
            v.vertex.x += sin(v.vertex.x * _OilDispX.x + _Time.y * _OilDispX.y + v.vertex.y * _OilDispX.w) * _OilDispX.z * change;
            v.vertex.y += sin(v.vertex.x * _OilDispY.x + _Time.y * _OilDispY.y + v.vertex.z * _OilDispY.w) * _OilDispY.z * change;
            v.vertex.z += sin(v.vertex.x * _OilDispZ.x + _Time.y * _OilDispZ.y + v.vertex.x * _OilDispZ.w) * _OilDispZ.z * change;

            // approximate the new normals: on a sphere the normal is just
            // the direction from the center, so blend towards that
            float3 toCenter = normalize(v.vertex.xyz);
            v.normal = lerp(v.normal, toCenter, change);

            UNITY_INITIALIZE_OUTPUT(Input,o);

            o.change = change;
            o.vertex = v.vertex;
        }
      
        void surf (Input IN, inout SurfaceOutputCustom o)
        {
            // get real color
            fixed4 realCol = tex2D (_MainTex, IN.uv_MainTex) * _Color;
            fixed4 oilCol = tex2D (_OilTex, float2(IN.vertex.y + _Time.x * 2, IN.vertex.x + _Time.x)) * _OilColor;
          
            // mix with oil color
            realCol = lerp(realCol, oilCol, smoothstep(0, _OilTransition.y, IN.change));

            o.Albedo = realCol.rgb;
        }

        inline half4 LightingStylized (SurfaceOutputCustom s, fixed3 lightDir, half3 viewDir, fixed atten)
        {
            return half4(s.Albedo, 1);
        }

        inline float4 LightingToon (SurfaceOutputCustom s, fixed3 lightDir, half3 viewDir, fixed atten)
        {
            half NdotL = dot(s.Normal, lightDir);
            half rim = dot(viewDir, s.Normal);

            // fold the diffuse term into the attenuation
            atten *= 2 * NdotL;

            half adjustedAtten = 0.0;
            #ifdef USING_DIRECTIONAL_LIGHT
                // hard two-tone cut for directional lights
                adjustedAtten = step(0.5, atten);
            #else
                // sharper, distance-based falloff for point / spot lights
                float sharpAtten = max(distance(_WorldSpaceLightPos0.xyz, s.WorldPos.xyz) / atten, atten * 4);
                adjustedAtten = saturate(lerp(sharpAtten, atten * 2, 0.1));
            #endif

            float4 baseCol = float4(s.Albedo * _LightColor0.rgb, 0);

            // apply the toon-adjusted attenuation
            float4 litSurface = lerp(float4(0,0,0,0), baseCol, adjustedAtten);
            litSurface = lerp(litSurface, _OutlineColor, (1 - rim) * _OutlineStrength); // rim

            return litSurface;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
Shader "Custom/Post/Colored Outlines"
{
    //show values to edit in inspector
    Properties{
        [HideInInspector]_MainTex ("Texture", 2D) = "white" {}
        _OutlineColor ("Outline Color", Color) = (0,0,0,1)
        _DepthMult ("Depth Outline Multiplier", Range(0,1)) = 0.5
        _DepthBias ("Depth Outline Bias", Range(0,1)) = 1

        _FadeDistance ("Fade Distance", Float) = 25
        _FadeFalloff ("Fade Falloff", Float) = 0.2

        _PixelOffset ("Pixel Offset", Range(0,4)) = 2
        _SaturationBoost ("Saturation Boost", Range(0,0.1)) = 0.025
        _BrightnessBoost ("Brightness Boost", Range(0,2)) = 0.25
    }

    SubShader{
        // markers that specify that we don't need culling
        // or comparing/writing to the depth buffer
        Cull Off
        ZWrite Off
        ZTest Always

        Pass{
            CGPROGRAM
            //include useful shader functions
            #include "UnityCG.cginc"

            //define vertex and fragment shader
            #pragma vertex vert
            #pragma fragment frag

            //the rendered screen so far
            sampler2D _MainTex;
            //the depth normals texture
            sampler2D _CameraDepthNormalsTexture;
            //texelsize of the depthnormals texture
            float4 _CameraDepthNormalsTexture_TexelSize;

            //variables for customising the effect
            float4 _OutlineColor;
            half _DepthMult;
            half _DepthBias;

            float _FadeDistance;
            float _FadeFalloff;

            half _PixelOffset;
            half _SaturationBoost;
            half _BrightnessBoost;

            //the object data that's put into the vertex shader
            struct appdata{
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            //the data that's used to generate fragments and can be read by the fragment shader
            struct v2f{
                float4 position : SV_POSITION;
                float2 uv : TEXCOORD0;
                float4 scrPos : TEXCOORD1;
            };

            //the vertex shader
            v2f vert(appdata v){
                v2f o;
              
                //convert the vertex positions from object space to clip space so they can be rendered
                o.position = UnityObjectToClipPos(v.vertex);
                o.scrPos = ComputeScreenPos(o.position);
                o.uv = v.uv;
              
                return o;
            }

            float Compare(inout float depthOutline, float baseDepth, float3 baseNormal, float2 uv, float2 offset){
              
                //read neighbor pixel
                float4 neighborDepthnormal = tex2D(_CameraDepthNormalsTexture, uv + _CameraDepthNormalsTexture_TexelSize.xy * offset);

                float3 neighborNormal;
                float neighborDepth;
                DecodeDepthNormal(neighborDepthnormal, neighborDepth, neighborNormal);
                neighborDepth = neighborDepth * _ProjectionParams.z;
              
                float depthDifference = baseDepth - neighborDepth;

                depthOutline = depthOutline + depthDifference;

                return neighborDepth;
            }

            //the fragment shader
            fixed4 frag(v2f i) : COLOR{
                //read depthnormal
                float4 depthnormal = tex2D(_CameraDepthNormalsTexture, i.uv);

                //decode depthnormal
                float3 normal;
                float depth;
                DecodeDepthNormal(depthnormal, depth, normal);  

                //get depth as distance from camera in units
                depth = depth * _ProjectionParams.z;              
              
                // preparing depth difference
                float depthDifference = 0;

                // preparing color picker
                half curDepth = depth;
                half lastDepth = depth;
                half2 lineColUV = float2(0,0);

                // check depth with neighbouring pixels
                curDepth = Compare(depthDifference, depth, normal, i.uv, float2(_PixelOffset, 0));
                lineColUV = lerp(lineColUV, float2(_PixelOffset, 0), step(0, lastDepth - curDepth));
                lastDepth = lerp (lastDepth, curDepth, step(0, lastDepth - curDepth));

                curDepth = Compare(depthDifference, depth, normal, i.uv, float2(0, _PixelOffset));
                lineColUV = lerp(lineColUV, float2(0, _PixelOffset), step(0, lastDepth - curDepth));
                lastDepth = lerp (lastDepth, curDepth, step(0, lastDepth - curDepth));

                curDepth = Compare(depthDifference, depth, normal, i.uv, float2(0, -_PixelOffset));
                lineColUV = lerp(lineColUV, float2(0, -_PixelOffset), step(0, lastDepth - curDepth));
                lastDepth = lerp (lastDepth, curDepth, step(0, lastDepth - curDepth));

                curDepth = Compare(depthDifference, depth, normal, i.uv, float2(-_PixelOffset, 0));
                lineColUV = lerp(lineColUV, float2(-_PixelOffset, 0), step(0, lastDepth - curDepth));
                lastDepth = lerp (lastDepth, curDepth, step(0, lastDepth - curDepth));

                half maskingDepth = pow(saturate(lastDepth / _FadeDistance), _FadeFalloff);

                // normalizing depth difference
                depthDifference = depthDifference * pow(_DepthMult * 4, 2);
                depthDifference = saturate(depthDifference);
                depthDifference = pow(depthDifference, _DepthBias * 4);

                // masking
                depthDifference = min(depthDifference, 1-saturate(maskingDepth)) * 2;

                // grab pre-pass color
                float4 sourceColor = tex2D(_MainTex, i.uv);

                // convert from normalized to screen space
                float4 lineCol = tex2D(_MainTex, i.uv + lineColUV * _CameraDepthNormalsTexture_TexelSize.xy);

                // approximate luminance of the outline color
                float bw =
                    lineCol.r * 0.2 +
                    lineCol.g * 0.7 +
                    lineCol.b * 0.1;

                half3 gray = half3(bw, bw, bw);
                half3 diff = lineCol.rgb - gray;

                // push the saturation up, then brighten
                diff *= _SaturationBoost * 200;
                half4 saturated = half4(clamp(gray + diff, 0.0, 1.0), 1);
                saturated *= 1 + _BrightnessBoost * 4;
              
                float4 color = lerp(sourceColor, saturated, depthDifference);
                return color;
            }
            ENDCG
        }
    }
}

The fact your object shows up at all makes me think you’re not using deferred rendering at all, meaning adding the _Deferred function shouldn’t do anything for you.

Thanks again. After further testing, I have to say yes, you are completely correct. I checked and saw that if I switch the Scene View debug mode to “Deferred → Albedo”, my Materials all just show up as black. (Default Unity Materials are visible, though.)

I got the mesh Shader to be visible in the debug view by changing back to:

#pragma surface surf Standard addshadow vertex:vert

And then, the post processing outlines also work, at least.

Now my question is again: why does the depth pass not behave the same with Standard as it does when I use a custom lighting function, like so:

inline half4 LightingSimple (SurfaceOutputCustom s, fixed3 lightDir, half3 viewDir, fixed atten)
{
    return half4(s.Albedo, 1);
}

SurfaceOutputCustom:

struct SurfaceOutputCustom
{
        fixed3 Albedo;
        fixed3 Normal;
        fixed3 Emission;
        fixed Alpha; // stores IN.change
        float3 WorldPos; // stores position of the pixel in Unity space
};

For some reason that I cannot grasp, the Standard lighting function goes to Deferred, while the custom lighting function goes to Forward?

Because:

Apart from a few novel exceptions, deferred renderers always only support one shading model, because that was kind of a significant factor in what made deferred fast to render to begin with.

One of the few exceptions was the first Destiny game, which you could argue was still a single lighting model for everything. Destiny used something similar to a toon shader that used a texture gradient for the lighting lookup. The key was you could specify which gradient to use per pixel, as it was an offset into a large atlas of gradients.
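As a rough sketch of that idea (hypothetical names, nothing like Destiny’s actual code): the deferred lighting pass samples a ramp atlas instead of evaluating one fixed BRDF, with the per-pixel ramp index read back from the g-buffer.

// _RampAtlas: each row of the texture is a different lighting gradient
// rampIndex: per-pixel value recovered from the g-buffer, selecting the row
half3 RampLighting (half NdotL, half rampIndex, sampler2D rampAtlas)
{
    // remap NdotL from [-1,1] to [0,1] and look up the gradient for this pixel
    float2 rampUV = float2(NdotL * 0.5 + 0.5, rampIndex);
    return tex2Dlod(rampAtlas, float4(rampUV, 0, 0)).rgb;
}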

Some more recent deferred renderers will encode extra data to let you switch between a small handful of shading models, but each one you add makes everything slower, so it has to be used with caution. Unity had two assets, Lux and Alloy, that partially replaced Unity’s deferred rendering to do just that, adding support for translucency & skin rendering.

So much insight, thank you!

So in other words, if I want to retain the vertex:vert deformation and get correct depth output for the post processing, I have to use deferred. And if I use deferred, I have to use the Standard lighting function. So no toon shading in this scenario.

Or do you think switching to URP would help?

How did you do that collider mask?