How do I make a shader produce normals for use in post processing

I ran into an issue after implementing a custom fullscreen post-process Shader Graph shader in URP.
The shader is fairly simple and uses normal comparisons to draw outlines. It works with the default URP shaders.

However, my project uses some custom shaders that were originally written for the Built-in pipeline and then automatically upgraded when I moved over to URP. I have included one below as an example.

Based on debugging, I'm pretty sure the problem is that pixels written by my custom shaders have a normal of (0, 0, 0): they come out either completely clear or completely black depending on what normal-difference threshold I set in the post-processing shader.
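For context, the comparison the effect does is conceptually something like this. My real effect is a fullscreen Shader Graph, so the NormalOutline function and the _NormalThreshold / _TexelSize parameters here are just illustrative names, and I'm assuming the normals come from URP's _CameraNormalsTexture:

// Conceptual sketch only, not my actual Shader Graph.
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

TEXTURE2D(_CameraNormalsTexture);
SAMPLER(sampler_CameraNormalsTexture);
float2 _TexelSize;       // illustrative: one pixel step in UV space
float  _NormalThreshold; // illustrative: the outline sensitivity I tune

float NormalOutline(float2 uv)
{
    float3 c = SAMPLE_TEXTURE2D(_CameraNormalsTexture, sampler_CameraNormalsTexture, uv).xyz;
    float3 r = SAMPLE_TEXTURE2D(_CameraNormalsTexture, sampler_CameraNormalsTexture,
                                uv + float2(_TexelSize.x, 0)).xyz;
    float3 u = SAMPLE_TEXTURE2D(_CameraNormalsTexture, sampler_CameraNormalsTexture,
                                uv + float2(0, _TexelSize.y)).xyz;

    // The edge term grows as neighbouring normals diverge. Regions where every
    // normal comes back as (0, 0, 0) produce a constant edge value of 1, so they
    // end up either fully outlined or not outlined at all depending on the
    // threshold, which matches the all-black / all-clear behaviour described above.
    float edge = max(1.0 - dot(c, r), 1.0 - dot(c, u));
    return step(_NormalThreshold, edge);
}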

Does anyone know what code I need to put into my shaders to get them to produce normal information in URP? I’ve looked through the URP shaders a bit but they are very hard to parse.

Shader "Custom/NearsideLitShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
		_Tint ("Tint", Color) = (1,1,1,1)
		_ContextNum("Context Number", Int) = 0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" "Queue"="Geometry" }
        LOD 100

		Stencil {
			Ref[_ContextNum]
			Comp equal
		}

		Pass
		{
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			// make fog work
			//#pragma multi_compile_fog

			#include "UnityCG.cginc"

			struct appdata
			{
				float4 vertex : POSITION;
				float2 uv : TEXCOORD0;
				float2 light_uv : TEXCOORD1;
				float3 normal : NORMAL;
			};

			struct v2f
			{
				float2 uv : TEXCOORD0;
				float2 light_uv : TEXCOORD1;
				UNITY_FOG_COORDS(2) // moved off interpolator 1, which light_uv already uses
				float4 vertex : SV_POSITION;
				float3 normal : NORMAL;
			};

			sampler2D _MainTex;
			fixed4 _Tint;
			float4 _MainTex_ST;

			v2f vert(appdata v)
			{
				v2f o;
				o.vertex = UnityObjectToClipPos(v.vertex);
				o.uv = TRANSFORM_TEX(v.uv, _MainTex);
				o.light_uv = v.light_uv.xy * unity_LightmapST.xy + unity_LightmapST.zw;
				UNITY_TRANSFER_FOG(o,o.vertex);
				o.normal = v.normal;
				return o;
			}

			fixed4 frag(v2f i) : SV_Target
			{
				// sample the texture
				fixed4 col = tex2D(_MainTex, i.uv) * _Tint;
				col *= UNITY_SAMPLE_TEX2D(unity_Lightmap, i.light_uv);
				// apply fog
				UNITY_APPLY_FOG(i.fogCoord, col);
				return col;
			}
			ENDCG
		}
	}
}

I noticed that there is a separate DepthNormalsPass in URP shaders. Is that something that I need to include? I thought that URP didn’t allow multiple passes, so I was confused to see it.

I think you need to write the normals to the intermediate texture, yeah.
Maybe look into the fullscreen shader graph.

I did in fact need to add a DepthNormals pass to my shaders. With this added, my post-processing shader started getting good normal information and was able to function as intended.
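For anyone who finds this later, here is roughly what I added, as a minimal sketch rather than the exact pass. It assumes a recent URP where the camera normals texture stores world-space normals; older versions octahedral-encode view-space normals in their DepthNormalsPass.hlsl, so check the pass that ships with your package. The DepthNormalsVert / DepthNormalsFrag names are my own.

// Minimal sketch of the extra pass, added inside the existing SubShader.
Pass
{
    Name "DepthNormals"
    Tags { "LightMode" = "DepthNormals" }

    ZWrite On
    Cull Back

    HLSLPROGRAM
    #pragma vertex DepthNormalsVert
    #pragma fragment DepthNormalsFrag

    #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

    struct Attributes
    {
        float4 positionOS : POSITION;
        float3 normalOS   : NORMAL;
    };

    struct Varyings
    {
        float4 positionCS : SV_POSITION;
        float3 normalWS   : TEXCOORD0;
    };

    Varyings DepthNormalsVert(Attributes input)
    {
        Varyings output;
        output.positionCS = TransformObjectToHClip(input.positionOS.xyz);
        output.normalWS   = TransformObjectToWorldNormal(input.normalOS);
        return output;
    }

    half4 DepthNormalsFrag(Varyings input) : SV_Target
    {
        // Depth goes to the depth target via ZWrite; the color output lands
        // in the camera normals texture during the DepthNormals prepass.
        return half4(normalize(input.normalWS), 0.0);
    }
    ENDHLSL
}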

The issue was that I wasn’t getting good normal input into my fullscreen shader.

I solved the issue by adding a DepthNormals pass to my object shaders. Can you confirm whether that is the correct (or only) way to do it in the URP pipeline? It would be nice if I could output pixel color and normals in a single pass.

I’ve only done it with the full screen effect render feature. Maybe check the code that it uses?

That is a good suggestion. I created a simple unlit Shader Graph from the template and specified that it should output depth, and it did generate the extra passes, though it’s not entirely clear whether it would have generated them automatically anyway.

I’ve done some more experiments, this time trying to get depth information into my post-processing effect, and ran into similar problems.

To recap, this is what I have figured out.

Problem: My older shaders still work fine in URP, but their normal and depth information is not visible to a Shader Graph fullscreen post-process effect (even when the render feature is explicitly told to inject that information).

Incomplete Solution: There is a way to fix this. If I add a DepthNormals pass with the appropriate LightMode tag ("LightMode" = "DepthNormals") to my shaders, that information shows up in my post-process feature.

Unsolved Problem: I wasn’t getting depth/normals info for objects on the far side of portals. Because I’m working with stencil portals, I render my passes in a specific order: nearside, portal surface, then farside. From what I can surmise, DepthNormals passes are rendered in a prepass, before my portal has written to the stencil buffer, so farside DepthNormals passes were being culled by the stencil check implemented at the SubShader level. I removed the stencil check from the DepthNormals pass and, sure enough, I got the depth/normal information, but of course it wasn’t constrained to the portal bounds, so this is not a workable solution.
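To make that concrete, this is the shape of the change I tried; the pass bodies are elided, since only where the Stencil block lives matters:

SubShader
{
    Tags { "RenderType"="Opaque" "Queue"="Geometry" }

    // Stencil block removed from the SubShader level so it no longer applies to every pass.

    Pass // color pass: keeps the portal stencil mask
    {
        Stencil
        {
            Ref [_ContextNum]
            Comp Equal
        }
        // ... CGPROGRAM as before ...
    }

    Pass // DepthNormals pass: no Stencil block, so it renders unmasked
    {
        Name "DepthNormals"
        Tags { "LightMode" = "DepthNormals" }
        // This runs in the prepass, before the portal surface has written the
        // stencil buffer, which is why the normals are not constrained to the
        // portal bounds.
        // ... HLSLPROGRAM as in the sketch above ...
    }
}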

I’ll have to consider my options. It might be that I just can’t use a fullscreen render feature to do this post-processing.