Using the Camera Depth Texture with Image Effects

How can I access the camera depth texture in an image effect shader? I’ve tried setting up my image effect script like so:

using UnityEngine;
using System.Collections;

[ExecuteInEditMode]
public class ImageEffect : ImageEffectBase
{
	protected override void Start()
	{
		base.Start();
		GetComponent<Camera>().depthTextureMode |= DepthTextureMode.DepthNormals;
	}

	void OnRenderImage(RenderTexture source, RenderTexture destination)
	{
		Graphics.Blit(source, destination, material);
	}
}

And my shader code:

Shader "Custom/DepthNormal" {
	Properties {
		_MainTex ("Render Input", 2D) = "white" {}
	}

	SubShader {
		ZTest Always Cull Off ZWrite Off Fog { Mode Off }
		Pass
		{
			CGPROGRAM
				#pragma vertex vert_img
				#pragma fragment frag
				#include "UnityCG.cginc"

				sampler2D _MainTex;
				// Note the "s": Unity's built-in texture is named _CameraDepthNormalsTexture
				sampler2D _CameraDepthNormalsTexture;
			
				float4 frag(v2f_img IN) : COLOR
				{
					half4 depthNormal = tex2D(_CameraDepthNormalsTexture, IN.uv.xy);
					float depth = DecodeFloatRG(depthNormal.zw);
					return half4(depth, depth, depth, 1);
				}
			ENDCG
		}
	}
}

I’ve set the camera to use deferred rendering, but I’m not getting any output on screen, just a blank grey image. Am I setting up the camera depth texture incorrectly?

Very old post, but I had the same problem recently.

It looks like a camera won’t do a depth pass if shadows are disabled. Check in the Frame Debugger whether a depth pass was actually made.

If you don’t want shadows, you have to set the camera’s depthTextureMode to DepthTextureMode.Depth yourself.

This will not affect the editor’s Scene view camera, so if you want the effect there as well, you need a script with the [ExecuteAlways] attribute that also sets the mode on Camera.current.
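As a sketch of that idea, a minimal script along these lines (the class name is my own; this assumes the built-in render pipeline, where the Camera.onPreCull callback fires for every camera, including the Scene view camera) forces a depth texture everywhere:

```csharp
using UnityEngine;

// Forces a depth texture on every camera that renders, including the
// editor's Scene view camera, so depth-based image effects work there too.
[ExecuteAlways]
public class ForceDepthTexture : MonoBehaviour
{
    void OnEnable()  { Camera.onPreCull += SetDepthMode; }
    void OnDisable() { Camera.onPreCull -= SetDepthMode; }

    static void SetDepthMode(Camera cam)
    {
        // Use DepthNormals instead if your shader samples
        // _CameraDepthNormalsTexture rather than _CameraDepthTexture.
        cam.depthTextureMode |= DepthTextureMode.Depth;
    }
}
```

Subscribing to the static callback avoids having to attach the script to each camera individually.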

Also make sure that your render texture has a depth buffer. This is vital, as Unity appears to require the camera’s render texture and the image effect’s render texture to have the same properties.
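For reference, a depth buffer is requested via the third argument of the RenderTexture constructor; a rough sketch (the resolution is a placeholder):

```csharp
using UnityEngine;

public class TargetSetup : MonoBehaviour
{
    void Start()
    {
        // Third argument is the depth buffer precision in bits:
        // 0 = no depth buffer; 16 or 24 attaches one.
        var rt = new RenderTexture(1920, 1080, 24, RenderTextureFormat.ARGB32);
        rt.Create();
        GetComponent<Camera>().targetTexture = rt;
    }
}
```

A render texture created in the Project window has the same setting under its Depth Buffer field in the Inspector.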