Learning shaders here. I want to produce a greyscale image that represents my scene's depth as seen from my perspective camera: black for surfaces close to the camera, white for surfaces far away.
I've looked at dozens of threads on the subject, and I've just gotten more and more confused…
For example, there's the approach of writing a shader that outputs depth and setting it as a replacement shader on the camera, the approach of outputting depth into a render texture, and the approach of setting the camera's depthTextureMode to Depth and then sampling _CameraDepthTexture in shader code. There are so many ways, and I couldn't get any of them to work.
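For the third route (depthTextureMode plus _CameraDepthTexture), this is roughly what I pieced together from the docs as an image effect shader. It's just a sketch, the shader name is my own, and it assumes a script on the camera sets depthTextureMode to DepthTextureMode.Depth and calls Graphics.Blit with a material using this shader in OnRenderImage:

Shader "Hidden/DepthGrayscale" {
    SubShader {
        Pass {
            ZTest Always Cull Off ZWrite Off
            Fog { Mode Off }

            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            // Provided by Unity once the camera's depthTextureMode includes Depth
            sampler2D _CameraDepthTexture;

            half4 frag (v2f_img i) : COLOR {
                // Raw depth is non-linear; Linear01Depth remaps it to 0 (near) .. 1 (far)
                float d = Linear01Depth(tex2D(_CameraDepthTexture, i.uv).r);
                return half4(d, d, d, 1);
            }
            ENDCG
        }
    }
}

With depthTextureMode set, I'd expect that to give black near the camera and white towards the far plane, but I'm not sure I'm wiring it up correctly.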
And for the replacement shader route, here's a shader similar to the one in the depth texture docs:
Shader "Hidden/RenderDepth" {
SubShader {
Tags { "RenderType"="Opaque" }
Pass {
Fog { Mode Off }
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"
struct v2f {
float4 pos : SV_POSITION;
float2 depth : TEXCOORD0;
};
v2f vert (appdata_base v) {
v2f o;
o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
UNITY_TRANSFER_DEPTH(o.depth);
return o;
}
half4 frag(v2f i) : COLOR {
UNITY_OUTPUT_DEPTH(i.depth);
}
ENDCG
}
}
}
A lot of people claim this works for them; I just don't know what to do with it. I tried setting it as a replacement shader on the camera, and the output was purely black…
I also tried the replacement shader example project, which has a RenderDepth shader (similar to the one above) and a js RenderDepth script that attaches to the camera and sets the shader as its replacement. I attached the script to my camera, but again, the output was black…
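In case it matters, that js script does roughly this as far as I can tell (retyped from memory, so the variable names may be off):

// RenderDepth.js, attached to the camera; the RenderDepth shader is assigned in the Inspector
var replacementShader : Shader;

function OnEnable () {
    // re-render everything whose subshader has a matching "RenderType" tag,
    // using the replacement shader instead of each object's own shader
    camera.SetReplacementShader(replacementShader, "RenderType");
}

function OnDisable () {
    // go back to normal rendering when the script is disabled
    camera.ResetReplacementShader();
}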
Any help would be extremely appreciated!
Thanks!
