Having a bit of an annoying problem…
Using Unity’s _CameraDepthNormalsTexture in a shader works great. From what I’ve read, Unity generates it by rendering the scene with a replacement shader. But when I try to do the same thing myself, using a second camera rendering with Unity’s own built-in depth/normals shader, the depth comes out wrong while the normals are fine. And if I render only the depth with the depth replacement shader, the depth is fine too. The camera is a copy of the main camera. I’ve attached a picture showing the problem…
Yeah, my first suspicion was that it’s a texture format issue, but I’m using ARGB32, which is what the Unity manual says it uses for its internal depth/normals generation. I’m using the Camera-DepthNormals shader to render the normals into RG and 16-bit depth into BA. It almost looks like I’m only getting 8-bit depth, as if there isn’t enough precision: the banding gets exponentially worse with distance.
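For reference, the packing that shader uses is EncodeFloatRG/DecodeFloatRG from UnityCG.cginc, and in principle it does survive an 8-bit ARGB32 target with roughly 16 bits of effective depth, so ARGB32 shouldn’t be the bottleneck by itself. A quick Python sketch of that round trip (the depth value and helper names are just for illustration):

```python
import math

def encode_float_rg(v):
    # Mirrors UnityCG.cginc EncodeFloatRG: pack a [0,1) depth into two channels.
    enc_y = math.modf(v * 255.0)[0]   # frac(255 * v)
    enc_x = v - enc_y / 255.0         # equals floor(255*v)/255: exactly on the 8-bit grid
    return enc_x, enc_y

def decode_float_rg(x, y):
    # Mirrors DecodeFloatRG: dot(enc, float2(1, 1/255)).
    return x + y / 255.0

def q8(c):
    # Simulate storing a channel in an 8-bit integer render-target channel.
    return round(c * 255.0) / 255.0

depth = 0.3713
x, y = encode_float_rg(depth)

# Decoding the unquantized pair recovers the value almost exactly...
exact = decode_float_rg(x, y)
# ...and even after 8-bit storage the worst-case error is about
# 1/(2*255*255) ~ 7.7e-6, i.e. ~16 usable bits of depth.
recovered = decode_float_rg(q8(x), q8(y))
```

So the “looks like 8-bit depth” symptom shouldn’t come from the packing itself: the x channel lands exactly on the 8-bit grid by construction, and only the y channel’s quantization error (divided by 255) survives decoding.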
I don’t think the clip planes should be a problem as I’m copying everything from the main camera.
Just wanted to update this with the answer to the problem —
It did indeed turn out to be a texture format problem. Changing the format to ARGBHalf removed the banding shown above. I have no idea why the format needs to be ARGBHalf, though, since the Unity docs explicitly say the texture is ‘meant’ to be ARGB32 with 8 bits per channel: normals encoded in the R and G channels, and 16-bit depth packed into the two remaining 8-bit channels (B and A), not 16 bits per channel.
I’m fairly certain that even though it’s packed into two channels, the depth value from DepthNormalsTexture is just an approximation. Switching to ARGBHalf vastly improves the precision, but it’s unlikely that the depth values are exactly the same as what is in the actual depth buffer.
Yep, I can confirm that. Even though switching the format to ARGBHalf improves the precision significantly and has removed the banding, it’s still less precise than _CameraDepthTexture (either the native one or a manually rendered one), but thankfully it’s good enough for my purposes.
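To put rough numbers on “less precise”: two packed 8-bit channels give at best about 16 bits of depth, while _CameraDepthTexture typically samples a 24-bit hardware depth buffer. A quick worst-case quantization comparison (this ignores the non-linear distribution of hardware Z, which shifts where the error concentrates but not the overall bit budget):

```python
def quantize(v, bits):
    # Round v in [0, 1] onto a uniform unsigned grid of the given bit depth.
    levels = (1 << bits) - 1
    return round(v * levels) / levels

depths = [i / 997.0 for i in range(998)]

# DepthNormals path: two 8-bit channels, ~16 bits of effective depth.
err16 = max(abs(quantize(v, 16) - v) for v in depths)

# _CameraDepthTexture path: typically a 24-bit depth buffer.
err24 = max(abs(quantize(v, 24) - v) for v in depths)
```

The 16-bit worst-case step is 256 times coarser than the 24-bit one, which is why the DepthNormals depth will never exactly match the real depth buffer, banding or not.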