Bug? Camera Depth sorting issue when sending to Render Texture

I’m using a plain old camera (no custom projection) and sending its output to a render texture, and something really funky is happening to the depth sorting.

This is what the camera sees:

This is what RenderTexture gets:

I’m using Metal on macOS, and had the thought it probably has something to do with the usual Z-depth flip between graphics platforms, so I tried using GL.GetGPUProjectionMatrix.
GetGPUProjectionMatrix just turns it into this:

I’ve even tried messing with the shaders that sample the RenderTexture, to no avail, so my assumption is that this needs to be fixed on the camera projection end of things? Would swapping the far and near clip planes do the trick here? Or is it just a bug? Any help much appreciated!! :face_with_spiral_eyes:
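
For reference, the call I was experimenting with looks roughly like this (just a sketch; the script and camera setup are placeholders):

```csharp
using UnityEngine;

public class GpuProjectionTest : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();

        // Ask Unity for the projection matrix adjusted for the active graphics API
        // (Metal here); the second argument says we're rendering into a texture,
        // which is where the Y/Z flip conventions differ between platforms.
        Matrix4x4 gpuProj = GL.GetGPUProjectionMatrix(cam.projectionMatrix, true);

        Debug.Log(gpuProj);
    }
}
```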

I’m on LWRP Forward Rendering if that helps!

RenderTextures have an optional depth buffer. Make sure your RenderTexture has the depth buffer enabled so your objects know who’s in front!
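
If you’re creating the texture from script, a minimal sketch looks something like this (the component and camera names are just placeholders):

```csharp
using UnityEngine;

// Create a RenderTexture with a depth buffer at runtime and point the camera at it.
public class RenderTargetSetup : MonoBehaviour
{
    public Camera sourceCamera;

    void Start()
    {
        // The third constructor argument is the depth buffer size in bits:
        // 0 = no depth buffer, 16 = depth only, 24 = depth with stencil support.
        var rt = new RenderTexture(1024, 1024, 24);
        sourceCamera.targetTexture = rt;
    }
}
```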

It also wouldn’t hurt to make sure your shaders read and write depth values.

Hope that was it, good luck if it isn’t.


YES that was it!! Turns out you can only get a depth buffer on a render texture by creating it at runtime?? I had it as an asset in the project instead. Very strange, but it’s working great now. Thanks a bunch for the insight!! :slight_smile:

I’m confused; the depth buffer is the third setting on a render texture asset, right after the resolution and MSAA.

You’re right. It turns out I had created a “Custom Render Texture” instead of a regular one, and that setting is missing there :roll_eyes: