I’m using a plain old camera (no custom projection) and rendering it out to a render texture, and something really funky is happening to the depth sorting.
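For context, the setup is basically just a camera pointed at a render texture, something like this (names are placeholders, not my actual scene):

```csharp
using UnityEngine;

public class RenderToTexture : MonoBehaviour
{
    public Camera captureCamera;    // plain camera, no custom projection
    public RenderTexture target;    // at this point, a RenderTexture asset from the project

    void Start()
    {
        // Point the camera at the render texture instead of the screen.
        captureCamera.targetTexture = target;
    }
}
```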
I’m using Metal on macOS, and my first thought was that it probably has something to do with the usual Z-depth flip between graphics platforms. So I tried GL.GetGPUProjectionMatrix, but it just turns it into this:
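In case it matters, this is roughly how I was applying it, just a sketch of the attempt rather than my exact code:

```csharp
using UnityEngine;

public class ProjectionFlipAttempt : MonoBehaviour
{
    public Camera captureCamera;    // the camera rendering into the texture

    void Start()
    {
        // Ask Unity for the platform-specific GPU projection
        // (true = rendering into a texture) and push it back onto the camera.
        // This did not fix the sorting for me.
        Matrix4x4 gpuProj = GL.GetGPUProjectionMatrix(
            captureCamera.projectionMatrix, true);
        captureCamera.projectionMatrix = gpuProj;
    }
}
```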
I’ve even tried messing with the shaders that sample the RenderTexture, to no avail, so my assumption is that this needs to be fixed on the camera projection end of things? Would swapping the far and near clip planes do the trick here? Or is this just a bug? Any help much appreciated!!
YES that was it!! Turns out you can only get a depth buffer on a render texture by creating it at runtime?? I had it as an asset in the project instead. Very strange, but it’s working great now. Thanks a bunch for the insight!!
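For anyone who finds this later, the fix was roughly this: create the RenderTexture in code and give it a depth buffer, instead of assigning the asset (the size here is just an example):

```csharp
using UnityEngine;

public class RuntimeRenderTexture : MonoBehaviour
{
    public Camera captureCamera;

    void Start()
    {
        // Creating the texture at runtime lets you request a depth buffer
        // (the third argument, in bits), which is what fixed the sorting for me.
        var rt = new RenderTexture(1024, 1024, 24);
        rt.Create();

        captureCamera.targetTexture = rt;
    }
}
```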