After a few days of research on this forum, I've yet to find a definitive answer to this problem, though I'm certainly not the only person who's run into it. My case is slightly more specific, though, so hopefully it can garner an answer.
The effect I'm currently working on renders the meshes on a specific layer at a lower resolution, pixelating them. It does this after Render Opaques but before Render Transparents. This part works well: I can use CommandBuffer.Blit to blend the new opaque texture onto the old one, and as a workaround I can sample the CameraDepthTexture together with my own rendered depth texture to manually Z-clip.
However, this breaks transparents. Since I'm not writing to the camera's Z/depth buffer, transparents always render on top of my pixelated objects. I can easily produce a render texture that combines the depth buffers of the camera and the pixelated object layer, but as far as I can find, there is no way to actually set the values of the camera's depth buffer without calling DrawRenderer, DrawMesh, or some other method that isn't Blit. Even if I set the destination of Blit to the CameraDepthTexture, its values don't change.
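For reference, here's a minimal sketch of the DrawMesh route I'm describing, assuming a custom SRP pass with access to the ScriptableRenderContext. The names `m_CombinedDepth`, `fullscreenQuad`, and `depthWriteMaterial` are mine, not real Unity APIs:

```
// Sketch: after Render Opaques, try to merge the pixelated layer's depth
// back into the camera's depth buffer. Blit alone only writes color, so
// the idea is to bind the camera target as both color and depth, then draw
// a fullscreen quad with a material whose shader outputs SV_Depth.
CommandBuffer cmd = new CommandBuffer { name = "Merge Pixelated Depth" };

// m_CombinedDepth is a RenderTexture I filled earlier with the combined
// camera depth and pixelated-layer depth (hypothetical name).
cmd.SetGlobalTexture("_CombinedDepth", m_CombinedDepth);

// Bind the camera target so ZWrite in the material's pass lands in the
// camera's actual depth buffer rather than a temporary one.
cmd.SetRenderTarget(
    BuiltinRenderTextureType.CameraTarget,   // color attachment
    BuiltinRenderTextureType.CameraTarget);  // depth attachment

// depthWriteMaterial uses a shader that writes depth only (ColorMask 0).
cmd.DrawMesh(fullscreenQuad, Matrix4x4.identity, depthWriteMaterial);

context.ExecuteCommandBuffer(cmd);
cmd.Release();
```

This is only a sketch of the approach, not a working implementation; I don't know whether SetRenderTarget with the camera target as the depth attachment behaves this way in every SRP setup.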
The closest thing to a solution I can find is to use the ShadowCaster pass of a custom shader to write to the Z buffer, but the only examples I could find (there were only two) describe displacing an object's vertices in the Z buffer. Since I'm currently using Blit with a custom shader to apply image effects to the pixelated image, changing the vertices of the quad it renders with doesn't seem like an option.
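To illustrate what I mean: instead of displacing vertices, a fullscreen pass could in principle output the combined depth directly through the SV_Depth semantic. This is my own sketch (not from either example I found), and `_CombinedDepth` is my hypothetical texture name:

```
Shader "Hidden/WriteCombinedDepth"
{
    SubShader
    {
        Pass
        {
            ZWrite On       // write into whatever depth buffer is bound
            ZTest Always    // cover every pixel of the fullscreen quad
            ColorMask 0     // depth only; leave the color buffer untouched

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _CombinedDepth;  // camera depth merged with my layer's depth

            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert(appdata_img v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                return o;
            }

            // Returning SV_Depth replaces the fragment's interpolated depth
            // with the sampled value, so the quad's own geometry is irrelevant.
            float frag(v2f i) : SV_Depth
            {
                return tex2D(_CombinedDepth, i.uv).r;
            }
            ENDCG
        }
    }
}
```

If something like this is viable, it would sidestep the vertex-displacement approach entirely, though I'd expect a cost from disabling early-Z on that pass.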
I only need this to work on Windows PC, not mobile or any other platform. If I can get it working in any form with Unity's SRP, I'll be happy.
I know it's likely that no one knows the answer to this, or that I'm so green that this level of technical know-how is beyond me, or that I should just give up and use PPv2 or the Built-in Render Pipeline. From what I've read, Unity plans to add more functionality for generating render data and injection points, but that's a long way off, and I would still really appreciate any help or advice on the subject.
At the very least, if this is straight-up impossible for some reason I'm not aware of, I hope asking this question shows the Unity devs that one more person is interested in accessing and manipulating this data. I'm just trying to get this one effect working, but I have a lot more creative ideas to try if I can.