[SOLVED] Change ZTest value for objects rendered in a custom pass.

In the outline custom pass example, objects are rendered into a custom buffer with their original material (say “Lit”). The only problem is that they can’t be rendered that way, because the ZTest is set to “Equal”, which is the default for the Lit shader in the Forward rendering pass. I want to create an effect where an entire mesh slowly fades away when it starts blocking the camera. I don’t want to use Transparent materials for it; instead I want to stop rendering the mesh normally the moment it crosses the camera and render it with the custom pass instead. I will then blend the custom render texture into my scene with a variable alpha and be a happy camper. All I need right now is the ability to set the ZTest value to LEqual for the Lit material while the mesh is being rendered into my custom render texture by the custom pass.

I know that the Forward rendering path renders every mesh twice (first into the depth buffer, then into the color buffer), and I would ideally like to avoid this: I would prefer my models to be rendered in a single pass with ZTest set to LEqual.

Ok, I figured out that you can influence which passes the pipeline uses for rendering via the ShaderTagId array that is passed to the RendererListDesc constructor. However, I cannot make the pipeline render my model in two passes properly: it seems to render depth correctly with the DepthOnly pass, but then it either renders the same depth values into the color buffer with the Forward pass or simply renders nothing, so both the depth buffer of my RT and the color buffer end up containing the same values in the red channel.
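Roughly what I’m doing (a simplified sketch; this assumes the HDRP 8.x-era CustomPass.Execute signature, usings are omitted because the RendererList types live in different namespaces depending on the SRP version, and outlineLayer is just a field from my project):

        // Inside my CustomPass subclass:
        // protected override void Execute(ScriptableRenderContext renderContext, CommandBuffer cmd,
        //                                 HDCamera hdCamera, CullingResults cullingResult)

        // Render both the depth prepass and the forward pass of the Lit shader.
        var shaderTags = new ShaderTagId[]
        {
            new ShaderTagId("DepthOnly"),
            new ShaderTagId("Forward"),
            new ShaderTagId("ForwardOnly"),
        };

        var desc = new RendererListDesc(shaderTags, cullingResult, hdCamera.camera)
        {
            renderQueueRange = RenderQueueRange.opaque,
            sortingCriteria = SortingCriteria.CommonOpaque,
            layerMask = outlineLayer, // layer containing the meshes I want in the outline buffer
        };

        // OutlineBuffer is the RTHandle allocated in Setup (see below).
        CoreUtils.SetRenderTarget(cmd, OutlineBuffer, ClearFlag.All);
        HDUtils.DrawRendererList(renderContext, cmd, RendererList.Create(desc));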

I have no clue how to fix this.

I opened the code for DrawRenderersCustomPass and found out about the stateBlock that goes into the RendererListDesc. I set the state block and changed the ZTest value, but the result is still a red object. This is because, once I set depth bits on my allocated RenderTarget:

        OutlineBuffer = RTHandles.Alloc(
            Vector2.one, TextureXR.slices, depthBufferBits: DepthBits.Depth32, dimension: TextureXR.dimension,
            colorFormat: GraphicsFormat.B10G11R11_UFloatPack32,
            useDynamicScale: true, name: "Outline Buffer"
        );

the whole RenderTarget turns into a depth render target:

[screenshot attachment: upload_2020-7-27_11-48-41.png]

How do I attach a depth buffer properly? If I set no depth bits, the model is rendered with no depth testing at all, as if no depth buffer were attached.

Ok, I got it: the depth buffer must be a separate render target, allocated alongside the color buffer, and the color buffer must be allocated with no depth bits (depthBufferBits: DepthBits.None). Then both render targets are bound together via CoreUtils.SetRenderTarget(), and everything works.
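A rough sketch of the final setup, in case it helps someone later (same assumptions as above: HDRP 8.x-era CustomPass API, usings omitted, and outlineLayer plus the buffer names are my own; I also skip the DepthOnly prepass here since the state block forces depth writes and LEqual anyway):

        // In Setup: the color buffer has NO depth bits...
        outlineColorBuffer = RTHandles.Alloc(
            Vector2.one, TextureXR.slices, dimension: TextureXR.dimension,
            colorFormat: GraphicsFormat.B10G11R11_UFloatPack32,
            useDynamicScale: true, name: "Outline Color"
        );

        // ...and the depth buffer is a separate, depth-only render target.
        outlineDepthBuffer = RTHandles.Alloc(
            Vector2.one, TextureXR.slices, depthBufferBits: DepthBits.Depth32,
            dimension: TextureXR.dimension, useDynamicScale: true, name: "Outline Depth"
        );

        // In Execute: bind color + depth together, then draw with ZTest forced to LEqual.
        CoreUtils.SetRenderTarget(cmd, outlineColorBuffer, outlineDepthBuffer, ClearFlag.All);

        var shaderTags = new ShaderTagId[]
        {
            new ShaderTagId("Forward"),
            new ShaderTagId("ForwardOnly"),
        };

        var desc = new RendererListDesc(shaderTags, cullingResult, hdCamera.camera)
        {
            renderQueueRange = RenderQueueRange.opaque,
            sortingCriteria = SortingCriteria.CommonOpaque,
            layerMask = outlineLayer, // layer containing the meshes to capture
            stateBlock = new RenderStateBlock(RenderStateMask.Depth)
            {
                // Override the Lit shader's default ZTest Equal with LEqual + depth write.
                depthState = new DepthState(true, CompareFunction.LessEqual),
            },
        };

        HDUtils.DrawRendererList(renderContext, cmd, RendererList.Create(desc));

Blending outlineColorBuffer back into the scene with a variable alpha is a separate fullscreen pass and isn’t shown here; also don’t forget to release both RTHandles in Cleanup().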


Can you share a simple working example? I am struggling to handle depth correctly using the new RTHandle system.

@JSmithIR dude it was way too long ago, I was a different man back then :smile: