Hi all, I’d appreciate some help with what I think is a bug where the depth texture is not being cleared.
We are using Unity 2020.1 and have implemented a toon effect with an outline, combining multiple guides and tutorials. I think it looks pretty good (full image attached).
The outline effect is based on Alex Ameye’s tutorial (thanks, Alex!).
The effect uses a Renderer Feature to generate a _CameraDepthNormalsTexture for use in the shader (though I don’t think this directly affects the issue; see below). A custom shader function then samples that texture to detect the edges of the object and gives them a different color, creating an outline effect.
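For context, the sampling logic is roughly like this, a sketch assuming the Renderer Feature has bound _CameraDepthNormalsTexture globally (names like `_DepthThreshold`, `_NormalThreshold`, and `texelOffset` are my own placeholders, not from the tutorial):

```hlsl
// DecodeDepthNormal is Unity's built-in decode for _CameraDepthNormalsTexture
// (depth packed in zw, view-space normal in xy).
float depth, neighborDepth;
float3 normal, neighborNormal;
DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, uv), depth, normal);
DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, uv + texelOffset),
                  neighborDepth, neighborNormal);

// A large depth or normal discontinuity between neighboring samples
// means this pixel sits on an edge, so it gets the outline color.
bool isEdge = abs(depth - neighborDepth) > _DepthThreshold
           || dot(normal, neighborNormal) < _NormalThreshold;
```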
So the issue I’m facing is when rendering an object with this shader to a RenderTexture for UI purposes. The render is also coloring pixels that are not part of any edge, and those pixels seem to come from a completely separate camera (which one appears to randomize each time I open the project). I will try to show this with a quick clip moving the main camera around. The chicken exists elsewhere in the scene and is being rendered to the UI via a RenderTexture. Notice the “outline” of the windmill appearing on the chicken:

So what’s going on here? The chicken’s own outlines are being rendered properly, but outlines from the main camera’s view are showing up on it as well.
Things I’ve tried to resolve this:
- Removing the Renderer Feature (no effect)
- Different texture formats for the RenderTexture
- Messing with both cameras’ settings (the main camera as well as the camera capturing to the RenderTexture)
- The pipeline settings, culling masks, etc.
I’m not skilled enough in graphics programming to figure out whether there’s a point where the depth buffer needs to be cleared but isn’t.
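For reference, this is the kind of manual clear I’ve been experimenting with before the capture camera renders. It’s just a sketch; `uiRenderTexture` is a placeholder for whatever RenderTexture your UI camera targets:

```csharp
// Force-clear the UI RenderTexture's color and depth buffers so nothing
// from a previous camera's render can leak into this frame's capture.
var previous = RenderTexture.active;
RenderTexture.active = uiRenderTexture;
GL.Clear(true, true, Color.clear); // (clearDepth, clearColor, background)
RenderTexture.active = previous;
```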
Thanks everyone in advance for the help! I hope it’s a fun challenge!
