So I’m working on a cel shader with outlines made using the camera depth texture (retrieved via Shader Graph’s Scene Depth node), and I’ve noticed some undesirable behavior at the edges of the object:
These errant pixels seem to be the result of the camera depth texture not reflecting the additional transparent/blended pixels generated at the edges of the object.
What’s even stranger is that with anti-aliasing off, I get flickering in the game view while not playing in the editor, and when I AM playing in the editor, this happens:
It would appear that the depth texture from the scene view is being used instead of the one for the game view.
When I build the project to eliminate this cross-talk, I find that the depth texture lags behind by one frame (an issue visible when the object is in motion), but ONLY when I have anti-aliasing turned off on the URP asset. With anti-aliasing on, the build has the same initial issue as the editor.
I already have a workaround in mind, but it would be nice to understand what’s actually causing this issue and solve it at the root.
Assume any shader that uses the Scene Depth node will not work perfectly with MSAA enabled. That node samples the camera depth texture, which is always rendered without MSAA.
MSAA works by taking multiple (depth) coverage samples per pixel, and if more than one triangle is visible among those coverage samples, the colors of those triangles at that pixel location are blended together. However, since the depth texture doesn’t use MSAA, there will be pixels where the sphere edge was visible to a coverage sample, and thus blended with the background color for anti-aliasing, but not visible in the depth texture.
For something like your outline effect using the depth texture, there’s no way for this to work properly with MSAA.
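To make the mismatch concrete, here’s a tiny toy simulation of a single edge pixel (plain Python, not Unity or shader code, and all the numbers are made up for illustration): the MSAA resolve blends the sphere into the pixel’s color, but the one-sample-per-pixel depth texture never saw the sphere at all, which is exactly the disagreement an outline pass trips over.

```python
# Toy illustration of an MSAA-resolved color buffer disagreeing with a
# single-sample depth texture at an object edge. Values are arbitrary.

SPHERE_COLOR, BG_COLOR = 1.0, 0.0   # greyscale stand-ins for the two surfaces
SPHERE_DEPTH, BG_DEPTH = 0.3, 1.0   # normalized depths (smaller = closer)

def resolved_color(covered_samples, total_samples=4):
    """MSAA resolve: average the colors of all coverage samples."""
    k = covered_samples
    return (k * SPHERE_COLOR + (total_samples - k) * BG_COLOR) / total_samples

def depth_at_pixel(center_covered):
    """Depth texture: one sample at the pixel center, no MSAA."""
    return SPHERE_DEPTH if center_covered else BG_DEPTH

# Edge pixel: 2 of 4 coverage samples hit the sphere, but the sphere
# misses the pixel center, so the depth sample only sees the background.
color = resolved_color(2)       # 0.5 -> visibly blended with the sphere
depth = depth_at_pixel(False)   # 1.0 -> pure background depth

print(color, depth)  # the color says "half sphere", the depth says "no sphere"
```

Those half-blended pixels are the errant fringe: the depth-based outline test classifies them as background while the color buffer already contains sphere color.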
I figured as much, which is why I turned off MSAA, but then I get the other issue: Unity seems to be a frame behind on the depth texture, but ONLY when MSAA is off. I can only assume this is a bug, since the behavior is different in the editor than in a build. In the editor, the depth texture being used for the game view is the one from the scene view, while in a build the depth texture is a frame behind. (I say this because you can clearly see it lag behind the slow motion of an object, leaving bright pixels at the leading edge of the object.)
For now I’ve created a workaround by rendering out my own depth texture using an SRP override renderer on a separate camera, but it seems a shame if there’s a built-in texture that’s either generated anyway or generated faster.
I’ve tested this in both URP 7.3.1 and 10.2.2, and this problem exists in both. I checked the patch notes for 10.3.1 and didn’t see anything about this issue (although there were a few changes relating to MSAA). I’ll test it on the newest version soon and then submit a bug report if the issue still exists.
The project this is for is on Unity 2019.4.12 and URP 7.3.1, and I’m wary of getting the entire team to stop what they’re doing to update their Unity versions. It’s likely I would have needed to generate my own depth texture anyway, since I’ll probably need to do some additional work to maximize depth contrast per model, rather than keeping scene-wide depth.
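For reference, the per-model contrast idea can be sketched as a simple remap (illustrative Python, not shader code; the per-object near/far distances are hypothetical values you’d supply yourself, e.g. from the model’s bounds relative to the camera): stretch the object’s own slice of depth over the full [0, 1] range instead of the scene-wide range.

```python
def remap_depth(d, obj_near, obj_far):
    """Remap a linear eye-space depth value so the object's own depth
    range spans the full [0, 1] interval, maximizing contrast.
    obj_near/obj_far: the object's closest/farthest distance from the
    camera (hypothetical per-model inputs)."""
    t = (d - obj_near) / (obj_far - obj_near)
    return min(max(t, 0.0), 1.0)  # clamp, like saturate() in HLSL

# Scene-wide depth might barely vary across a small model; the remapped
# value uses the whole [0, 1] range for that one object instead.
print(remap_depth(6.0, 5.0, 9.0))  # 0.25
```

The clamp keeps anything outside the object’s own range pinned to 0 or 1, so only the depth variation across the model itself contributes contrast.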
(By the way, thanks again for your help! I’ve run into many of your posts on the forums while learning about Unity shaders and I’ve learned a lot from you.)