Rendering issues with RenderPipelineManager.endCameraRendering

Hi, I’m not sure if this is a bug or not, as I’m not that experienced with rendering, but I stumbled upon an issue that I haven’t been able to resolve for a long time.
I’m trying to render with GL commands in URP. I used this example, which renders correctly on its own, but I wanted to run the rendering code in a callback like Camera.onPostRender. URP ignores that callback, and the closest URP equivalent I found was RenderPipelineManager.endCameraRendering. Unfortunately, for some reason it renders incorrectly in the Game view, although the Scene view is fine: GL primitives are drawn on top of other objects even when they should be behind them. This happens only if certain settings are enabled in the UniversalRenderPipelineAsset:

  • In Quality: HDR and Anti Aliasing (MSAA) are Enabled
  • In ForwardRendererData asset: Renderer Features are used
    or, in the Camera:
  • In Rendering: Post Processing is Enabled
This is how it looks if any of these conditions is met.

Maybe someone knows how this could be fixed? Or is this a bug that I need to report?
Here is the code for reference. It’s basically the example from the GL documentation; I just added an attribute and three methods, and redefined one method from the example:
[ExecuteAlways]
public class ExampleClass : MonoBehaviour {
    /* ... https://docs.unity3d.com/ScriptReference/GL.html */
    void OnEnable() {
        // Camera.onPostRender += OnRendered; // ignored by URP
        RenderPipelineManager.endCameraRendering += OnRendered;
    }
    void OnDisable() {
        // Camera.onPostRender -= OnRendered;
        RenderPipelineManager.endCameraRendering -= OnRendered;
    }
    private void OnRendered(ScriptableRenderContext context, Camera camera) {
        // Forwards to the OnRenderObject() body from
        // https://docs.unity3d.com/ScriptReference/GL.html
        OnRendered(camera);
    }
}
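For context, the one-argument OnRendered(camera) referenced above corresponds to the OnRenderObject() body from Unity’s GL scripting reference. A sketch of what that elided code looks like (adapted from that docs example; the Hidden/Internal-Colored shader and the _SrcBlend/_DstBlend/_Cull/_ZWrite property names come from it, and the vertices here are placeholders):

```csharp
static Material lineMaterial;

static void CreateLineMaterial() {
    if (!lineMaterial) {
        // Unity's built-in shader for drawing simple colored geometry.
        Shader shader = Shader.Find("Hidden/Internal-Colored");
        lineMaterial = new Material(shader) { hideFlags = HideFlags.HideAndDontSave };
        // Alpha blending on, backface culling off, no depth writes.
        lineMaterial.SetInt("_SrcBlend", (int)UnityEngine.Rendering.BlendMode.SrcAlpha);
        lineMaterial.SetInt("_DstBlend", (int)UnityEngine.Rendering.BlendMode.OneMinusSrcAlpha);
        lineMaterial.SetInt("_Cull", (int)UnityEngine.Rendering.CullMode.Off);
        lineMaterial.SetInt("_ZWrite", 0);
    }
}

// The OnRenderObject() body from the GL docs example, renamed.
private void OnRendered(Camera camera) {
    CreateLineMaterial();
    lineMaterial.SetPass(0);

    GL.PushMatrix();
    GL.MultMatrix(transform.localToWorldMatrix);
    GL.Begin(GL.LINES);
    GL.Color(Color.green);
    GL.Vertex3(0f, 0f, 0f); // placeholder line endpoints
    GL.Vertex3(1f, 1f, 0f);
    GL.End();
    GL.PopMatrix();
}
```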

Hey,

By the looks of it, you are not telling Universal RP to render a depth texture, or else it is being cleared after rendering. The reason it works fine in the Scene view is that the Scene view always produces a depth texture for things like Gizmos, grid rendering, etc.

If you’re talking about the Depth Texture setting in the URP asset, I tried enabling it and it had no effect.
What else can I try? Interestingly, the issue happens only when certain settings are enabled in the asset and camera (as I mentioned in my first post).

That makes sense for post-processing, and MSAA could also be interfering, since the Scene view doesn’t do MSAA either. If you want to render something at the end but still want guaranteed access to depth, your best bet is to use the ScriptableRendererFeature/Pass system and queue your pass before post-processing.

You can create one of these via the Create/Rendering/Universal Render Pipeline/Renderer Feature menu in the Project view; it creates a template with the API needed for injecting rendering code into the pipeline. The feature can then be added to the Forward Renderer asset.
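A minimal sketch of such a feature, following the shape of URP’s renderer feature template (the class names GLDrawFeature/GLDrawPass are made up for illustration; the draw calls themselves are left as a comment):

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class GLDrawFeature : ScriptableRendererFeature {
    class GLDrawPass : ScriptableRenderPass {
        public GLDrawPass() {
            // Run before post-processing so the camera depth buffer
            // is still intact when we draw.
            renderPassEvent = RenderPassEvent.BeforeRenderingPostProcessing;
        }

        public override void Execute(ScriptableRenderContext context,
                                     ref RenderingData renderingData) {
            CommandBuffer cmd = CommandBufferPool.Get("GL Draw");
            // Issue draw calls here, e.g. cmd.DrawMesh(...) with a
            // material whose shader uses ZTest LEqual, so the geometry
            // is depth-tested against the scene.
            context.ExecuteCommandBuffer(cmd);
            CommandBufferPool.Release(cmd);
        }
    }

    GLDrawPass m_Pass;

    public override void Create() {
        m_Pass = new GLDrawPass();
    }

    public override void AddRenderPasses(ScriptableRenderer renderer,
                                         ref RenderingData renderingData) {
        renderer.EnqueuePass(m_Pass);
    }
}
```

Once this asset exists in the project, it can be added to the Forward Renderer’s Renderer Features list, and the pass will run for every camera using that renderer.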

I still haven’t had time to learn how to work with CommandBuffers, but another developer noticed that enabling the Opaque Texture setting in the URP asset also causes the same incorrect rendering in the Game view.