Canvas on an overlay camera doesn't render when base camera target is a render texture.

Has anyone else seen this? The canvas is set to “Screen Space - Camera” and renders as expected when the base camera renders to the display. But if you set the base camera to render to a render texture, any canvas on an overlay camera is not rendered. I put an example project up if anyone wants to see what is going on.

https://github.com/johnburkert/RenderTextureTest

I ran into this exact issue. What I had to do was explicitly set Camera.targetTexture on every overlay camera in my stack. To simplify this, I hooked into RenderPipelineManager.beginCameraRendering and made sure my overlay cameras had the render target set. Alternatively, if you are adding the overlay cameras at runtime via the API, you could set the target texture at the moment you add them.
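A minimal sketch of that beginCameraRendering approach, assuming URP is installed; the class name is mine, and the callback simply copies the base camera's target onto every camera in its stack:

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class OverlayTargetFixup : MonoBehaviour
{
    void OnEnable()  => RenderPipelineManager.beginCameraRendering += OnBeginCameraRendering;
    void OnDisable() => RenderPipelineManager.beginCameraRendering -= OnBeginCameraRendering;

    void OnBeginCameraRendering(ScriptableRenderContext context, Camera camera)
    {
        var data = camera.GetUniversalAdditionalCameraData();

        // Only act on base cameras that render into a RenderTexture.
        if (data == null || data.renderType != CameraRenderType.Base || camera.targetTexture == null)
            return;

        // Propagate the base camera's render target to every overlay camera in its stack.
        foreach (var overlay in data.cameraStack)
        {
            if (overlay != null && overlay.targetTexture != camera.targetTexture)
                overlay.targetTexture = camera.targetTexture;
        }
    }
}

Attach this to any active GameObject in the scene; it runs before each camera renders, so overlay cameras pick up the target even if the stack changes at runtime.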


As a follow-up, I have reported this issue via enterprise support; they have reproduced it and marked it for a fix.


Great! Thanks for the info.

Thanks for this info, great thread.

OK, same problem here. So what can you do about it? I can’t really figure out Justin’s answer. Also, in my scene (Unity 2021.1.17f1) the overlay camera doesn’t render the WORLD SPACE canvas. Please help.

The suggested solution didn’t work for me either. It doesn’t matter whether the canvas is screen space or world space; it does not get rendered into the render texture. =Z

Solution found. It would be good if this were documented somewhere.

In URP, Unity’s policy is to render cameras that target a RenderTexture first: “In URP, all Cameras that render to Render Textures perform their render loops before all Cameras that render to the screen.” (policy for screen rendering)

So, if you have an overlay camera and a main camera and want to render both to a RenderTexture, you should follow the same logic:

// Point both cameras at the same RenderTexture...
overlayCamera.targetTexture = renderTexture;
mainCamera.targetTexture = renderTexture;
// ...then trigger their renders in the same order URP uses.
overlayCamera.Render();
mainCamera.Render();

This works.
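For completeness, a hedged sketch of how the pieces above might fit together at startup, assuming URP; the class and field names are placeholders, and the overlay camera is added to the base camera’s stack via the runtime API mentioned earlier:

using UnityEngine;
using UnityEngine.Rendering.Universal;

public class RenderStackToTexture : MonoBehaviour
{
    public Camera mainCamera;      // Base camera
    public Camera overlayCamera;   // Overlay camera carrying the canvas
    public RenderTexture renderTexture;

    void Start()
    {
        // Add the overlay camera to the base camera's stack at runtime.
        var baseData = mainCamera.GetUniversalAdditionalCameraData();
        if (!baseData.cameraStack.Contains(overlayCamera))
            baseData.cameraStack.Add(overlayCamera);

        // Both cameras target the same RenderTexture, per the workaround above.
        overlayCamera.targetTexture = renderTexture;
        mainCamera.targetTexture = renderTexture;
    }
}

Assign the two cameras and the RenderTexture in the Inspector; the overlay camera must have its Render Type set to Overlay for the stack to accept it.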