I’m trying to create a scene where a screen is shown somewhere in the map with a frame in front of it. (Think of it like a security-cam view on a monitor, with some UI text on top.)
I’m using three cameras for this scene. The first is the main camera; the second renders the other part of the map and saves its view to a texture, which I apply via a material to the display monitor in the scene; and the third is an overlay camera attached to a canvas with some text and a frame in it.
When I enable the second camera as the main camera, I can see it correctly renders the other part of the map with the frame in front of it. But the overlay part doesn’t render when I save this view as a texture by checking Output Texture… Basically, the overlay data isn’t being written to the saved texture.
Try setting the Canvas to render in “Screen Space - Camera”. If your Canvas is set to Overlay, it does not get rendered via the camera but through Unity’s internal rendering system straight to the screen buffer, after the camera has finished rendering to the texture.
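In script form, the change looks roughly like this. It’s a sketch, not a drop-in fix: the field names are mine, and I’m assuming the camera that writes your render texture (the URP “Output Texture” field corresponds to `Camera.targetTexture` under the hood) is the one the canvas should point at.

```csharp
using UnityEngine;

// Hypothetical setup component; field names are assumptions.
public class MonitorViewSetup : MonoBehaviour
{
    [SerializeField] Camera monitorCamera;       // the second camera (security-cam view)
    [SerializeField] Canvas overlayCanvas;       // canvas holding the frame and UI text
    [SerializeField] RenderTexture monitorTexture;

    void Awake()
    {
        // Render the canvas through the camera instead of the screen buffer,
        // so the UI ends up inside the camera's output texture.
        overlayCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        overlayCanvas.worldCamera = monitorCamera;
        overlayCanvas.planeDistance = 1f; // keep the UI just in front of the camera

        // Equivalent of checking "Output Texture" in the inspector.
        monitorCamera.targetTexture = monitorTexture;
    }
}
```

You can of course set all of this in the inspector instead of code; the key part is the render mode plus assigning the same camera that has the target texture.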
Getting the same thing.
I have a camera rendering the world, and an overlay camera with a UI canvas (set to “Screen Space - Camera”), and I am rendering the main camera to a render texture which I then use in another UI canvas. (I’m doing this to have careful control over a fixed low resolution, then upscale to the user’s native resolution.) In this configuration the UI overlay is not rendered. I can’t choose the render texture as the output texture of the UI camera while it’s set to overlay mode; I thought that would help, but no dice. When I render the main camera to the screen (no output texture specified), it DOES render the UI overlay camera.
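For context, the fixed-low-res upscaling half of the setup can be sketched like this, assuming a fullscreen `RawImage` on the second canvas displays the render texture (the resolution and component names here are my assumptions, not from the thread):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the fixed-low-res pipeline described above; names are assumptions.
public class LowResUpscaler : MonoBehaviour
{
    [SerializeField] Camera worldCamera;     // renders the game world
    [SerializeField] RawImage upscaleTarget; // fullscreen RawImage on the output canvas

    RenderTexture lowResTexture;

    void Awake()
    {
        // Fixed internal resolution; the UI stretches it to native resolution.
        lowResTexture = new RenderTexture(640, 360, 24);
        lowResTexture.filterMode = FilterMode.Point; // crisp pixel upscale
        worldCamera.targetTexture = lowResTexture;
        upscaleTarget.texture = lowResTexture;
    }

    void OnDestroy()
    {
        // Render textures created at runtime should be released manually.
        if (lowResTexture != null) lowResTexture.Release();
    }
}
```

The problem is that the overlay camera’s UI never makes it into `lowResTexture` in this arrangement.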