I’m making a drawing/coloring app. I use uGUI for the stamp/sticker functionality.
I’m trying to use a RenderTexture to take a screenshot, but the GUI elements don’t show up on the render texture.
The GUI elements are on a Screen Space - Camera canvas which is rendered by my Main Camera. When I take a screenshot, I set my camera’s target texture to a RenderTexture, capture the screenshot, and reset the target texture. Everything shows up in the PNG except the GUI elements.
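This is what my code does, in outline (a minimal sketch of the flow just described; the coroutine wrapper and names like ScreenshotTaker and SaveScreenshot are illustrative, not my exact project code):

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

public class ScreenshotTaker : MonoBehaviour
{
    public Camera mainCamera;

    // Run with StartCoroutine(SaveScreenshot(path)) so the capture
    // happens after the current frame has been drawn.
    public IEnumerator SaveScreenshot(string path)
    {
        yield return new WaitForEndOfFrame();

        // Redirect the camera into a render texture and render once.
        var rt = new RenderTexture(Screen.width, Screen.height, 24);
        mainCamera.targetTexture = rt;
        mainCamera.Render();

        // Read the pixels back into a Texture2D.
        RenderTexture.active = rt;
        var tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        tex.Apply();

        // Reset so the camera renders to the screen again.
        mainCamera.targetTexture = null;
        RenderTexture.active = null;

        // Save the PNG and clean up.
        File.WriteAllBytes(path, tex.EncodeToPNG());
        Destroy(tex);
        rt.Release();
        Destroy(rt);
    }
}
```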
The weird thing is that if I don’t reset the main camera back to rendering to the screen (targetTexture = null), I do get the GUI elements when I save a second time. So I guess the GUI elements are rendered later in the render pipeline. But if I postpone everything from mainCamera.Render() onwards to the OnPostRender() event, they still don’t show up, so the UI is probably rendered even later than that.
A solution I thought of is using two cameras and not switching the render target, but I can’t assign a canvas to more than one camera. Does anybody know another solution for this issue, or is it a Unity bug?
I found a forum post stating that a bug was logged for a similar issue (number 631091), but I can’t find it in the issue tracker.
Hi
Any news on this issue? It seems to me it’s still not fixed in 4.6.0f3. Or maybe there is an easy workaround to render uGUI Canvas content to a texture?
The only workaround I’ve found so far is to set the UI canvas to “World Space” rather than “Screen Space” in order to render the UI elements into the render texture. Perhaps this is helpful to you? Unfortunately, in my case it’s just not enough to solve the issue.
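For reference, the temporary switch can be scripted roughly like this (a sketch only; the uiCanvas reference and the repositioning step are assumptions, and your setup may differ):

```csharp
using UnityEngine;

public class WorldSpaceCaptureWorkaround : MonoBehaviour
{
    public Canvas uiCanvas;    // the canvas that is normally Screen Space - Camera
    public Camera mainCamera;

    // Temporarily switch the canvas to World Space for the capture,
    // then restore its original render mode.
    public void CaptureWithWorldSpaceCanvas(RenderTexture target)
    {
        RenderMode previousMode = uiCanvas.renderMode;
        uiCanvas.renderMode = RenderMode.WorldSpace;
        // NOTE: in World Space the canvas keeps whatever transform it has;
        // in practice you may need to position and scale it so it fills
        // the camera's view before rendering.

        mainCamera.targetTexture = target;
        mainCamera.Render();
        mainCamera.targetTexture = null;

        uiCanvas.renderMode = previousMode;
    }
}
```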
It won’t even make it into the next patch release. My fix got changed when moving to a new branch, which re-broke it. I want to talk to the guy who changed it to figure out why before I make any other changes.
Yeah, a whole bunch of internal things happened that caused a lot of UI fixes to not make it into a release quickly. I personally merged in a bunch this week for 4.6.2p1. It should include the fix for this issue.