Render UI to texture

Is there a way I can render GUI elements like GUI.Button or GUI.Box to a texture? I can do it with GUIText, but it doesn't work with anything drawn in OnGUI()...

Oh well, I’m pretty sure it can’t be done with the current version (2.6). I’ll have to use GUITexture and GUIText.

Hi, welcome to the forum!

I'd like to be proven wrong, but I don't think there is any way of doing this, unfortunately.

Sadly (or happily), the OnGUI GUI is a post-post-render effect; it's the very last thing drawn in the scene.

As such it doesn't exist for anything else in the scene, rendering-wise, and you can't use it on other textures.

The on-screen GUI is rendered last, that is true.
But graphics libraries allow rendering to a texture, which can be a dynamically allocated object stored in memory; the very next frame that texture is still accessible and holds whatever was rendered to it the previous frame.
Besides, you could introduce an architecture where the whole GUI is rendered to a texture whenever a change occurs, and those rendered GUI textures are then displayed either on-screen or on in-scene faces.
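As a minimal sketch of that render-on-change idea (assuming a dedicated camera that sees only the GUI objects; all names here are illustrative, not from this thread):

```csharp
// Sketch: re-render a dedicated GUI camera into a RenderTexture only when
// the GUI state changes, then reuse the cached texture on in-scene surfaces.
using UnityEngine;

public class CachedGUITexture : MonoBehaviour
{
    public Camera guiCamera;        // renders only the GUI objects
    public Material screenMaterial; // material on the in-scene "screen" face
    public bool guiDirty = true;    // set this whenever the GUI changes

    RenderTexture guiTexture;

    void Start()
    {
        guiTexture = new RenderTexture(512, 512, 16);
        guiCamera.targetTexture = guiTexture;  // camera draws into the texture
        guiCamera.enabled = false;             // we render it manually, on demand
        screenMaterial.mainTexture = guiTexture;
    }

    void Update()
    {
        if (guiDirty)
        {
            guiCamera.Render(); // one-off render; the texture keeps its contents
            guiDirty = false;
        }
    }
}
```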
Remember those Doom3 and Quake4 consoles with interactive screens all around the scenes?
Another example of in-scene GUI here:

I'm curious how you'd actually do this :)

GUIText is a regular 3D object in the world.

OnGUI isn't such an object and doesn't offer any of the related callbacks.
It's executed on its own after the scene is rendered and finalized (including post FX), and happens past that point. It does not appear in any render texture.

This case doesn't care in the least what graphics APIs in general do and allow. You can't do it in Unity, and that's what matters and what all this is about.

Point taken :) new post created in 'Scripting'

Actually, you can render UnityGUI to a texture.

You need to set RenderTexture.active to the texture you want to write to. Here’s the documentation:

Though by this time you have probably implemented your thing with GUIText. This solution was not trivial to find. I don’t see what good reason there is for not rendering Unity GUI into camera textures.
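For later readers: the trick presumably looks something like the sketch below, setting RenderTexture.active inside OnGUI. This is an illustration, not the poster's exact code, and it hasn't been verified against Unity 2.6:

```csharp
// Sketch: redirect UnityGUI drawing into a RenderTexture by swapping
// RenderTexture.active inside OnGUI, then restoring it afterwards.
using UnityEngine;

public class GUIToTexture : MonoBehaviour
{
    public RenderTexture target; // assign a RenderTexture in the Inspector

    void OnGUI()
    {
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = target;      // GUI calls now write here
        GL.Clear(true, true, Color.clear);  // wipe last frame's contents

        GUI.Box(new Rect(10, 10, 200, 60), "Drawn into a texture");
        GUI.Button(new Rect(20, 40, 120, 24), "In-texture button");

        RenderTexture.active = previous;    // restore normal GUI rendering
    }
}
```

Assign the same RenderTexture as the main texture of a material on any in-scene mesh to display the result; note the buttons won't respond to clicks on that mesh without extra work.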


facundo, that's a great trick, thank you!
For me the use is to render some interactive elements into a texture and apply it to a floating box.


Sorry to reactivate this thread, but how exactly do I use this in a script?

Yeah, if someone could take the time to explain this a little better it’d be good, I was just about to post a thread asking pretty much the same question as the OP.

You can set up a camera away from the main scene (pointing at a background image on a plane, say) with its GUI layer enabled, while other cameras have the layer disabled. Then, GUIText and GUITexture objects will render on that camera but not on the others. If you set the GUI camera to render to a texture, you can put the rendered GUI onto an object in the scene. Naturally, the GUI elements in the texture don’t respond to user interaction unless you code this behaviour yourself. This technique isn’t generally very useful (it’s usually easier just to draw the desired GUI image in Photoshop) but it is handy for generating text that is gradually “typed” into a texture.
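A rough script version of that setup (hypothetical names; the cameras and their GUI layers can just as well be configured in the editor):

```csharp
// Sketch: a second camera, positioned away from the main scene, with its
// GUILayer enabled, renders GUIText/GUITexture objects into a RenderTexture
// that is then shown on an in-scene object.
using UnityEngine;

public class GUICameraSetup : MonoBehaviour
{
    public Camera mainCamera;
    public Camera guiCamera;     // positioned far from the scene content
    public Renderer screenQuad;  // the in-scene object that shows the GUI

    void Start()
    {
        // Only the GUI camera draws GUIText/GUITexture elements.
        mainCamera.GetComponent<GUILayer>().enabled = false;
        guiCamera.GetComponent<GUILayer>().enabled = true;

        var rt = new RenderTexture(256, 256, 16);
        guiCamera.targetTexture = rt;          // GUI camera renders to texture
        screenQuad.material.mainTexture = rt;  // show it on the object
    }
}
```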

Okay, I was looking at creating some Doom 3 style computer screens throughout a scene that would act as switches for various things (open/unlock doors, elevators, etcetera). I suppose this sort of thing could even be compared to the Pip-Boy 3000 in Fallout: rendering lists and such onto a surface in 3D space. As a starting point, how does one render the elements generated in ‘OnGUI’ to a single camera or layer so that they aren’t rendered by the other cameras present in the scene? Other ideas?

RoxSilverFox: EZGui makes this easy to do, from what I saw.

That’s all well and good, but I don’t have $199 to be throwing around at the moment.