Support for GUITexture rendering from second camera

I have created a small Unity App using Unity iPhone, this has a main camera, with a GUITexture element.

I am rendering the output of a second camera to the GUITexture via a render texture.

This all works fine when played in the IDE, but when I build and deploy to the iPhone the GUITexture renders in its background colour and no camera image is displayed.

Is this supported in Unity iPhone? If not, are there any other techniques I can use on the iPhone to render a second camera onto a flat surface that is then viewed in the main camera?

Any thoughts appreciated.

Thanks

Andy

iPhone hardware doesn’t do render-to-texture.

–Eric

pBuffer is supported on the iPhone, so render to texture actually is supported at the hardware level.
It just seems that render targets as used by Unity are not.

Guys

Thanks for the feedback, I thought that might be the reason.

Do you know of any other techniques that could be used to provide a camera based HUD on iPhone?

Thanks

Andy

There is no way to render it somewhere else, and for performance reasons you wouldn't want to attempt it anyway, as a GUI done this way would already be eating a fairly large chunk of your performance budget.

The simplest options are clearly to use GUI.xx as it is, or to build a GUI out of geometry placed in front of the camera.
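One render-texture-free alternative for a picture-in-picture style HUD is to let the second camera draw directly into a small viewport rectangle layered over the main camera's output, using the camera's `depth` and `rect` properties. This is only a hedged sketch (the rect values and component setup are assumptions, not something from the thread), but the technique itself uses standard Unity camera settings:

```csharp
using UnityEngine;

// Attach to the second camera. Instead of rendering into a texture that a
// GUITexture displays, the camera draws straight into a corner of the screen.
public class HUDCamera : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();

        cam.depth = 1;                           // render after the main camera (depth 0)
        cam.clearFlags = CameraClearFlags.Depth; // don't wipe the main camera's image

        // Normalized viewport rect: an inset panel in the top-right corner.
        // (Example values only -- tune position/size to taste.)
        cam.rect = new Rect(0.65f, 0.65f, 0.3f, 0.3f);
    }
}
```

The trade-off is that the HUD is a plain rectangle on screen rather than a texture you can map onto arbitrary geometry, but it avoids render targets entirely, so it should work on the iPhone hardware described above.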