Simple question: How to render camera to offscreen buffer then to screen?

I’ve tried all sorts of approaches and still can’t get this to work. All I want to do is render my camera to a render texture and then, in OnRenderImage() or after the scene is rendered, blit that render texture to the screen. I can see the camera is rendering to the render texture by inspecting it, but I can never get it to copy to the screen.

This is a snippet of my code for a script attached to the camera:

void OnEnable()
{
    m_camera = GetComponent<Camera>();

    // (Re)create the render texture if it is missing or no longer matches the camera size.
    if (m_renderTexture == null || m_renderTexture.width != m_camera.pixelWidth || m_renderTexture.height != m_camera.pixelHeight)
    {
        if (m_renderTexture != null)
        {
            m_renderTexture.Release();
        }

        m_renderTexture = new RenderTexture(m_camera.pixelWidth, m_camera.pixelHeight, 24, RenderTextureFormat.ARGB32);
    }

    // Redirect this camera's output to the offscreen render texture.
    m_camera.targetTexture = m_renderTexture;
}

void OnRenderImage(RenderTexture src, RenderTexture dest)
{
    // Attempt to copy the offscreen texture to the screen -- this is the part that never shows up.
    Graphics.Blit(m_renderTexture, (RenderTexture)null);
}

OK, so I found this forum post which helped me solve this problem:

They also have a nice demo package which shows how they did their postprocess on iOS, rendering offscreen to avoid the GrabPixels() performance hit. It doesn’t seem to be documented anywhere, AFAIK, but it appears you can’t set a render texture on the Main Camera and then blit to the screen in OnRenderImage() on that same camera. So the demo uses two cameras: the Main Camera renders to the offscreen render texture, and the UI camera blits that offscreen texture to the screen.

http://www.photonworkshop.com/index.php/blog/dof-bloom-package/
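For anyone else hitting this, here is a minimal sketch of that two-camera idea (my own script and field names, not the demo package’s actual code): the main camera renders into an offscreen texture, and a second camera blits that texture to the screen in its own OnRenderImage().

    using UnityEngine;

    // Attach to the Main Camera: redirect its output to an offscreen render texture.
    public class OffscreenMainCamera : MonoBehaviour
    {
        public RenderTexture OffscreenTexture { get; private set; }

        void OnEnable()
        {
            Camera cam = GetComponent<Camera>();
            OffscreenTexture = new RenderTexture(cam.pixelWidth, cam.pixelHeight, 24, RenderTextureFormat.ARGB32);
            cam.targetTexture = OffscreenTexture;   // main camera now renders offscreen
        }
    }

    // Attach to the second (e.g. UI) camera, which still renders to the screen.
    public class BlitToScreen : MonoBehaviour
    {
        public OffscreenMainCamera mainCamera;  // assign the Main Camera's script in the Inspector

        void OnRenderImage(RenderTexture src, RenderTexture dest)
        {
            // Copy the main camera's offscreen texture into this camera's destination (the screen).
            Graphics.Blit(mainCamera.OffscreenTexture, dest);
        }
    }

If the second camera exists only to do the blit, I believe you can set its culling mask to Nothing so it renders no scene geometry of its own, but that part is just my assumption and not something the demo spells out.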