Alright, so the basic problem I'm having is that when I switch my camera to deferred rendering, the render targets (a screen-sized render texture with the format RenderTextureFormat.Depth, and a separate screen-sized render target with the format RenderTextureFormat.Default, though I've also tried ARGB32) only show the first frame rendered. When I pause the game in the editor and inspect the camera view, it is correct, and when the camera is in forward rendering mode, the render texture shows what the camera should be rendering, as expected. The only change I'm making is setting the rendering path to deferred. I am using Unity 5.5.0f3 Personal edition.
I set up the camera and render textures as such:
StaticCamera.renderingPath = RenderingPath.DeferredShading;
StaticColorTexture = new RenderTexture (Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
StaticDepthTexture = new RenderTexture (Screen.width, Screen.height, 24, RenderTextureFormat.Depth);
StaticCamera.depthTextureMode = DepthTextureMode.Depth;
StaticCamera.SetTargetBuffers (StaticColorTexture.colorBuffer, StaticDepthTexture.depthBuffer);
It might be important to note that I'm calling Camera.Render manually with the camera component disabled, since I don't intend for this camera to render automatically. Am I using SetTargetBuffers wrong? Am I missing something? Other people seem to have had problems with SetTargetBuffers in the past, but I haven't found anything that relates specifically to deferred rendering.
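For context, here is a minimal sketch of how the whole thing is wired together (the script name and the LateUpdate placement are just for illustration, not my actual code). I've also seen it suggested that SetTargetBuffers may need to be re-applied before each manual Render call, so the sketch does that; I don't know whether that's actually required:

```csharp
// Hypothetical driver script; only the Unity API calls are real,
// the surrounding structure is illustrative.
using UnityEngine;

public class StaticCameraDriver : MonoBehaviour
{
    public Camera StaticCamera;
    RenderTexture StaticColorTexture;
    RenderTexture StaticDepthTexture;

    void Start ()
    {
        StaticCamera.enabled = false; // rendered manually below, not automatically
        StaticCamera.renderingPath = RenderingPath.DeferredShading;
        StaticCamera.depthTextureMode = DepthTextureMode.Depth;

        StaticColorTexture = new RenderTexture (Screen.width, Screen.height, 0, RenderTextureFormat.ARGB32);
        StaticDepthTexture = new RenderTexture (Screen.width, Screen.height, 24, RenderTextureFormat.Depth);
        StaticColorTexture.Create ();
        StaticDepthTexture.Create ();
    }

    void LateUpdate ()
    {
        // Re-binding every frame is a workaround I've seen suggested for
        // SetTargetBuffers losing its bindings; it may or may not be needed.
        StaticCamera.SetTargetBuffers (StaticColorTexture.colorBuffer, StaticDepthTexture.depthBuffer);
        StaticCamera.Render ();
    }
}
```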