Hey guys, I’m pulling my hair out over this problem. I had some legacy code running fine on Quest 2 + OpenGL + the Built-in Render Pipeline, but I'm getting strange artifacts when I try to ReadPixels and save an image.
Unity 2021.3 LTS, URP 12.1.7
Basically, there's a secondary camera whose output texture is set to a RenderTexture (settings below). I also have a RawImage in the scene to show me what the camera is rendering; it's attached to the camera's output texture.
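For context, the setup is wired up roughly like this (a simplified sketch; _camera, _rawImage and rt are placeholder names, not my actual code):

using UnityEngine;
using UnityEngine.UI;

public class CaptureSetup : MonoBehaviour
{
    [SerializeField] private Camera _camera;     //the secondary camera
    [SerializeField] private RawImage _rawImage; //the debug view in the scene
    [SerializeField] private RenderTexture rt;   //the RenderTexture with the settings below

    private void Awake()
    {
        _camera.targetTexture = rt; //the camera renders into the RT
        _rawImage.texture = rt;     //the RawImage displays whatever the camera rendered
    }
}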
Before I explain further, here's what the artifacts look like:
I can’t share the actual screenshot due to workplace privacy restrictions, but this is what the problem looks like: the section labelled “A” above changes in size each time I capture a frame, but it is always a black rectangle. Section B is the rest of the image, shown normally.
Some things I’ve tried that didn’t work:
- Retrieving the swapchain late
- Changing the swapchain buffer count to 2
- AsyncGPUReadback (sketch below)
- Changing the RenderTexture format
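For reference, the AsyncGPUReadback attempt looked roughly like this (a sketch from memory; the callback code is illustrative), and it produced the same black rectangle:

using System.IO;
using UnityEngine;
using UnityEngine.Rendering;

//Ask the GPU for the RenderTexture contents asynchronously instead of using ReadPixels
AsyncGPUReadback.Request(rt, 0, TextureFormat.RGBA32, request =>
{
    if (request.hasError)
    {
        Debug.LogError("AsyncGPUReadback failed");
        return;
    }
    //Copy the returned bytes into a Texture2D and encode it to JPG
    var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
    tex.LoadRawTextureData(request.GetData<byte>());
    tex.Apply();
    File.WriteAllBytes(_path, tex.EncodeToJPG());
});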
The code essentially looks like this:
//rt is the RenderTexture the secondary camera renders into
var texture = new Texture2D(rt.width, rt.height, rt.graphicsFormat, TextureCreationFlags.None);
//This is a new RT created for debug purposes on the GPU side
var screenshot = new RenderTexture(rt.descriptor);
//Read the camera's output back to the CPU
RenderTexture.active = rt;
texture.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
texture.Apply(); //The docs example calls Apply() here, so I've kept it
RenderTexture.active = null;
//Save the frame we just read onto the debug RT; the camera's output is shown on a RawImage
Graphics.CopyTexture(rt, screenshot);
_camera.targetTexture = screenshot;
//Encode the CPU-side copy and write it to disk
var bytes = texture.EncodeToJPG();
File.WriteAllBytes(_path, bytes);
The frame is shown correctly on the debug target texture, which leads me to believe the problem lies in reading the pixels back from the GPU. Any insight or fixes for this?