Hi,
I’m capturing my screen using AsyncGPUReadback.Request from the OnRenderImage callback, as suggested in various pieces of documentation I have seen. The capture works, but the image comes out much darker than what I expect (and what I see on the display).
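For context, the capture setup looks roughly like this (simplified; in my real code the request is wrapped in an object with a `request` field, which is why you’ll see `req.request` below):

```csharp
using System;
using UnityEngine;
using UnityEngine.Rendering;

public class CaptureScreen : MonoBehaviour
{
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Ask the GPU to copy the camera output back as RGBA32 bytes.
        AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32, OnReadbackComplete);
        Graphics.Blit(source, destination); // keep the normal on-screen image
    }

    void OnReadbackComplete(AsyncGPUReadbackRequest req)
    {
        if (req.hasError)
        {
            Debug.LogError("GPU readback failed");
            return;
        }
        byte[] buffer = req.GetData<byte>(0).ToArray();
        // ... buffer is then handed to my PNG encoder, as below ...
    }
}
```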
I am reading the buffer as a raw byte array rather than using Unity types, like this:
outputBundle.image = req.request.GetData<Byte>(0).ToArray();
and using the bytes directly to set colors in my PNG encoder, like this:
pngUtility.pixels[i].red = buffer[i * 4];
pngUtility.pixels[i].green = buffer[(i * 4) + 1];
pngUtility.pixels[i].blue = buffer[(i * 4) + 2];
pngUtility.pixels[i].alpha = buffer[(i * 4) + 3];
where i is the pixel index.
Is there some other conversion I should be doing?
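One guess: my project uses Linear color space, so maybe the readback is giving me linear values and I need to apply the sRGB transfer function myself before handing the bytes to the encoder? Something like this sketch (untested, and assuming that’s actually the cause):

```csharp
using UnityEngine;

static class GammaUtil
{
    // Standard linear -> sRGB transfer function, applied per channel.
    // Assumption: the readback buffer holds linear-light values.
    public static byte LinearToSrgb(byte linear)
    {
        float c = linear / 255f;
        c = c <= 0.0031308f
            ? c * 12.92f
            : 1.055f * Mathf.Pow(c, 1f / 2.4f) - 0.055f;
        return (byte)Mathf.Round(Mathf.Clamp01(c) * 255f);
    }
}
```

I would presumably apply this to the red, green, and blue bytes but leave alpha alone. Is that the right approach, or is there a readback format that does the conversion for me?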