Advice needed: Screen capture from OnRenderImage

Hi,
I’m capturing my screen using AsyncGPUReadback.Request from the OnRenderImage callback, as suggested in various bits of documentation I have seen. The capture works, but the image comes out much darker than what I see on the display.
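
For context, the capture is queued roughly like this (a trimmed sketch; my real code routes the result through an outputBundle structure, which I’ve left out):

using UnityEngine;
using UnityEngine.Rendering;

public class AsyncCapture : MonoBehaviour
{
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Queue an asynchronous readback of the camera's render target.
        AsyncGPUReadback.Request(source, 0, TextureFormat.RGBA32, OnComplete);

        // Pass the image through so the screen still renders normally.
        Graphics.Blit(source, destination);
    }

    void OnComplete(AsyncGPUReadbackRequest req)
    {
        if (req.hasError) return;
        byte[] bytes = req.GetData<byte>(0).ToArray();
        // ... bytes then go to the PNG encoder, as below ...
    }
}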
I am reading the buffer as a byte array rather than using Unity types, like this:

outputBundle.image = req.request.GetData<Byte>(0).ToArray();

and using the bytes directly to set colors in my PNG encoder, like this:

pngUtility.pixels[i].red = buffer[i * 4];
pngUtility.pixels[i].green = buffer[(i * 4) + 1];
pngUtility.pixels[i].blue = buffer[(i * 4) + 2];
pngUtility.pixels[i].alpha = buffer[(i * 4) + 3];

where i is the pixel index.

Is there some other conversion I should be doing?

Update:
I get the same result if I use this instead:

var buffer = req.request.GetData<Color32>();
var tex = new Texture2D(outputBundle.width, outputBundle.height, TextureFormat.RGBA32, false);
tex.SetPixels32(buffer.ToArray());
tex.Apply();
File.WriteAllBytes(outputBundle.destinationFile, ImageConversion.EncodeToPNG(tex));

So yes, it turns out there was indeed another conversion I needed to be doing. I have my camera set to allow HDR, which affects the format of the pixels in the render target (HDR cameras typically render to a half-float format holding linear values, which would explain the dark output). I don’t know what the precise format is in my case, but I solved it by blitting to a RenderTexture with RenderTextureFormat.ARGB32, which forces the correct conversion back to 8-bit.
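
In case it helps anyone else, here is the same OnRenderImage with the extra blit added (a sketch; the hard-coded output path stands in for my outputBundle.destinationFile, and a real version would capture on demand rather than every frame):

using System.IO;
using UnityEngine;
using UnityEngine.Rendering;

public class AsyncCaptureFixed : MonoBehaviour
{
    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // HDR cameras render to a float format; blitting into an 8-bit
        // ARGB32 target forces the conversion before the readback.
        var ldr = RenderTexture.GetTemporary(
            source.width, source.height, 0, RenderTextureFormat.ARGB32);
        Graphics.Blit(source, ldr);

        AsyncGPUReadback.Request(ldr, 0, TextureFormat.RGBA32, req =>
        {
            if (!req.hasError)
            {
                var tex = new Texture2D(ldr.width, ldr.height,
                    TextureFormat.RGBA32, false);
                tex.SetPixels32(req.GetData<Color32>().ToArray());
                tex.Apply();
                File.WriteAllBytes("capture.png", ImageConversion.EncodeToPNG(tex));
                Destroy(tex);
            }
            // Release only after the readback has delivered the data.
            RenderTexture.ReleaseTemporary(ldr);
        });

        Graphics.Blit(source, destination); // keep the on-screen image intact
    }
}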