How to get ARGBfloat render texture data without colorspace interfering?

I am really struggling to get correct raw data values from an ARGBfloat render texture.

I am using the render texture to output ‘data’ values which must not be changed in any way by colorspace conversion.

I am setting the camera clear color to a specific value e.g. mid grey.

I am using Texture2D.ReadPixels to grab the render texture into a texture (RGBAFloat texture).

Then I am using either Texture2D.GetRawTextureData or tex.GetPixels to get the data into a Color array.
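For reference, the readback path I'm describing looks roughly like this (a minimal sketch; `rt` is assumed to be the existing ARGBFloat render texture the camera renders into):

```csharp
// Make the render texture the active source for ReadPixels.
RenderTexture.active = rt;

// Float Texture2D; linear = true so no sRGB flag is set on the texture.
var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBAFloat, false, true);

// Copy the GPU pixels into the Texture2D...
tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
tex.Apply();
RenderTexture.active = null;

// ...then pull them onto the CPU as a Color array.
Color[] data = tex.GetPixels();
```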

But there is colorspace conversion happening somewhere. I can’t tell whether it’s converting the camera clear color, whether ReadPixels is converting it, or whether GetPixels is.

The value I write is e.g. 0.5, and the value I want back is 0.5, but what I get is something like 0.214, depending on the color channel.
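That 0.214 is exactly what an sRGB-to-linear conversion would produce: 0.5 interpreted as an sRGB value decodes to roughly 0.214 linear, so the number itself points at where the value is going. A standalone sketch of the standard sRGB decode formula (not Unity-specific):

```csharp
// Standard sRGB -> linear decode, per channel.
static float SrgbToLinear(float c)
{
    return c <= 0.04045f
        ? c / 12.92f
        : (float)System.Math.Pow((c + 0.055) / 1.055, 2.4);
}

// SrgbToLinear(0.5f) comes out at about 0.2140 — the mystery value.
```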

When I set the project to Gamma color space in the project settings, I get the exact 0.5 values back.

The render texture is set to RenderTextureReadWrite.Linear and the Texture2D it reads into is also created with linear = true. But changing these has zero effect (the docs confirm this for float textures).

What exactly is the method for getting correct, unchanged float data out of the render texture into a CPU Color[] array, without it being modified and regardless of the project’s color space setting?

How can I make sure a shader, camera, or whatever will write exactly 0.5 into the render texture, and how can I then get that value back out on the CPU? I don’t want any colorspace conversion happening anywhere, especially not when writing values to the render texture, because I need accurate additive blending. I’ve half a mind to trash the whole thing and just do it all on the CPU.

When I inspect the raw float from GetRawTextureData, it has already been color-space converted. That at least rules out GetPixels as the source of the conversion, and suggests the texture really does contain that value. But the Unity docs mention “hardware” colorspace conversion relating to texture access, so I have no idea if it’s being converted on the fly.

I could apply a conversion to ‘undo’ the change after reading the data back, but that seems hacky.
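If the 'undo' route turns out to be necessary, Unity does expose the exact conversion, so at least it wouldn't have to be hand-rolled (still hacky, but precise; values here are illustrative):

```csharp
// Per channel: Mathf.LinearToGammaSpace / Mathf.GammaToLinearSpace.
float restored = Mathf.LinearToGammaSpace(0.214f); // back to ~0.5

// Or per color, via the Color.gamma / Color.linear properties.
Color read = new Color(0.214f, 0.214f, 0.214f, 1f);
Color restoredColor = read.gamma;
```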

@imaginaryhuman_1 I was just doing something with RenderTextures yesterday and needed to write values to a RenderTexture and then copy to a Texture2D. I didn’t notice color changes so I thought I’d go check it again now. My project was in Linear mode (using Unity 2018.4 for this.)

I write values in a shader, use Graphics.Blit in OnRenderImage to write into a RenderTexture, and then copy the data to a Texture2D using ReadPixels. I did all this in code, no editor-made assets, in case it matters. The RenderTexture format is ARGBFloat and the Texture2D format is RGBAFloat.
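The rough shape of that setup, in case it helps to compare (a sketch with illustrative names, not my exact code):

```csharp
using UnityEngine;

public class CaptureFloat : MonoBehaviour
{
    public Material valueMaterial;   // material whose shader writes the test value
    RenderTexture rt;

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        if (rt == null)
            rt = new RenderTexture(src.width, src.height, 0,
                                   RenderTextureFormat.ARGBFloat,
                                   RenderTextureReadWrite.Linear);

        Graphics.Blit(src, rt, valueMaterial); // shader output -> float RT
        Graphics.Blit(src, dest);              // pass the frame through
    }
}
```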

Just to test this, I set the shader to return 0.5 and checked what came out. After ReadPixels and Texture2D.Apply to store it, I used GetPixel on the Texture2D and the value came back unchanged: RGBA(0.500, 0.500, 0.500, 0.500) in the Debug print.

A thought that might not be useful: if you can’t stop your colors changing, couldn’t you at least apply that ‘undo’? You could read the UnityEditor.PlayerSettings.colorSpace value in the editor, and if you need it at runtime, you could store that value somewhere. Not sure if it’s saved for runtime in some other spot…

Anyway, I’m not sure if this helps at all, and I’m also not sure how the camera affects this issue…

Thanks. At first I thought what you were saying wasn’t useful or didn’t make sense: how could you be getting the right numbers out when you seemed to be doing the same thing as me?

Well, it did spur me to try something… I made a custom shader that outputs a very specific float value.

It turns out that, with a shader writing the value, it DOES in fact output the correct value, it DOES transfer that correct value to the other texture, and it DOES send that correct value to the CPU.
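For anyone else hitting this, a minimal version of such a test shader might look like the following (a sketch with made-up names, written for the old Blit pipeline). Because the fragment output goes straight into a float render target, no sRGB encode is applied to it:

```shader
Shader "Hidden/ExactValueTest"
{
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            // Write an exact float value; with an ARGBFloat target
            // this lands in the texture unmodified.
            float4 frag(v2f_img i) : SV_Target
            {
                return float4(0.5, 0.5, 0.5, 0.5);
            }
            ENDCG
        }
    }
}
```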

AND, bonus of all bonuses, nothing changes at all when the project colorspace is switched!

What I see happening is that any kind of “color” field in a shader, or a default color like the camera clear color, is where the colorspace conversion happens. I presume Unity does this to keep sRGB-authored colors looking consistent. The camera clear color gets converted to a different value, and that’s why I was seeing different values come out than I was expecting.

Fortunately, if you output a specific value from a shader into the linear-space float render texture, it does in fact retain the precise values you output, and they transfer to the CPU intact as well.
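Following that logic, if the camera clear still needs to land at an exact value, one possible workaround (an assumption based on the behaviour described above, not something I've verified across Unity versions) is to pre-encode the clear color so that Unity's sRGB-to-linear conversion brings it back to the value you actually want:

```csharp
// We want exactly 0.5 in the float render texture after the clear.
// In Linear color space Unity appears to treat backgroundColor as sRGB,
// so hand it the gamma-encoded version and let the conversion undo it.
Color wanted = new Color(0.5f, 0.5f, 0.5f, 0.5f);
cam.backgroundColor = wanted.gamma; // cam is your Camera reference
```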

So thanks for the info it turned out to be a useful clue.

It’s annoying how longwinded and complicated and hard to figure out these things are sometimes.
