Gradients don't work as expected in Linear space

Hi all,
So I am working on a project in Linear space - URP - 2020.1.14.
I have some complex compute shaders and noticed some weird behaviour, so I started doing some tests and got results I wasn't expecting. I'm wondering if anyone could shed some light.

The top one is using a texture
Middle is UVs, U component mapped left to right and converted to linear space.
Bottom is UVs, U component mapped left to right.

The second column is the same but they are each multiplied by 2.

Now the strange thing here is that the top two appear correct on the left, yet when multiplied the white point is at about 70% of the way across the quad, where you'd expect it to be in the centre (i.e. 0.5 * 2 = 1).
Now look at the bottom: the gradient appears incorrect, yet the white point is where you'd expect, in the centre.

Am I missing something here?

Any help would be much appreciated.

Close, it’s ≅73.54% of the way across. Because that’s where the gradient value equals 0.5 for a linear sRGB gradient converted to linear space.

Let's look at the UV-converted-to-linear case. When you "convert to linear space" you're more specifically saying "convert this value from sRGB gamma space to linear space". A value that was 0.5 in the original UV will end up being ≅0.214, and the value that was 0.7354 will end up being ≅0.5. The same thing is happening for an sRGB texture that's sampled by the GPU while rendering to a linear space target; the conversion from sRGB gamma space to linear space is just being done by the GPU's hardware rather than manually in the shader.
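For reference, here's the arithmetic in Python rather than shader code (the function names are mine, but the formulas are the standard IEC 61966-2-1 sRGB transfer functions), reproducing the numbers above:

```python
def srgb_to_linear(c):
    """Convert an sRGB gamma-space value to linear space."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    """Convert a linear-space value back to sRGB gamma space."""
    if c <= 0.0031308:
        return c * 12.92
    return 1.055 * c ** (1.0 / 2.4) - 0.055

print(srgb_to_linear(0.5))     # ~0.214
print(srgb_to_linear(0.7354))  # ~0.5
print(linear_to_srgb(0.5))     # ~0.7354, i.e. the ~73.54% mark on the quad
```

So the white point of the doubled gradient sits where the *linear* value reaches 0.5, which is 73.54% of the way across in the original sRGB ramp.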

If you want the sRGB 0.5 to be at the same spot as the 1.0 after a multiply by 2.0, you need to multiply the original sRGB gamma space value and convert to linear afterwards, though be aware an sRGB gamma space value of 2.0 converted to linear space is ≅4.95. Or you can approximate it with the linear color value and multiply it by ≅4.673, since the "mid point" is that ≅0.214 value. Unity actually has a built in value you can use for this too, `unity_ColorSpaceDouble`, which has an RGB value of 4.59479380 instead of 4.673, for reasons I'm not entirely sure of. It might be to better match the sRGB gamma to linear approximation used by the `GammaToLinearSpace` function.
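A quick numeric sketch of the order-of-operations point, again in Python with the standard sRGB curve (not Unity's approximation):

```python
def srgb_to_linear(c):
    """Standard sRGB gamma-space to linear-space conversion."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

mid = 0.5  # sRGB gamma-space mid point

# Convert first, then multiply: only reaches ~0.428 at the quad's centre,
# so the white point slides out to ~73.54%.
convert_then_multiply = srgb_to_linear(mid) * 2.0

# Multiply first, then convert: sRGB 1.0 converts to linear 1.0,
# so the white point lands in the centre as expected.
multiply_then_convert = srgb_to_linear(mid * 2.0)

print(srgb_to_linear(2.0))          # ~4.95, sRGB 2.0 in linear space
print(srgb_to_linear(0.5) * 4.673)  # ~1.0, the linear mid-point approximation
print(2.0 ** 2.2)                   # ~4.5948, same as unity_ColorSpaceDouble,
                                    # which suggests a pure 2.2 power curve
```

Note the `2.0 ** 2.2` line is just an observation: 4.59479380 happens to be 2.0 run through a pure 2.2 gamma curve, which may or may not be why Unity picked it.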

Brilliant, thank you bgolus, appreciate the explanation. I wasn't sure when the conversion was happening or what colour space conversions are implicit on the GPU side.
I’ll give this a go and see how it turns out.
Cheers

OK, so something I have discovered is that the alpha channel is treated differently in the shader.
The alpha doesn't seem to be converted natively from gamma into linear, is this correct?

It is called sRGB, and not sRGBA.

But yes, for sRGB textures being sampled in a shader when using linear color space, only the RGB channels are converted from gamma to linear space. It’s assumed the alpha is “data” and not “color”, and thus shouldn’t be affected by the gamma conversion.
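In other words, the hardware conversion behaves like this (a Python stand-in for the GPU's sampling path; `sample_srgb_texel` is a made-up name, not a real API):

```python
def srgb_to_linear(c):
    """Standard sRGB gamma-space to linear-space conversion."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def sample_srgb_texel(rgba):
    """Mimic sampling an sRGB texture into a linear-space shader:
    the transfer function is applied to RGB only; alpha passes through."""
    r, g, b, a = rgba
    return (srgb_to_linear(r), srgb_to_linear(g), srgb_to_linear(b), a)

print(sample_srgb_texel((0.5, 0.5, 0.5, 0.5)))
# RGB become ~0.214 each, alpha stays exactly 0.5
```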

Amazing, all up and running, appreciate the knowledge