Simple "Mango" UV Quad is wrong in Linear Color Space

Hello everybody

I’ve read extensively through various sources on the Linear Color Space feature in Unity.

I’m getting a feeling for how it works, but to verify it I’ve created a simple test shader that outputs the UV coordinates as RG (the “mango quad”):

Shader "Unlit/NewUnlitShader"
{
    Properties
    {
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                fixed4 col = fixed4(i.uv, 0, 1);
                return col;
            }
            ENDCG
        }
    }
}

Giving me the following result in Gamma and Linear Color Space:


Mango Quad rendered in Gamma Color Space (left) and Linear Color Space (right)

The Linear Color Space result (right) is “incorrect” in my opinion, e.g. compared to the first image or to Photoshop’s Color Picker: the black in the bottom-left corner extends way too far.

Through reading the various sources, I’ve tried to manually correct for Gamma, so adding this to the end of the Fragment shader:

                #ifdef UNITY_COLORSPACE_GAMMA
                    return col;
                #else
                    return fixed4(GammaToLinearSpace(col.xyz), 1);
                #endif

Which fixes the gradient in Linear Color Space so it looks like the image on the left.
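
To convince myself why this works, I sketched the round trip in Python (using the exact sRGB transfer functions, not Unity’s API — Unity’s GammaToLinearSpace may use an approximation):

```python
def srgb_to_linear(c):
    # Exact sRGB decode (gamma -> linear): what the manual shader fix applies.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # Exact sRGB encode (linear -> gamma): what happens automatically when
    # the linear render target is converted for display.
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# A UV value pushed through the manual fix and then the automatic
# display conversion comes back unchanged:
uv = 0.25
shown = linear_to_srgb(srgb_to_linear(uv))
print(abs(shown - uv) < 1e-9)  # True
```

So the manual conversion just pre-applies the inverse of the display conversion.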

Questions:

  • Why do I need to “manually” correct for gamma? According to everything I’ve read (see the linked posts), Unity is supposed to do this “automatically”?
  • When is the correction applied?
  • How do I know which shaders need this correction line at the end and which don’t?
  • Does the Standard shader do this correction?


When you’re rendering in gamma space, the color values the shader outputs map directly to the color values that will appear on screen. If you output (0.5, 0.5, 0.5) from the shader, it’ll show as RGB(127,127,127) in Photoshop’s color picker.

When you’re rendering in linear space, it means the color values the shader outputs are not exactly what will appear on screen and require a color correction pass before being displayed. If you output (0.5, 0.5, 0.5) from the shader it’ll show as RGB(188,188,188) in Photoshop’s color picker after the color correction is applied.
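
To put a number on that (a sketch using the standard sRGB transfer function, not Unity’s API):

```python
def linear_to_srgb(c):
    # Standard sRGB encoding applied when a linear render target is displayed.
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Linear 0.5, after the display color correction, as an 8-bit value:
print(round(linear_to_srgb(0.5) * 255))  # 188
```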

This color correction happens automatically when the final image is displayed on screen, but it means any color values you output from the shader need to have the inverse applied if you want them to appear as you expect them to. For color textures (ones that have been marked as sRGB) the GPU does this conversion automatically from sRGB “gamma” space to linear space when read in the shader so the value output by the shader is already in the correct color space. But for things like UVs, you have to correct them yourself if you want them to appear as you “expect” them to.

If it’s a color value you’re defining in the shader, you will have to do the correction yourself. If it’s coming from an sRGB texture, or a material property that is a “color” property, then the correction will have already been done for you either by the GPU for textures or by Unity before it’s sent to the GPU.

Yes and no. Again, color textures are automatically handled by the GPU, as well as any color value passed to the shader via color material properties. But the shader isn’t doing either of those directly. The Standard shader doesn’t do any color correction itself since the whole point of using linear space rendering is to get real world accurate light mixing, which happens in linear space.

The very short primer on gamma color space vs. linear color space: humans don’t perceive color / brightness linearly. Doubling the number of photons hitting our eyes doesn’t double the apparent brightness; instead, something needs to emit roughly 4x–5x more photons to appear twice as bright. Gamma color space values are effectively in “perceptual” brightness, so a value that’s 2x higher appears twice as bright. This is a convenient representation for images, both in terms of making it easy to pick color values and for image storage, as you’re not wasting a lot of data storing color values that humans can’t perceive a difference between.

But that’s not how light works in the real world, so calculating lighting in gamma space leads to shading that appears too soft, and makes it easy to blow things out when using multiple lights. Linear color space is closer to how light behaves in the real world, meaning the resulting lighting is easier to work with to produce realistic images.
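
That 4x–5x figure falls straight out of the sRGB curve; a quick sketch (exact sRGB math, not Unity’s API):

```python
def srgb_to_linear(c):
    # sRGB ("perceptual") value -> physical ("photon count") intensity.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# Perceptual "half brightness" (sRGB 0.5) vs. full white, in physical terms:
ratio = srgb_to_linear(1.0) / srgb_to_linear(0.5)
print(round(ratio, 2))  # 4.67
```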

Oh thank you @bgolus , this clears some things up. I’ll try to draw some conclusions.

So, in Linear Color Space mode the image is transformed after all shaders have run, going Linear → Gamma, which requires everything the shader outputs to be in linear so it all ends up as gamma on screen.

Crazy dense paragraph. But I guess it answers all my questions. Can you tell me if these assumptions are correct?

  • you will get linear colors when sampling from a texture (e.g. with tex2D) iff it is flagged as sRGB.
  • you will get unmodified, raw values when sampling from a non-sRGB texture
  • float4s in Shaders will contain values corrected to Linear iff they are filled by Unity through a property of type Color
  • Whether you need a conversion in the shader depends on whether you started with a linear color or not. Every color created in the shader (e.g. derived from other properties, or when visualizing data) should be converted to linear before using it in further calculations.
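
As a quick check of these assumptions, I compared the polynomial GammaToLinearSpace helper I found in my copy of UnityCG.cginc against the exact sRGB curve (the coefficients below are copied from that include file and may differ between Unity versions):

```python
def gamma_to_linear_approx(c):
    # Polynomial fit used by UnityCG.cginc's GammaToLinearSpace
    # (coefficients from the include file; verify against your version).
    return c * (c * (c * 0.305306011 + 0.682171111) + 0.012522878)

def srgb_to_linear_exact(c):
    # Exact sRGB decode for comparison.
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

# The cheap fit stays within a fraction of a percent of the exact curve:
err = max(abs(gamma_to_linear_approx(i / 100) - srgb_to_linear_exact(i / 100))
          for i in range(101))
print(err < 0.005)  # True
```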

You helped me a lot here. I think a lot of the uncertainty on my side (and maybe others’ too) is that you almost never read about these conversions. None of the shader tutorials I’ve done or videos I’ve watched (e.g. Freya Holmér’s excellent introductions) mention GammaToLinearSpace even once. At the same time, all sources recommend Linear Color Space in Unity for more correct lighting, but they also don’t mention the caveat that when you are not dealing with texture colors in shaders, you need to convert to linear manually.

I have some quibbles over the phrase “all shaders are run”, but that’s basically correct. I’d say it’s after all draws to that particular render target, as there may be post processing (which uses shaders) that happens after converting that linear space render target and rendering it to a gamma space target. A lot of post-process anti-aliasing, like SMAA and FXAA, is done after the conversion. And UI is often still done in gamma space.

They are.

Freya, or indeed most tutorials, aren’t going to talk about it because it muddies the waters a ton when you’re trying to learn the basics of shaders. And most of the time colors are going to come from material properties or textures anyway.


Appreciate your help @bgolus ! All clear now.

If you have the time: do you happen to know what the situation is with vertex colors? In a way these are “colors”, but they’re often used to store other data. Are they modified in any way one should be aware of when working in Linear Color Space?

Unity does not modify the vertex data based on the color space being used. Assume vertex colors are raw values, so if they’re supposed to represent sRGB color values, you’ll want to do the gamma to linear conversion on them in the shader, or when you assign them.

Unity’s particle system is the one situation where this isn’t true: it does the gamma to linear conversion on the color values before assigning them to the vertices, so those color values default to already being in linear space. However, it limits itself to one byte per color channel (the default vertex color representation), which means the color values of particles can “pop” on transitions between darker colors due to the loss of precision. If you set vertex colors via C# on custom generated meshes, using Color instead of Color32 values keeps the vertex colors as full 32-bit float4 values, so you won’t have that problem when you convert them to linear. At least on recent-ish versions of Unity.
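
That precision loss is easy to see numerically: with byte vertex colors stored in linear space, the very first nonzero step is already a clearly visible jump near black. A rough sketch (exact sRGB math, not Unity’s API):

```python
def linear_to_srgb(c):
    # Exact sRGB encode, i.e. how a linear value ends up looking on screen.
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

# Smallest nonzero value an 8-bit *linear* vertex color can store:
first_linear_step = 1 / 255
# Displayed, that's already around 13/255 in perceptual (sRGB) terms,
# so dark gradients jump straight from black to a visibly lighter grey:
print(round(linear_to_srgb(first_linear_step) * 255))  # 13
```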


Thank you!

I am amazed how much internal knowledge you have of Unity in particular. Given the state of documentation it’s mind-boggling how much you know.