(bug?) Rendering the screen with a solid color produces the wrong color?

Hi, I was writing a screen shader and noticed that the color of my effect was a little off. At first I thought it was something in my effect, but after a bit of tinkering I discovered that even the simplest possible screen shader, one that only outputs a single given color, returns the wrong color every time.

I tried using different types (fixed, half, and float) and it made no difference.

Then I tried changing the color space and noticed that if the color space is Linear the color always gets brighter, and if it's Gamma it always gets darker.

Here's a simple screen shader so you can try it:

Shader "Hidden/Test" {
    SubShader {
        Tags { "RenderType"="Opaque" }
        Pass {
            ZTest Always Cull Off ZWrite Off Blend Off Lighting Off
            Fog { Mode off }
           
            CGPROGRAM
            #include "UnityCG.cginc"
            #pragma vertex vert_img
            #pragma fragment frag
           
            half4 _TestColor;

            half4 frag(v2f_img i) : COLOR
            {
                return _TestColor;
            }
            ENDCG
        }
    }
}

and the script (you'll need the image effects from the Pro standard assets imported, since ImageEffectBase comes from there):

using UnityEngine;

[RequireComponent (typeof(Camera))]
[ExecuteInEditMode]
public class Test : ImageEffectBase {
	public Color color;

	void OnRenderImage (RenderTexture source, RenderTexture destination){
		material.SetColor("_TestColor", color);
		Graphics.Blit (source, destination, material);
	}
}

You can reproduce the problem by setting up a camera, picking a test color somewhere in the middle of the brightness range, and then using the color picker to sample the color from the Game view: you'll see the color values decrease or increase depending on your color space.

What am I doing wrong, and how can I stop this? Is this a bug?

That's to be expected on multiple counts. First off, half and fixed are 100% equivalent to float on desktop GPUs, and since you're probably playtesting in the Game view, all those halfs and fixeds will really be floats; they basically "exist" only for non-desktop platforms nowadays. Secondly, within the range 0…1 their precision should be sufficient to "look" exactly like a full-precision color, if the fixed/half/float does represent a color. All 3 types have the most precision between 0 and 1; the real difference is at what point beyond that, and how badly, they degenerate. That's a good heuristic for colors and perhaps normals, but if you're deforming UV coords (whether in vert or in frag), for example, you'll quickly see on a mobile device how half/fixed can break down badly, so use float for texcoords at first to be safe.
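To make the precision point concrete, here's a quick sketch in Python using the struct module's IEEE 754 half-precision format, which is the same 16-bit layout GPU half typically uses (an illustration of the general idea, not of any specific GPU): values between 0 and 1 keep plenty of precision, but a large UV-style coordinate loses its fractional part.

```python
import struct

def to_half(x):
    # Round-trip a Python float through IEEE 754 half precision
    # (struct format 'e'), the 16-bit layout GPU 'half' typically uses.
    return struct.unpack('<e', struct.pack('<e', x))[0]

print(to_half(0.5))     # 0.5 survives exactly
print(to_half(1024.5))  # becomes 1024.0: half has only an 11-bit significand
```

At 1024.5 the fractional bit falls below half's 11-bit significand, so the value snaps to 1024.0; a tiled or scrolled texcoord of that magnitude visibly stair-steps on hardware that really runs half.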

Sure, that's how they work. It would be helpful for you to study (it's somewhere in the Unity docs) the difference between, and the reason for the existence of, the two color spaces. Now, unless you're targeting an iPhone 3 or something, or never ever have any lighting/illumination going on, in 2014 there's absolutely ZERO reason to ever use Gamma space.

Your shader is obviously fine, given that it doesn’t do much at all.

Here's what you need to know: AFAIK, SetColor() in Linear space may pre-linearize the specified color floats before sending them to the shader, while in Gamma space it should theoretically send them unprocessed, as given. In Linear space your final resulting screen will also be de-linearized / gamma-corrected automatically after the last shader has finished, just before the final picture is blitted to screen; in Gamma space I'm not sure it does so (I've never needed that color space), but I'd guess not.

If you can't invest the time to quickly learn about the two color spaces in Unity and how they affect colors sent to shaders, textures bound to samplers, and the processing of the final resulting screen, then here's a quick hack for you. In the following line:

        material.SetColor("_TestColor", color);

Instead of color, send either color.linear or color.gamma, picking the one that gives the result you're looking for. This isn't really the proper way to approach the topic, but if you're pressed for time or something…
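If it helps to see what those two properties roughly do, here's a sketch in Python of the standard sRGB transfer functions (an assumption on my part; I haven't verified Unity's exact implementation, but color.linear and color.gamma should behave very much like this per channel):

```python
def srgb_to_linear(c):
    # roughly what color.linear does per channel (standard sRGB decode)
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # roughly what color.gamma does per channel (inverse sRGB encode)
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

mid = srgb_to_linear(0.5)
print(round(mid, 4))                  # about 0.214: "linearized" is darker
print(round(linear_to_srgb(mid), 4))  # round-trips back to 0.5
```

So neither property is a no-op; each one visibly shifts a mid-brightness value, which is why blindly swapping them in can make things darker or brighter.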

Thanks for the reply, I do know what Linear and Gamma space are.

My project is in Linear space. What I didn't know about was the whole pre-linearization in SetColor; I thought Unity would output the result in linear space without doing any additional calculations, and I didn't know I could send color.linear or color.gamma.

Unfortunately this doesn't seem to do much for me, so I guess I need to convert the value myself inside the shader.

So, my project is in Linear space with HDR turned on. By default Unity sends the values in linear space and only does gamma correction on the backbuffer, according to Unity's documentation. That means:

  • In theory, passing the color as color.linear is the same as passing color, and I shouldn't have to do anything to get the right results (my results: the color gets overbright; passing color.linear gets darker, at a lower rate).

  • In theory, passing the color as color.gamma should require a conversion inside the shader (de-gamma) (my results: color.gamma gets overbright, and correcting with pow(color, 2.2) is still overbright).

This is obviously flipped, and I'm probably missing a conversion somewhere else. But even assuming I'm mixing up conversions and color spaces, I then tried inputting both linear and gamma and correcting with pow(color, 2.2) or pow(color, 1/2.2), and none of the combinations worked.

The best result I got so far was using the normal color input and doing a color correction with pow(color, 2.2); in that case the color gets darker, but at a very slow rate: for instance the color (0,128,255,255) is transformed into (0,128,254,255).

So I'm still not sure what I'm doing wrong, or what Unity is doing wrong. It's ridiculous that I can render a camera using its solid-color property and it outputs the right color, but as soon as I do it with a simple shader the color changes, even if just a little.

Thanks for the insight on the various variable types on desktop.

[EDIT]: I spoke too soon; even Unity's camera messes up the colors. Simply rendering a camera with a solid color and then using the color picker on the Game tab changes the color: for instance, the color I was using before, (0,128,255,255), becomes (0,127,254,255), which is equivalent to one of my results using color.linear as input with no color corrections.

Could it be a color-picker problem? Or is it an approximation problem?

PS: I said before that color.linear was the same as color, but I was obviously wrong.

Both color.linear and color.gamma return a value different from color. A "color" is just 3 floats; no one except the author knows which "color space" it is meant to be in. So both color.linear and color.gamma perform an operation on the color and return the result: one "linearizes" the 3 floats, and you call it if you think the color is gamma-corrected (like a JPG photo grabbed from the web, or a cheaply "authored" texture provided by a hobbyist); the other gamma-corrects what you think is a linear color value for final screen display (you never really need to do that yourself, since it happens at the end of the frame just before blit-to-screen).

So it always gamma-corrects the value? If the project is in Linear space and the color input is just 3 floats, why would it try to gamma-correct the value on output? It's basically impossible to output a specific color like this.

Both the color.linear and color.gamma properties perform a simple mathematical conversion of the 3 floats regardless of Unity project settings, AFAIK. They're only there "in case a developer ever needs to linearize or delinearize 3 RGB floats". That's why I indicated those helpers are probably not the real solution to your issue.

Let’s go back to basics. Your project is in Linear space. That means:

  1. any samplers and colors accessible in any shader code are all pre-linearized by Unity

  2. the very final resulting image will be gamma-corrected just before blit-to-screen by the GPU

Here's an idea: if you want to send over a specific float triplet, say magenta (1,0,1), use SetVector instead of SetColor to be on the "safe side". A vector doesn't get mangled outside your control; it will arrive exactly as you send it while your shader is running in Linear space. Of course it will still be gamma-corrected as per step 2 above.
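As for the off-by-one you saw with (0,128,255): here is one plausible mechanism, sketched in Python, purely illustrative and not a confirmed diagnosis of what your Unity version actually does. If the pre-linearized value gets stored in an 8-bit render target with truncation before the final gamma correction, a 128 can come back out as 127:

```python
def srgb_to_linear(c):
    # standard sRGB decode (assumed to approximate Unity's pre-linearization)
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def linear_to_srgb(c):
    # standard sRGB encode (the final gamma correction before display)
    return c * 12.92 if c <= 0.0031308 else 1.055 * c ** (1 / 2.4) - 0.055

g = 128 / 255                    # the green channel of (0,128,255)
lin = srgb_to_linear(g)          # linearized on the way in
stored = int(lin * 255) / 255    # 8-bit storage with truncation (assumed)
back = int(linear_to_srgb(stored) * 255)
print(back)                      # 127: one step lost to quantization
```

With rounding instead of truncation the value survives the round trip, so whether you see 127 or 128 can come down to details of the render target format and the readback path, which is also consistent with an HDR (float) buffer not showing the error at all.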

Thanks, that actually helped me understand the problem better. The color I was using before is (0,0.5,1,1) in floats, and using that as an example, the output was different because of the gamma correction, but it was correct when I used pow(color, 2.2).
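To double-check the arithmetic (using the rough 2.2 power approximation rather than the exact sRGB curve), pre-darkening with pow(color, 2.2) in the shader cancels the final gamma correction exactly, at least in full float precision:

```python
c = 0.5                             # the green channel of my (0, 0.5, 1, 1)
darkened = c ** 2.2                 # what pow(color, 2.2) does in the shader
corrected = darkened ** (1 / 2.2)   # the automatic final gamma correction
print(abs(corrected - c) < 1e-9)    # True: the two steps cancel out
```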

One strange effect, though: I tried the script from my first post on a different computer running the Unity 4.6 beta, correcting the input with color.linear, and on that computer everything stayed fine as predicted, with no approximation errors. So I'm assuming it might have something to do with Unity 4.5 (it could also be my PC's fault).

Thanks for the help, the confusion is cleared up :wink: