Very short version
If the Unity Player is set to Linear colour space, colours passed to shaders via Shader.SetGlobalColor() will not match the same colours passed via material properties.
Short version
If you set a material colour property to, for example, #B2007A (0.698, 0, 0.4784)…
And you use Shader.SetGlobalColor() to set a global shader variable to the exact same value…
They will be noticeably different colours by the time they arrive in the shader.
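For context, here's a minimal C# sketch of the kind of setup that reproduces this. The property and global names (_BaseColor, _GlobalCodedColor) are illustrative placeholders rather than the names from my project:

```csharp
using UnityEngine;

// Minimal repro sketch (assumed setup, not my actual project code):
// push the same colour value down both paths and compare in the shader.
public class ColourMismatchRepro : MonoBehaviour
{
    // #B2007A expressed as 0-1 floats.
    static readonly Color coded = new Color(0.698f, 0f, 0.4784f);

    void Start()
    {
        // Path 1: via a material colour property (placeholder name "_BaseColor").
        GetComponent<Renderer>().material.SetColor("_BaseColor", coded);

        // Path 2: via a global shader variable (placeholder name "_GlobalCodedColor").
        Shader.SetGlobalColor("_GlobalCodedColor", coded);

        // With the Player set to Linear colour space, the shader receives two
        // different values for these, even though the C# value is identical.
    }
}
```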
In the screen clip below, I’ve used a screen-space stripe toggle so the object’s albedo alternates between the material property colour and the global colour:
The light stripes are where it is using the global colour value, and the dark stripes are where it is using the value from the material property. The exact same colour is being written into both in code.
NB: The colour value is not being sampled from a texture. It is passed in via a material property, so texture colour space should not matter.
If the colour is red or orange, the match is close, but if it contains a lot of green or blue, it’s WAY off.
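My working theory for that pattern (an assumption on my part, not something I’ve seen documented) is that the material property value gets an sRGB-to-linear conversion on its way to the GPU while the global does not. That conversion leaves channels at exactly 0 or 1 untouched but shrinks mid-range channels a lot, so fully saturated reds barely move while anything with mid-range green or blue channels moves a long way. A quick check with Unity’s built-in Color.linear shows the size of the shift:

```csharp
using UnityEngine;

public static class ColourConversionCheck
{
    // Logs how far a colour moves under Unity's sRGB-to-linear conversion.
    // Channels at exactly 0 or 1 are unchanged; mid-range channels shrink a lot.
    static void Compare(string label, Color srgb)
    {
        Color lin = srgb.linear; // built-in sRGB -> linear conversion
        Debug.Log($"{label}: sRGB {srgb} -> linear {lin}");
    }

    public static void RunAll()
    {
        Compare("Pure red #FF0000", new Color(1f, 0f, 0f));          // identical
        Compare("Orange   #FF8000", new Color(1f, 0.502f, 0f));      // only the green channel shifts
        Compare("Test col #B2007A", new Color(0.698f, 0f, 0.4784f)); // red and blue channels shift
        Compare("Green    #00B200", new Color(0f, 0.698f, 0f));      // large shift
    }
}
```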
Long version (why I’m doing this)
As part of improving accessibility in my colour-coded game, I’m allowing the player to assign screen-space patterns to specific colours. The simplest way to achieve this across the board is to detect in the shader if the pixel to be rendered is one of the coded colours and apply the pattern. This worked perfectly for the red and orange colours, but refused to work for yellow, green and beyond. Turns out, this bug is why.
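For reference, the C# side of that approach looks roughly like the sketch below (names and the fixed array size are placeholders): the coded colours are published as shader globals, and each shader compares the pixel’s albedo against them before overlaying the pattern. It was exactly those comparisons that silently failed for the cooler colours.

```csharp
using UnityEngine;

// Sketch of the C# side (placeholder names): publish the player's
// colour-to-pattern assignments as shader globals so shaders can
// test the pixel colour against them.
public class AccessibilityPatternColours : MonoBehaviour
{
    [SerializeField] Color[] codedColours = new Color[4];

    void Start()
    {
        for (int i = 0; i < codedColours.Length; i++)
            Shader.SetGlobalColor("_CodedColour" + i, codedColours[i]);
    }
}
```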
UPDATE: I’ve discovered the cause. It’s the colour space in the Player settings being set to Linear.
If Gamma is chosen instead, colours passed via Shader.SetGlobalColor() exactly match those fed in via material properties. Unfortunately, switching to Gamma completely ruins all the carefully set-up lighting throughout the game, so it’s not an option for me.
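One possible workaround, assuming the mismatch really is an sRGB-to-linear conversion applied on the material path but not on the global path, would be to stay in Linear and pre-convert the value before setting the global. A rough, untested sketch (the helper and its name are mine, not an official API):

```csharp
using UnityEngine;

public static class GlobalColourHelper
{
    // Assumption: in a Linear colour space project, material colour properties
    // are converted sRGB -> linear on upload, but globals are passed through
    // untouched, so convert the global ourselves to match. Verify the direction
    // of the conversion in your own project before relying on this.
    public static void SetGlobalColorToMatchMaterials(string name, Color srgbColour)
    {
        bool linearProject = QualitySettings.activeColorSpace == ColorSpace.Linear;
        Shader.SetGlobalColor(name, linearProject ? srgbColour.linear : srgbColour);
    }
}
```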