I have stumbled upon a non-documented issue (as far as I have searched the documentation, the internet, and this answers.unity.com page).
When a cube (in my case it's a cube) is given an alpha bigger than one, its colours change. This is odd, because the documentation states:
“alpha component (a) defines transparency - alpha of one is completely opaque, alpha of zero is completely transparent.”
So I would expect an alpha bigger than one to be truncated to one, or at least to produce an error. However, what actually happens is that the cube changes its colour.
The alpha channel isn't automatically truncated because some uses may require values outside of the [0,1] range, even though that may not be needed here. As for why the colour changes as the alpha moves outside of the expected range, you would need to look at the blending operation that is giving you transparency in the first place.
In the case of a general 'fade' transparency, the blending operation would be:
Blend SrcAlpha OneMinusSrcAlpha
What this translates to behind the scenes is:
// Where:
// - SourceColour: the colour of our object
// - SourceAlpha: the alpha of our object
// - DestinationColour: the colour of the scene behind our object
(SourceColour * SourceAlpha) + (DestinationColour * (1 - SourceAlpha))
// or equivalently (note the argument order: an alpha of 1 yields SourceColour)
Lerp (DestinationColour, SourceColour, SourceAlpha)
If the alpha has a value within [0,1], you can see that the colour resulting from the blend will lie somewhere between the two inputs. However, if the alpha falls outside that range, an imbalance in colour is introduced and you start extrapolating beyond the input colours.
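To make the extrapolation concrete, here is a minimal sketch of that blend equation evaluated on the CPU in Python (the colours are made-up examples, not values from the question above):

# Sketch of "Blend SrcAlpha OneMinusSrcAlpha" applied per colour channel,
# exactly as written above: no clamping of the alpha or of the result.
def blend(src, dst, src_alpha):
    return tuple(s * src_alpha + d * (1.0 - src_alpha) for s, d in zip(src, dst))

source = (1.0, 0.0, 0.0)       # a red object
destination = (0.0, 0.0, 1.0)  # a blue background

# Alpha within [0, 1]: the result lies between the two inputs (interpolation).
print(blend(source, destination, 0.5))   # (0.5, 0.0, 0.5)

# Alpha above 1: the result is pushed past the source colour (extrapolation),
# channels leave the [0, 1] range, and the final colour visibly shifts.
print(blend(source, destination, 1.5))   # (1.5, 0.0, -0.5)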
Oh, thanks a lot @Namey5, that explains it. And, for anybody else, there is info on this in the documentation after all; I haven't read through it yet, but here it is: Unity - Manual: ShaderLab command: Blend