RESOLVED: GUITexture.a getting multiplied by 2...

Hey guys,

Pretty new to using the GUI in Unity, so bear with me.

I make a simple GUITexture, all defaults (except size/pixel offset), alpha is set to 128. I apply this script to it…

var golemSquare : GUITexture;  // assigned in the Inspector

function Start()
{
	// Dim the texture to 40% alpha on startup
	golemSquare.color.a = 0.4;
	Debug.Log(golemSquare.color.a);  // prints 0.4
}

Then I assign the GUITexture to the golemSquare slot in the Inspector. Now the texture should be just under half visible, right?

Wrong. It renders as if the alpha were 0.8, despite the fact that Debug.Log reports it as 0.4. If I set the value to anything greater than (or equal to) 0.5, the texture is fully opaque.

I find this puzzling as the Unity Color docs say that the value should be between 0 and 1…and in pretty much anything else, this holds true.

This is kind of frustrating and I have no idea why it’s happening…I feel like I missed something here. There are no other scripts attached to this object, nothing except the one above, it’s not parented to anything, and no other GUI functions are running…help!

The shader used for GUITextures deliberately uses a doubled combine mode, which allows for overbrightening (values above 0.5 brighten the texture beyond its original colors). As a consequence, 0.5 is the “full” alpha value.
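Since the shader doubles the color, the practical workaround is to halve whatever on-screen opacity you actually want before assigning it. A minimal sketch of that mapping — `guiAlphaFor` is just an illustrative helper name, not part of Unity’s API:

```javascript
// The GUITexture shader effectively renders texture * color * 2,
// so color.a = 0.5 already reads as fully opaque on screen.
// Halving the intended opacity compensates for the doubling.
function guiAlphaFor(desiredOpacity) {
    // desiredOpacity: 0 = invisible, 1 = fully opaque as seen on screen
    return desiredOpacity * 0.5;
}
```

So to get the roughly-40%-visible result from the original script, you would write `golemSquare.color.a = guiAlphaFor(0.4);`, which assigns 0.2.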

–Eric

Ahh okay, thought I (or Unity) was going crazy there for a second. Good to know that it wasn’t a bug or a larger problem.

Also, geez leweez Eric, that’s the second post I’ve made today that you helped me with. You need some sort of forum award or something.