Color.a won't be shown/set correctly for values between 1 and 244

Hi folks,

Like the title says, I've got a little problem with changing color.a.
I've got a shader on my object that supports transparency, and when I set color.a manually in the inspector, it works fine. If I try to change it via code, it does not.
What annoys me the most: it SAYS it works.

I got a little script that should fade out an object like this:

	public IEnumerator CIconFadeOut(Transform Icon, float Fadetime){
		float duration = Fadetime / 20;
		Color color = Icon.renderer.material.color;
		Color colorbild = Icon.FindChild("Iniicon").renderer.material.color;
		for (int i = 1; i < 20; i++){
			print("alpha = " + 255 / i);
			color.a = 255 / i;
			colorbild.a = 255 / i;
			Icon.renderer.material.color = color;
			Icon.FindChild("Iniicon").renderer.material.color = colorbild;
			yield return new WaitForSeconds(duration);
		}
		color.a = 23;
		colorbild.a = 23;
		Icon.renderer.material.color = color;
		Icon.FindChild("Iniicon").renderer.material.color = colorbild;
		print("Alpha: " + Icon.renderer.material.color.a);
	}

Unity prints the correct 23 for the color.a value, so far so good, but when I look at the scene view, it still shows me 255. It seems it can only jump between 0 and 255 (show or don't show, nothing in between). But if that's the case, why won't it even apply the value of 23 in the inspector? If I replace 23 with 0, it changes it. And if it apparently won't change anything, why does it say it changed it? It makes no sense to me.

Thanks for any help.

Yeah, pretty easy fix. The inspector displays color channels as 0 - 255, but in code the components of a `Color` are floats in the range 0f - 1f, so any value of 1f or above renders as fully opaque:


0f == 0
0.5f == 128
1f == 255
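So `color.a = 23` is way past fully opaque, and the target value would be `23f / 255f`. Note that `255 / i` in the original loop is also integer division, so converting it needs a float literal somewhere. A sketch of the coroutine rewritten for the 0f - 1f range (keeping the original's `Icon`, `Fadetime`, `"Iniicon"` child, and the legacy `renderer`/`FindChild` API from the posted script; untested, adapt as needed):

	public IEnumerator CIconFadeOut(Transform Icon, float Fadetime){
		Renderer iconRenderer = Icon.renderer;
		Renderer childRenderer = Icon.FindChild("Iniicon").renderer;
		int steps = 20;
		float stepDuration = Fadetime / steps;

		for (int i = 0; i <= steps; i++){
			// Lerp alpha from fully opaque (1f) down to the target
			// (23 on the inspector's 0-255 scale == 23f / 255f in code)
			float alpha = Mathf.Lerp(1f, 23f / 255f, (float)i / steps);

			Color color = iconRenderer.material.color;
			color.a = alpha;
			iconRenderer.material.color = color;

			Color colorbild = childRenderer.material.color;
			colorbild.a = alpha;
			childRenderer.material.color = colorbild;

			yield return new WaitForSeconds(stepDuration);
		}
	}

Also remember that `Color color = ...` copies the struct, so you have to assign it back to `material.color` after changing `.a` (the original script does this correctly).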