Need Help - Scaling Health Bar Object

I have the following "Health" script attached to an actor, which is a prefab: an empty game object with a mesh for the player and a standard cube called "HealthBar":

//Run in Edit Mode
@script ExecuteInEditMode()

var maxHealth=100;
var curHealth=100;

function adjustCurrentHealth(adj){
    curHealth += adj;

    if(curHealth<0)
        curHealth=0;

    if(curHealth>maxHealth)
        curHealth = maxHealth;

    var hb : Transform = transform.Find("HealthBar");
    var scale : float = curHealth/maxHealth;
    Debug.Log(scale);
    hb.localScale = Vector3(scale, 0.1, 0.1);
}

The cube for the health bar defaults to scale 1, 0.1, 0.1.
If I manually change the scale to something like "0.22, 0.1, 0.1" in the Inspector, it scales perfectly. With the code above, however, Debug.Log reports "0" for the scale, even though I know for a fact that curHealth is 60 and maxHealth is 100, so scale should be 0.6. It seems to be constantly dropping the decimal even when I try to force the scale to be a float. I cannot for the life of me get the scale set to a decimal number BETWEEN 0 and 1 through script, yet it works when set manually. I am at a complete loss as to why Unity would be so inconsistent, and as such I have no clue what I can do about it. Any ideas?? :face_with_spiral_eyes:

Not sure about the conversion rules in UnityScript, but in C# you would have to cast at least one of the operands to float before the division, like:
float scale = (float)curHealth / (float)maxHealth;
I suggest simply using float values for both variables anyway:
var maxHealth = 100.0;
var curHealth : float = 100;

Edit for explanation:
The division is executed first, on two int operands, which produces an int result. An int cannot hold 0.6, so the fractional part is truncated and the result is 0 (not rounded up). That 0 is then assigned to the float variable and becomes 0.0, which is exactly what your Debug.Log is showing.
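The truncation described above is easy to reproduce outside Unity, since integer division behaves the same way across C-family languages. A minimal sketch in Java (standing in for C#/UnityScript here, as an assumption that the semantics carry over, which they do for int division):

```java
public class IntDivisionDemo {
    public static void main(String[] args) {
        int curHealth = 60;
        int maxHealth = 100;

        // Both operands are int, so the division happens in int arithmetic:
        // 60 / 100 truncates to 0 BEFORE the result is widened to float.
        float wrong = curHealth / maxHealth;

        // Casting one operand first forces a floating-point division.
        float right = (float) curHealth / maxHealth;

        System.out.println(wrong); // prints 0.0
        System.out.println(right); // prints 0.6
    }
}
```

The key point is that the cast (or a float declaration) must take effect before the division, not after: assigning an already-truncated int into a float cannot recover the lost fraction.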

That’s why you should use C#. O_o (Though note C# truncates integer division exactly the same way, so the cast is still needed there.)