I have the following 'Health' script attached to an actor, which is a prefab consisting of an empty game object with a mesh for the player and a standard cube called "HealthBar":
// Run in Edit Mode
@script ExecuteInEditMode()

var maxHealth = 100;
var curHealth = 100;

function adjustCurrentHealth(adj) {
    curHealth += adj;

    // Clamp health between 0 and maxHealth
    if (curHealth < 0)
        curHealth = 0;
    if (curHealth > maxHealth)
        curHealth = maxHealth;

    // Scale the HealthBar cube to reflect the current health fraction
    var hb : Transform = transform.Find("HealthBar");
    var scale : float = curHealth / maxHealth;
    Debug.Log(scale);
    hb.localScale = Vector3(scale, 0.1, 0.1);
}
The cube for the health bar defaults to a scale of 1, 0.1, 0.1.
If I manually change the scale to something like 0.22, 0.1, 0.1 in the Inspector, it scales perfectly. However, with the code above, Debug.Log reports "0" for scale, even though I know for a fact that curHealth is 60 and maxHealth is 100, so scale should be 0.6. The decimal part seems to get dropped every time, even when I try to force scale to be a float. I cannot get the scale set to a decimal value between 0 and 1 through script, yet it works when set manually in the Inspector. I am at a loss as to why Unity behaves inconsistently here, and I have no idea what to do about it. Any ideas?
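To make the failing part concrete, here is the scaling calculation in isolation, with the example values mentioned above (just a sketch of the relevant lines from the script, not a separate test):

var scale : float = curHealth / maxHealth;   // curHealth is 60 and maxHealth is 100 at this point
Debug.Log(scale);                            // Console shows 0, not the expected 0.6
hb.localScale = Vector3(scale, 0.1, 0.1);    // so the HealthBar cube collapses to zero width on X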