Decimal Operations

Alright, I have a question about how Unity deals with decimal operations, because it seems odd at times.

#pragma strict

var float1 : float = .4;

var float2 : float = .5;

function Start () {

    Debug.Log( float1 - float2 ); 					// Yields: -0.099999999
    Debug.Log(".4 - .5 = " + ( .4f - .5f ));		// Yields: -0.1
    Debug.Log(".4 - .5 = " + ( .4 - .5 ));			// Yields: -0.1

}

The Yields: comments show what is returned in the console. I don't understand why the first line gives -0.09999… as its value. I searched around and only found the Int vs float operations thread; however, since this script specifically declares the numbers as floats, I'm a little at a loss. I had heard (over a year ago, in a tutorial somewhere) that you should add a letter, like the “f” used in the second Debug.Log, to ensure proper mathematical operations. Can anyone explain this to me simply?

-0.099999999999 is as near as dammit to -0.1, and that's the problem with floats: most decimal fractions (0.4 included) can't be stored exactly in binary, so you get results like this all the time. The other two lines just happen to get rounded to -0.1 when the result is converted to a string for printing. It's why you should never compare floats with ==.
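
If you need to check whether two floats are "equal", compare them within a tolerance instead. Here's a minimal UnityScript sketch, assuming Unity's built-in Mathf.Approximately and Mathf.Abs helpers; the 0.0001 epsilon is just an illustrative value, pick whatever suits your data:

#pragma strict

function Start () {
    var a : float = 0.4;
    var b : float = 0.5;
    var diff : float = a - b;                              // actually about -0.099999994

    Debug.Log( diff == -0.1f );                            // false: exact equality fails
    Debug.Log( Mathf.Approximately( diff, -0.1f ) );       // true: Unity's built-in tolerance check
    Debug.Log( Mathf.Abs( diff - (-0.1f) ) < 0.0001f );    // true: hand-rolled tolerance check
}

Mathf.Approximately picks a tiny tolerance for you; with the hand-rolled version you choose an epsilon that matches the scale of your own values.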