Why is Unity making basic math mistakes (or C#, I don't know)?

Hi people, I've been having a weird bug where an object should return to its original scale but doesn't, and bad things happen. localScale.x is meant to be 1, but instead it gives me 0.9999999 and things like that.

Even though the current x and the original are the same, and I'm dividing one by the other, the result is not 1:
Debug.Log("currentX: " + currentX + " original: " + originalObjectSize.x + " localScale: " + currentX / originalObjectSize.x);

// which prints

currentX: 1.608788 original: 1.608788 localScale: 0.9999999
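That log line is the key symptom: two floats that print the same can still differ in their lowest bits, because the default string conversion rounds to about 7 significant digits. Here is a minimal standalone sketch of that (plain Console instead of Unity's Debug.Log; the one-ULP nudge is a stand-in for whatever rounding drift the scaling code accumulates):

```csharp
using System;

class FloatDemo
{
    static void Main()
    {
        float original = 1.608788f;

        // Nudge the value down by one ULP (one unit in the last place),
        // simulating the tiny drift that repeated scale math can introduce.
        int bits = BitConverter.SingleToInt32Bits(original);
        float current = BitConverter.Int32BitsToSingle(bits - 1);

        // Both round to the same 7 significant digits when printed...
        Console.WriteLine(original.ToString("G7")); // 1.608788
        Console.WriteLine(current.ToString("G7"));  // 1.608788

        // ...but they are not equal, so the ratio is not exactly 1.
        Console.WriteLine(original == current);                  // False
        Console.WriteLine((current / original).ToString("G7")); // 0.9999999
    }
}
```

Dividing a float by a bit-for-bit identical copy of itself always gives exactly 1 in IEEE 754, so getting 0.9999999 proves the two values were never actually equal, only their printouts were.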

This is not the only time it has happened, though. The first was with a simple operation as well, where I subtracted a sum (x -= bothNumbers) instead of x - firstNumber - secondNumber. Mathematically speaking, they should be the same, but the results were slightly different (by about 0.0000002).
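That second case is the classic one: every float operation rounds its result, so regrouping a calculation can change the answer. A contrived but verifiable sketch (the constants are chosen to make the rounding visible, not taken from the project above):

```csharp
using System;

class Regroup
{
    static void Main()
    {
        float x = 1f;
        float a = 2e-8f;
        float b = 2e-8f;

        // Subtracting one at a time: each step is too small to change x,
        // so rounding discards both subtractions.
        float oneAtATime = x - a - b;   // still 1

        // Subtracting the sum: a + b together is big enough to register.
        float asASum = x - (a + b);     // 0.99999994

        Console.WriteLine(oneAtATime == asASum); // False
    }
}
```

Mathematically identical expressions, two different float results; the difference here is about 0.00000006, the same order of magnitude as the 0.0000002 above.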

I know I could work around this by coding without relying on specific numbers like I am, but that's hard, and it still shouldn't happen. When the number comes out bigger, it's easy to just clamp the value, because I know it's wrong, but when it's a little bit smaller, it's difficult to know whether it's an error or really the wanted scale.
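A common way to handle the "is this drift or a real scale?" question is to compare with a tolerance and snap when the value is close enough; Unity ships Mathf.Approximately for exactly this. Below is a standalone sketch with a hypothetical NearlyEqual helper so it runs outside Unity (the 1e-5 tolerance is an assumption you would tune to your scales):

```csharp
using System;

class ScaleSnap
{
    // Hypothetical helper: treat two floats as equal when they are within
    // a small relative tolerance, similar in spirit to Mathf.Approximately.
    static bool NearlyEqual(float a, float b, float relTol = 1e-5f)
    {
        return Math.Abs(a - b) <= relTol * Math.Max(Math.Abs(a), Math.Abs(b));
    }

    static void Main()
    {
        float originalX = 1.608788f;
        float currentX = originalX * 0.9999999f; // drifted by rounding

        if (NearlyEqual(currentX, originalX))
            currentX = originalX; // snap back to the intended scale

        Console.WriteLine(currentX == originalX); // True
    }
}
```

This works in both directions, slightly bigger or slightly smaller, so you no longer have to guess which side of the target the error landed on.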

This is not related to Unity at all; it's inherent to floating-point numbers themselves. Due to the limits of the representation (here, single precision), most decimal values cannot be stored exactly, and every operation rounds its result. Also note that Debug.Log only prints a handful of significant digits, so currentX and originalObjectSize.x can look identical while still differing in their last bits, and dividing two values that are not bit-for-bit equal will not give exactly 1.
Read up on floating-point arithmetic for more.
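You can see the representation error directly by asking for more digits than the default printout shows (a plain console sketch; the same "G9" format works inside Debug.Log too):

```csharp
using System;

class MoreDigits
{
    static void Main()
    {
        // 0.1 has no finite binary representation, so the stored float
        // is only the nearest representable value.
        Console.WriteLine(0.1f.ToString("G9")); // 0.100000001
    }
}
```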