The script is for a really simple tapping game, basically `tap += tapsPerClick`, and I have both values set as floats. When I tap in the game, everything works until it gets to .8; then it just jumps to 0.9999999999 instead of following the 0.0001 increments I set up. This is what it looks like as I keep tapping and get close to that number. To be clear, I just want to add 0.0001 each tap, starting from 0.0000.
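Something like this stripped-down version reproduces it (not my actual script, the names are simplified):

using UnityEngine;

public class Tapper : MonoBehaviour
{
    public float tap = 0f;
    public float tapsPerClick = 0.0001f;

    // Called once per tap.
    public void OnTap()
    {
        tap += tapsPerClick;          // add 0.0001 per tap
        Debug.Log(tap.ToString("R")); // near 0.8 it starts printing long 0.…999 values instead of clean steps
    }
}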
You should probably use ints to count clicks. Is there any problem with them?
I want to add "zero point something" each tap. Ints work fine, but I'd rather use floats because I'm trying to add fractional amounts.
Floats are imprecise. You can store the clicks internally as an int, increment it by 1, and then display it with the desired number of decimal places, like this:
var clicks = 99;
var decimalPlaces = 3;
var tempPrecisionNumber = (int)Math.Pow(10, decimalPlaces);
Debug.Log(String.Format("{0}.{1:D" + decimalPlaces + "}", clicks / tempPrecisionNumber, clicks % tempPrecisionNumber));
// Displays 0.099
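Wired into a Unity script, that might look roughly like this (the class and method names are just placeholders):

using System;
using UnityEngine;

public class TapCounter : MonoBehaviour
{
    // Raw count as an int: one tap = +1, so there is never any rounding.
    private int clicks = 0;
    private const int decimalPlaces = 4; // show 0.0001 per tap

    public void OnTap()
    {
        clicks += 1;
    }

    // Builds the display string, e.g. 8000 clicks -> "0.8000".
    public string DisplayValue()
    {
        var precision = (int)Math.Pow(10, decimalPlaces);
        return String.Format("{0}.{1:D" + decimalPlaces + "}",
            clicks / precision, clicks % precision);
    }
}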
Floats are inexact by nature; that's the trade-off that makes them fast. They're approximations. If you do arithmetic with them, you have to expect a certain level of imprecision in the result. There's a good reason they aren't used for financial calculations.
Related: never do == equality checking between floats unless you know you set their values explicitly, rather than as the result of a calculation.
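For example, if you ever need to compare an accumulated float against a target value, compare within a tolerance instead. A sketch (NearlyEqual is just a helper I made up, with a tolerance picked for this game's 0.0001 steps):

using UnityEngine;

public class FloatCompareDemo : MonoBehaviour
{
    // Treat two floats as equal if they're within a tolerance that makes sense
    // for the game (here half of one 0.0001 step).
    static bool NearlyEqual(float a, float b, float tolerance = 0.00005f)
    {
        return Mathf.Abs(a - b) < tolerance;
    }

    void Start()
    {
        float tap = 0f;
        for (int i = 0; i < 8000; i++)
        {
            tap += 0.0001f; // each addition accumulates a tiny rounding error
        }

        Debug.Log(tap == 0.8f);            // may well be False
        Debug.Log(NearlyEqual(tap, 0.8f)); // True, as long as the drift stays under the tolerance
    }
}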