Q1: I am trying to understand whether it's likely that a float will max out if it is set to Time.realtimeSinceStartup and the user pauses the game for a long period of time.
My instinct says it's unlikely (but possible); however, I don't know how to calculate how long the game needs to run in order to max out a float. I know the maximum size of a float, but I'm confused about how the decimals in Time.realtimeSinceStartup (5 decimal places) affect the maximum size of a float.
Q2: For testing purposes I put this in my Update: timeThatIgnoresPause = float.MaxValue + 1;. I was expecting Unity to crash, but it keeps on working. Is this normal, or is my code wrong?
The maximum value for a float is roughly 3.4x10^38, so I think you’re good.
Long enough,
In case you're wondering about the sheer scale of this number: count the seconds from when our Sun formed to when it finally burns out. If you graphed float.MaxValue across your whole screen, the Sun's lifetime in seconds wouldn't even fill a single pixel (even if you multiplied it by 1,000).
Awesome, this answers my second question. Adding 1 to float.MaxValue doesn't overflow: 1 is far too small to register at that magnitude, so the result is still float.MaxValue (not float.PositiveInfinity), and Unity doesn't crash.
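You can check this behavior yourself. This is a small Python sketch (Python's own floats are doubles, so it simulates Unity's 32-bit float by round-tripping values through `struct`); the names are mine, not from any Unity API:

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 single (a C# float)."""
    return struct.unpack("f", struct.pack("f", x))[0]

F32_MAX = f32(3.4028235e38)  # float.MaxValue in C#

# 1 is far below the precision at this magnitude, so the addition
# rounds straight back to F32_MAX -- no overflow, no infinity, no crash.
print(f32(F32_MAX + 1.0) == F32_MAX)       # True
print(f32(F32_MAX + 1.0) == float("inf"))  # False
```

Only an addition large enough to push the result past the representable range would produce infinity.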
Thanks, I feel silly asking the question after calculating that Time.realtimeSinceStartup will only be 3,153,600,000.00000 after you run the app for 100 years continuously.
It’s more complicated than this, as you need to be concerned with resolution of floats as well as their maximum value.
As you get further from 0 the minimum difference between floating point numbers increases. This is why physics calculations get less accurate the further you go from the world origin. The same thing applies to your game timers. When enough seconds have passed, adding 0.0167f (1 tick at 60hz) will have no effect on the stored number, and your timer will freeze as a result.
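The freeze described above is easy to reproduce. A sketch in Python, simulating single precision via `struct` round-trips (the variable names and the 0.0167 tick are just taken from the post):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 single (a C# float)."""
    return struct.unpack("f", struct.pack("f", x))[0]

DELTA = f32(0.0167)  # one tick at ~60 Hz

t_small = f32(100.0)         # ~100 seconds of uptime
t_large = f32(8_388_608.0)   # 2**23 seconds, roughly 97 days of uptime

# Near zero the spacing between floats is tiny, so the add registers.
print(f32(t_small + DELTA) > t_small)   # True  -- timer still advances

# At 2**23 the spacing between adjacent floats is already 1.0,
# so adding 0.0167 rounds straight back to the old value.
print(f32(t_large + DELTA) > t_large)   # False -- timer is frozen
```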
So the question isn’t “what is the max value of a float?” It is instead “what is the maximum value of a float that I can reliably add my time deltas to?” The answer to that depends on your time deltas and/or tick rate.
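That question can be answered by brute force. A hedged sketch, again simulating float32 in Python: it doubles a timer value until adding one tick no longer changes the accumulator (the threshold it reports is a power of two, so the real cutoff lies somewhere below it):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 single (a C# float)."""
    return struct.unpack("f", struct.pack("f", x))[0]

def freeze_threshold(delta: float) -> float:
    """Smallest power-of-two timer value at which adding `delta`
    no longer changes a float32 accumulator."""
    t = 1.0
    while f32(f32(t) + delta) != f32(t):
        t *= 2.0
    return t

for hz in (30, 60):
    t = freeze_threshold(1.0 / hz)
    print(f"{hz} Hz tick stops registering at ~{t:,.0f} s ({t / 86400:.1f} days)")
```

A 60 Hz tick stops registering somewhere around 2^19 seconds (roughly 6 days), a 30 Hz tick around 2^20 seconds (roughly 12 days), which matches the ballpark figures discussed here.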
I haven’t done the math myself. That said, I recall reading an article where someone worked it out for their MMO and from memory they found that they’d run out of floating point precision in their timers after about 4 weeks. For the average single player game that’s probably not an issue. For an MMO that was a concern since the idea is to keep servers up for as long as possible, so I assume they’d have done something to manage that (moving to doubles for long running timers would be one solution).
Here’s a blog post with an excellent explanation, and examples of floating point precision for a timer after certain durations: Don’t Store That in a Float.
So after a week our timer's precision is already much longer than a frame, even if our game only runs at 30 Hz!
Thanks, awesome, informative link. I find it sort of interesting that the official documentation uses both float and double to store Time.realtimeSinceStartup.
I don’t understand how the tables in that article are accurate. He quotes the table:

Float Value | Time Value | Float Precision | Time Precision
1,000,000   | ~11 days   | 0.0625          | 62.5 milliseconds
but it’s clearly stated all over that floats only have 7 digits of precision, meaning at 1,000,000 a float cannot differentiate between 1,000,000.2 and 1,000,000.7, correct? How does that table give the number 1,000,000 an extra 0.0625 of precision?
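One way to check the table's 0.0625 figure yourself: "7 significant digits" is only a decimal rule of thumb; the real spacing between adjacent float32 values is a power of two that depends on the magnitude. A Python sketch that reads the spacing straight off the bit pattern (helper names are mine):

```python
import struct

def f32(x: float) -> float:
    """Round a Python double to the nearest IEEE 754 single (a C# float)."""
    return struct.unpack("f", struct.pack("f", x))[0]

def f32_ulp(x: float) -> float:
    """Spacing between x and the next float32 value (bit-pattern increment)."""
    bits = struct.unpack("I", struct.pack("f", x))[0]
    nxt = struct.unpack("f", struct.pack("I", bits + 1))[0]
    return nxt - x

print(f32_ulp(1_000_000.0))  # 0.0625 -- matches the article's table

# So a float CAN still tell these two apart; each just snaps to the
# nearest multiple of 0.0625 above 1,000,000.
print(f32(1_000_000.2))  # 1000000.1875
print(f32(1_000_000.7))  # 1000000.6875
```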
If you take the float at its best whole-number precision, 3,402,823, you get 39.38 days (or should that be calculated with 9,999,999?). Doesn't that mean Time.realtimeSinceStartup will eventually only be stored to a precision of 10 seconds, then 100 seconds, and so on?
I’m building a master server list that currently uses Time.realtimeSinceStartup to judge when each server last updated its information. I think I will have to move to System.DateTime, or the server will stop working properly after roughly 39 days. Correct?
That is what I’m doing, actually. When a server sends its updated info to the master server (every 15 seconds), the master server sets its timeLastChecked to the current Time.realtimeSinceStartup. Then, rather than adding to that every frame, the master server only scans the list every 30 seconds for servers whose timeLastChecked is more than 40 seconds behind the current Time.realtimeSinceStartup, and removes them. I can work around relying on Time.realtimeSinceStartup; I was just curious about the validity of the above discussion and the linked article since, as was pointed out, any float-based timer running in a 24/7 server environment will probably lose the ability to track seconds after 4-5 weeks… I think?
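For what it's worth, the same prune-on-timeout pattern works fine on a double-precision monotonic clock, which sidesteps the float precision issue entirely. A hypothetical sketch in Python (the registry shape, function names, and 40-second timeout are just taken from the description above, not any real master-server API):

```python
import time

TIMEOUT = 40.0  # seconds without an update before a server is dropped

# Hypothetical registry: server id -> monotonic timestamp of last update.
last_checked: dict[str, float] = {}

def on_server_update(server_id: str) -> None:
    # time.monotonic() returns a double and never jumps with wall-clock
    # changes, so precision isn't a concern even after months of uptime.
    last_checked[server_id] = time.monotonic()

def prune_stale_servers() -> list[str]:
    """Drop servers not heard from within TIMEOUT; return what was removed."""
    now = time.monotonic()
    stale = [sid for sid, t in last_checked.items() if now - t > TIMEOUT]
    for sid in stale:
        del last_checked[sid]
    return stale
```

The Unity equivalent would be storing timestamps in a double (e.g. from a long-running double accumulator or system clock) rather than a float.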