Time class floating point accuracy

I was just wondering if someone could clarify how the Unity Time class addresses floating point accuracy deterioration - and hopefully confirm that I don’t need to worry about it :)

The docs suggest that critical fields such as Time.time and Time.realtimeSinceStartup are simply floats, but if so, that means their accuracy will degrade over time (as explained here). Hopefully these values are internally maintained as doubles that get converted to single precision upon access?

You are correct. Natively, these values are doubles, they are just exposed as floats.

Thanks for the reply - good to know. …Although now that I think about it some more, I can see how this approach gives us consistently reliable deltas, but I’m still not sure I understand how Time.time and Time.realtimeSinceStartup avoid becoming inaccurate, since the form we access is just a single-precision float, i.e. we lose the double precision at the point of conversion. Or is there some magic I’m not seeing?

Thanks again!

trepan: I’m pretty sure you’re aware of this, but just to clarify for any future readers of this thread:

The answer to the last question is that we do lose the double precision at the conversion to float. So be particularly wary with long-running instances (attract demos, game servers, mission-critical software) that have strict time-precision demands. Use some combination of Time.deltaTime, Time.timeSinceLevelLoad and your own high-precision timer for more accurate results, as in the sketch below.
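
Here is a minimal sketch of that last suggestion - accumulating the per-frame deltas into your own double. It assumes a simple MonoBehaviour; the class name DoubleTimeAccumulator is just for illustration:

```csharp
using UnityEngine;

// Sketch: accumulate Time.deltaTime into a double so the running total keeps
// full precision even after days of uptime.
public class DoubleTimeAccumulator : MonoBehaviour
{
    public static double ElapsedSeconds { get; private set; }

    void Update()
    {
        // Each delta is small and precise as a float; summing in a double
        // avoids the resolution loss that a growing float total would suffer.
        ElapsedSeconds += Time.deltaTime;
    }
}
```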

I had a problem with this in a shader using time as a parameter to generate simplex noise (to drive a “twinkle” effect for surface lights when seen from orbit). If the game was left running for a few hours, the precision degraded to such a degree that effects would “stutter”. This isn’t because the timer lost its place, but rather that after conversion to a float, the representable values were too far apart to appear smooth.

In the end, I cheated and kept my own count, resetting it to 0 every hour or so. There was a flicker when I reset, but it was so infrequent (and the shader so sparsely used) that it wasn’t an issue.
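
For anyone wanting to do something similar, a minimal sketch of that workaround might look like this (the class name and shader property name are made up for illustration):

```csharp
using UnityEngine;

// Sketch of the workaround described above: keep a float time value that wraps
// every hour so it never grows large enough to lose precision, and feed it to
// shaders via a global property ("_LoopTime" is a hypothetical property name).
public class WrappedShaderTime : MonoBehaviour
{
    const float WrapPeriod = 3600f; // reset roughly every hour
    float wrappedTime;

    void Update()
    {
        wrappedTime += Time.deltaTime;   // accumulate small, precise deltas
        if (wrappedTime >= WrapPeriod)
            wrappedTime -= WrapPeriod;   // the occasional visible flicker happens here

        Shader.SetGlobalFloat("_LoopTime", wrappedTime);
    }
}
```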

The UNET source code has various places where absolute time since startup is stored as a float. e.g. NetworkBehaviour.m_LastSendTime and NetworkTransform.m_LastClientSendTime. This will cause problems for long-running games.

The problem still exists in 2017.1. Time.realtimeSinceStartup should be exposed as a double as well. Even by day three, a long-running server loses time precision unless you use custom timing like I do: real Mono timers (like System.Diagnostics.Stopwatch) and custom time classes.
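
For reference, a minimal sketch of that kind of custom timer using System.Diagnostics.Stopwatch (the class name is just for illustration):

```csharp
using System.Diagnostics;

// Stopwatch counts integer ticks, so resolution doesn't degrade as the session
// grows; exposing the elapsed time as a double keeps ~15-16 significant digits.
public static class HighPrecisionTime
{
    static readonly Stopwatch stopwatch = Stopwatch.StartNew();

    // Seconds since this class was first touched, as a double.
    public static double Seconds => stopwatch.Elapsed.TotalSeconds;
}
```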

It would be very useful if there was a double version exposed. And as it’s stored that way internally, is there anything preventing that?

Especially when Unity is used for scientific purposes where accurate timekeeping is essential, it would be great to have a double-precision Time.time exposed.

I would guess that DateTime.Now.Ticks would be a better choice in that use case; it should be as accurate as the OS timer and hardware implementation allow. It is also a long rather than a floating-point value, so it doesn’t lose accuracy as time passes.
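
A minimal sketch of that approach (the helper class is hypothetical):

```csharp
using System;

// Wall-clock timestamps with DateTime ticks (100 ns units). Ticks are a long,
// so their resolution doesn't degrade as the value grows.
public static class TickTimer
{
    static readonly long start = DateTime.Now.Ticks;

    public static double ElapsedSeconds =>
        (DateTime.Now.Ticks - start) / (double)TimeSpan.TicksPerSecond;
}
```

Note that DateTime.Now follows the system clock and can jump if the clock is adjusted, so Stopwatch is the safer choice if all you need is elapsed time.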

Can someone explain to me why having a double helps? Aren’t both doubles and floats going to have potential precision issues in C#? Neither should be compared for equality because of the potential epsilon difference between them.

I thought that was why decimal was used instead for values that must have guaranteed precision? What am I missing here?

Here’s a good post about it: Don’t Store That in a Float | Random ASCII – tech blog of Bruce Dawson
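
To make that concrete, here’s a small standalone illustration (not Unity-specific) of the real issue: not equality comparison, but the spacing of representable values, which grows with the magnitude of the number.

```csharp
using System;

// Why double helps: near large values the gap between adjacent floats becomes
// bigger than a frame's worth of time, so small increments stop registering.
class FloatResolutionDemo
{
    static void Main()
    {
        float  floatTime  = 86400f; // 24 hours of uptime stored as a float
        double doubleTime = 86400d; // the same value stored as a double

        // Near 86400 the gap between adjacent floats is about 0.0078 s, so a
        // 1 ms increment is rounded away entirely and time appears frozen.
        float  nextFloat  = floatTime  + 0.001f;
        double nextDouble = doubleTime + 0.001;

        Console.WriteLine(nextFloat  == floatTime);  // True  - the float didn't move
        Console.WriteLine(nextDouble == doubleTime); // False - the double still resolves 1 ms
    }
}
```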
