I've always used timer += Time.deltaTime everywhere for simple timers of 1-60 seconds.
I'm aware of the floating point issues with extensive use, but I recently found that this approach is off by 0.2 to 0.4 seconds over a 60-second period?!
I.e. Unity says 60 seconds have passed, when in reality 60.3 seconds have passed.
(Unity version: 2018.2.5f1)
Simple Update code to reproduce:
using UnityEngine;

public class DeltaTimeDriftTest : MonoBehaviour {
    float t = 0;
    double tt = 0; // more accurate accumulator, for comparison against the float
    System.Diagnostics.Stopwatch SW;

    void Update() {
        if (t == 0) {
            // First frame: configure the test and start the wall-clock reference.
            QualitySettings.vSyncCount = 0;
            Application.targetFrameRate = 60; // 20 gives ~.2s error, 200 gives ~.4s error
            SW = new System.Diagnostics.Stopwatch();
            SW.Start();
            t = .00000001f; // negligible; don't count deltaTime this frame, it belongs to the previous frame
        } else {
            t += Time.deltaTime;
            tt += (double)Time.deltaTime;
        }

        if (t > 60 && SW.IsRunning) {
            SW.Stop();
            Debug.Log("Float t=" + t);
            Debug.Log("Double tt=" + tt);
            Debug.Log("StopWatch =" + SW.ElapsedMilliseconds / 1000f);
        }
    }
}
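For scale, here is a back-of-the-envelope sketch (my own estimate, not something measured in the project above) of how much error float accumulation alone could plausibly introduce: near 60, adjacent float values are 2^-18 (~3.8e-6) apart, so each addition rounds by at most half that, and ~3600 additions at 60 fps should add up to only a few milliseconds in the worst case.

using UnityEngine;

public class FloatErrorBound : MonoBehaviour { // hypothetical helper, just for the estimate
    void Start() {
        // Adjacent 32-bit floats in [32, 64) differ by 2^-18, so each
        // round-to-nearest addition is off by at most 2^-19 (~1.9e-6 s).
        double halfUlpNear60 = System.Math.Pow(2, -19);
        double frames = 3600; // 60 seconds at 60 fps
        // Worst case: every addition errs in the same direction at the largest magnitude.
        Debug.Log("Worst-case accumulated rounding error ~= " + (frames * halfUlpNear60) + " s"); // ~0.007 s
    }
}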
This seems like far too large an error to just blame on floating point accuracy.
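If it helps, here is a minimal sketch (class name and log threshold are placeholders I made up) of how one could compare each frame's reported Time.deltaTime against an independently measured frame duration, to see whether the individual deltaTime values themselves are what's off rather than the float accumulation:

using UnityEngine;

public class DeltaTimePerFrameCheck : MonoBehaviour {
    System.Diagnostics.Stopwatch frameSW = System.Diagnostics.Stopwatch.StartNew();

    void Update() {
        // Wall-clock time since the previous Update(), measured independently of Unity.
        double measured = frameSW.Elapsed.TotalSeconds;
        frameSW.Reset();
        frameSW.Start();

        // Unity's reported duration for (roughly) the same interval.
        double reported = Time.deltaTime;
        double diff = reported - measured;

        // Skip the first frame: the stopwatch started at construction, not at a frame boundary.
        if (Time.frameCount > 1 && System.Math.Abs(diff) > 0.001) {
            Debug.Log("deltaTime=" + reported + "  measured=" + measured + "  diff=" + diff);
        }
    }
}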