I am trying to do network time synchronization across clients, and I noticed some odd behavior, so I set up a simple project to test Time.deltaTime as it counts up. All it does is accumulate the milliseconds since the program started into a variable that starts at 0. I display this value on screen and open the program in two separate windows, then compare the difference between the two displayed times. Instead of staying constant, the difference fluctuates quite substantially.
float counter = 0.0f;

void Update() {
    counter += Time.deltaTime * 1000;
}
Unless I am misunderstanding something, no matter the framerate or when the two instances were opened, the difference between their counter values should stay constant. Am I approaching this incorrectly, or is this a possible bug?
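To rule out the two-window setup itself, here is a small single-instance probe (the AccumulationDrift class name is just for illustration) that runs the same counter alongside Time.time and logs how far the two diverge over time:

using UnityEngine;

// Diagnostic sketch: run the hand-accumulated counter from above alongside
// the engine-maintained Time.time and log how far they diverge.
public class AccumulationDrift : MonoBehaviour
{
    float counter = 0f;   // hand-accumulated milliseconds, as in the question
    float baseline;       // constant startup offset between counter and Time.time
    bool baselineSet;

    void Update()
    {
        counter += Time.deltaTime * 1000f;
        float engineMs = Time.time * 1000f; // time since startup, in ms

        if (!baselineSet)
        {
            // Time.time is already nonzero at the first Update (startup time),
            // so record the initial offset and measure divergence from there.
            baseline = engineMs - counter;
            baselineSet = true;
        }

        float drift = engineMs - counter - baseline;
        Debug.Log($"drift after {engineMs:F0} ms: {drift:F4} ms");
    }
}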
duck replied:
Yes, accumulating deltaTime into your own variable will always be less accurate than reading Time.time directly. Each frame's deltaTime carries a tiny floating-point rounding error, and summing thousands of frames compounds it; since your two instances run on different frame timings, they accumulate different errors and their counters drift apart.
Use Time.time instead of your counter variable and you should get much better results.
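A minimal sketch of the suggested fix, assuming the value is drawn with OnGUI (the question doesn't show how the value is displayed, so the display code and the ElapsedDisplay class name are illustrative):

using UnityEngine;

// Sketch: read the engine-maintained clock instead of accumulating deltaTime.
public class ElapsedDisplay : MonoBehaviour
{
    void OnGUI()
    {
        // Time.time is the time in seconds since the start of the application,
        // sampled at the beginning of this frame.
        float ms = Time.time * 1000f;
        GUI.Label(new Rect(10f, 10f, 300f, 20f), $"elapsed: {ms:F0} ms");
    }
}

One caveat: Time.time is scaled by Time.timeScale, so if you eventually need an unscaled wall-clock reference for comparing instances, Time.realtimeSinceStartup may be the better fit, though that goes beyond what's suggested above.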