Hi everyone!!! I just want to get a deeper understanding of the differences between the two. I’ve seen people talk about fixing the problem of people’s machines running at different speeds. Some people recommend using FixedUpdate, and some recommend Time.deltaTime.
Is there a relevant difference between the two? And how would you implement Time.deltaTime into something like:
if (Input.GetKey(KeyCode.W)) transform.Translate(0,0,.01f);
or transform.localEulerAngles = new Vector3(0,0,zRotation);
??
How would you recommend implementing Time.deltaTime into code like that??
Remember that time jumps forward in discrete steps (called frames). You might go from, say, 1.05 seconds to 1.07 seconds without the time ever having been exactly equal to 1.06 seconds (let alone values like 1.05962714).
The exact time between frames varies unpredictably. You can move things around based on the amount of time that actually passed, and that will give you the same long-term average speed (where “long-term” in this case means “entire seconds have passed”), which is fine if all you care about is that your object moves from position 10 to position 20 over the course of 3 seconds.
But sometimes you really need every individual step to be exactly the same; you care whether the object was momentarily at position 14.597 or momentarily at position 14.592 (sometime along the path from 10 to 20). In that case, you should probably use FixedUpdate.
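A minimal sketch of that deterministic case (the `speed` field is a made-up name for illustration; FixedUpdate runs at the fixed timestep, 0.02 seconds by default):

```csharp
using UnityEngine;

public class FixedMover : MonoBehaviour
{
    // Hypothetical speed in units per second.
    public float speed = 3.33f;

    void FixedUpdate()
    {
        // Each physics step is exactly Time.fixedDeltaTime long (0.02 s by default),
        // so every step covers the same distance regardless of rendering frame rate.
        transform.Translate(0, 0, speed * Time.fixedDeltaTime);
    }
}
```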
Sometimes FixedUpdate is also just slightly more convenient because you don’t need to do as much time-related math. And there are a few other technical details, like the fact that FixedUpdate happens at a different point in the main loop. On the other hand, FixedUpdate is often inconvenient for user input, because the Unity functions that detect when a key is pressed/released (rather than whether it is being held down) are synchronized to Update, not FixedUpdate.
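One common workaround for that input problem (a sketch, not the only way): sample the press in Update, where GetKeyDown is reliable, and consume the flag in FixedUpdate:

```csharp
using UnityEngine;

public class JumpBuffer : MonoBehaviour
{
    bool jumpQueued;

    void Update()
    {
        // GetKeyDown is true for exactly one rendered frame;
        // polling it in FixedUpdate can miss (or double-count) presses.
        if (Input.GetKeyDown(KeyCode.Space))
            jumpQueued = true;
    }

    void FixedUpdate()
    {
        if (jumpQueued)
        {
            jumpQueued = false;
            // ...apply the jump (e.g. a Rigidbody force) here.
        }
    }
}
```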
Time.deltaTime is the time between frames. So if you’re averaging 50 fps, that’s 0.02 or 20 msec. If you multiply motion or rotation by Time.deltaTime (and compensate by multiplying by 50), then at 50 fps, there’s no change. All good so far.
If your fps goes down, say to 25, that’s 0.04 or 40 msec (note: you’re still multiplying by 50, as that’s the correction for your average/optimal frame rate). So in that step, you’ll effectively “miss a frame” relative to the average, but you’ll move/rotate 2X as far…therefore, your movement/rotation won’t be slowed by the drop in fps. Instead, it will just look jumpy for a split second. Note that you can also set Application.targetFrameRate, though of course there are limitations.
Usually this is preferable to suddenly moving much more slowly in real time units, because a player is expecting to go Q distance in G amount of time, as they have been while the fps is high. (I was getting bored of using A B C and X Y Z all the time.) The opposite would happen if your fps goes up: more frames in the same amount of time, with less distance/rotation happening in each frame.
Multiplying by Time.deltaTime is a good way to manage occasional drops in fps. Of course, if your fps on a target platform is low all the time, you’ll need to optimize in other ways.
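For the rotation line from the original question, the same idea applies: accumulate degrees per second rather than degrees per frame. A sketch (`degreesPerSecond` is a made-up name; 90 is just an example value):

```csharp
using UnityEngine;

public class Spinner : MonoBehaviour
{
    // Hypothetical: 90 degrees per second, whatever the frame rate.
    public float degreesPerSecond = 90f;
    float zRotation;

    void Update()
    {
        // Adds this frame's worth of rotation; slower frames add bigger chunks,
        // so the rotation speed in real time stays constant.
        zRotation += degreesPerSecond * Time.deltaTime;
        transform.localEulerAngles = new Vector3(0, 0, zRotation);
    }
}
```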
That makes sense, thank you for those explanations. So, just to make sure I’m correct about this: when you say multiply by Time.deltaTime, it would look something like this:
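(The snippet was missing from the post; presumably something like this, where 0.5f assumes the original 0.01f-per-frame value was tuned for roughly 50 fps, i.e. 0.5 units per second:)

```csharp
using UnityEngine;

public class Mover : MonoBehaviour
{
    void Update()
    {
        // 0.5f units/second is roughly the original 0.01f per frame at 50 fps.
        if (Input.GetKey(KeyCode.W))
            transform.Translate(0, 0, 0.5f * Time.deltaTime);
    }
}
```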
Way, way back I remember people suggesting FixedUpdate for the “fixed” part. Someone might hand-tweak Update movement code to where x+=0.07f gave a nice speed. But then it mysteriously started going slower. Arrg! Even worse, it went even slower in builds. The problem was how Update() can happen at different rates. The “fix” was to use FixedUpdate. You still have to experiment; maybe x+=0.082f is a good speed now; but then you were set forever.
In other words, moving stuff in FixedUpdate was a decent solution for someone who wanted to deal with as few numbers, as little math, and as few weird explanations as possible.
The deltaTime thing fools a lot of people who didn’t enjoy High School algebra. Suppose you want to gain M manna per second. If you know there are 50 updates per second, you could use manna+=M/50;. Easy, right? Each frame adds 1/50th, a whole second adds 50*M/50 = M manna.
A little more complicated: 1/50th is 0.02. So we could use manna+=M*0.02f; instead. Seems weird, except 0.02 is how many seconds a frame takes, so it’s like giving 0.02 seconds’ worth of manna. Even more complicated, suppose we don’t know the frame rate, or it’s different on Android vs. PC. We’d want Unity to tell us a timePerFrame. Our new code is manna+=M*timePerFrame;. But Unity can’t just hand us a constant timePerFrame, since frames don’t always take the same amount of time. And that’s where Time.deltaTime comes in.
If a frame happens to take 0.03 seconds, then manna+=M*Time.deltaTime; gives us 0.03 seconds worth of manna this frame. The math isn’t difficult, but there are a few steps.
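Spelled out as a sketch (M and manna are the names from the explanation above; the public field M is kept to match the text, even though a lowercase name would be more idiomatic):

```csharp
using UnityEngine;

public class MannaRegen : MonoBehaviour
{
    public float M = 5f;  // manna gained per second
    float manna;

    void Update()
    {
        // If this frame took 0.03 s, this adds 0.03 seconds' worth of manna;
        // over any whole second, the additions sum to M regardless of frame rate.
        manna += M * Time.deltaTime;
    }
}
```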