Framerate, Time.deltaTime and moving the character.

Hello guys, I’m making my own character controller and it’s going pretty well. The only problem I have is that my character moves faster when I have a low FPS, which leads to him jumping higher than expected when the framerate is low.

Basically, what I’m doing for movement is this:

When my character jumps: finalmovement.y = 6;
When my character is in the air: finalmovement.y -= gravity * Time.deltaTime;

Then, these values are used here:
```
myTransform.position = Vector3.Lerp(myTransform.position, myTransform.position + (myTransform.rotation * finalmovement), Time.deltaTime);
```

This is called in the Update function.

When jumping, finalmovement.y is set to 6 and then decreases with the gravity, but in the editor, where the FPS is lower than in the build, my character ends up jumping higher. I suppose it happens because the gravity is decreasing slower too.

I thought that using Time.deltaTime would make it framerate independent, but isn’t Time.deltaTime the time between the frames? So then, how does it make something frame independent with this?

You are correct that it will be incorrect if gravity is also changing over time.
You are also correct that using Time.deltaTime will make it framerate independent.

> i suppose it happens because the gravity is decreasing slower too.

But why would gravity be “decreasing slower”? Unless you are making a game that models gravity a little more realistically (where the further away you get from the planet, the less gravity you get from the body).

Maybe you just meant the acceleration that gravity applies to your object, which is a different thing, and a bit of a confusing naming convention for such a variable.
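To see why the jump arc specifically is framerate dependent, here’s a quick sketch in Python (Python just so it runs outside Unity; the gravity value of 9.81 is my assumption, since the question doesn’t give one). It integrates the jump frame by frame the same way the controller does: move by the current velocity, then subtract gravity * deltaTime:

```python
V0 = 6.0    # initial jump velocity, from the question
G = 9.81    # assumed gravity constant; the question doesn't give one

def max_jump_height(fps, seconds=2.0):
    """Integrate the jump frame by frame, like the controller does."""
    dt = 1.0 / fps
    y, vy, peak = 0.0, V0, 0.0
    for _ in range(int(seconds * fps)):
        y += vy * dt    # move by this frame's velocity
        vy -= G * dt    # then apply gravity for this frame
        peak = max(peak, y)
    return peak

print(max_jump_height(60))  # ~1.89 m
print(max_jump_height(10))  # ~2.14 m
print(V0 * V0 / (2 * G))    # ~1.83 m, the exact analytic peak
```

The lower the framerate, the bigger the per-frame step, and the further the peak overshoots the analytic height (the overshoot is roughly V0 * dt / 2, so it shrinks as the framerate rises). That is exactly the “jumps higher in the editor” symptom: the velocity is changing over time, so scaling each step by Time.deltaTime alone isn’t enough.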

> but isn’t Time.deltaTime the time between the frames? So then, how does it make something frame independent with this?

Yes, it’s the time elapsed since the last frame. Consider that you are moving along a straight line at a constant velocity.

```
transform.Translate(0, 0, 5 * Time.deltaTime);
```

In the above example, we are moving 5 meters (units) per second in the forward (Z) direction. If your game is running fast, like 60 frames per second, Time.deltaTime will be 1/60 (0.0167) seconds. If your game is running slow, like 10 frames per second, Time.deltaTime will be 1/10 (0.1) seconds.

• 5 m/s * (1/60 s) = 0.0833 m/frame
• 5 m/s * (1/10 s) = 0.5 m/frame

As you can see, the slower your game runs, the further you have to move the object each frame. While it may still be a bit hard to grasp, let’s prove it and put the final nail in the coffin by looking at what happens after 2 seconds.

• 60 fps: 120 frames, each incrementing 0.0833 m/frame: 0.0833 m/frame * 120 frames = 10 m
• 10 fps: 20 frames, each incrementing 0.5 m/frame: 0.5 m/frame * 20 frames = 10 m

See, regardless of whether we run at 60 or 10 frames per second, after 2 seconds we will have moved 10 meters.
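If you’d rather see the bookkeeping run than do it by hand, here’s the same arithmetic as a small Python sketch (Python just so it runs outside Unity): advance 5 m/s for 2 seconds, once in 60 fps steps and once in 10 fps steps:

```python
def distance_after(fps, speed=5.0, seconds=2.0):
    """Advance `speed * dt` once per frame, like the Translate call above."""
    dt = 1.0 / fps
    pos = 0.0
    for _ in range(int(seconds * fps)):
        pos += speed * dt
    return pos

print(distance_after(60))  # ≈ 10, up to floating-point error
print(distance_after(10))  # ≈ 10 as well
```

Both framerates land on the same 10 meters, because each frame’s step is scaled by exactly the time that frame took. This only holds while the velocity is constant, which is why the jump in the original question still misbehaves.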

• Note: If you get annoyed at it arriving at 9.996 meters rather than 10, it’s because you followed my approximation of 1/60 ≈ 0.0167, which I used to save some space in the discussion. Try (1/60) * 5 * 120 and you’ll see it’s exact.