Time.deltaTime VS Screen Resolution

I’m using Unity 2019.4.0f1 (and will continue to until I’ve finished my current game)
and noticed that I get varying speed/velocity results when using transform.Translate(someVector * Time.deltaTime). The results differ between screen resolutions.
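Roughly what the movement code looks like (a simplified sketch; the class and field names here are made up for the post, the real script differs):

    using UnityEngine;

    public class Mover : MonoBehaviour
    {
        public Vector3 moveDirection = Vector3.forward; // illustrative values
        public float speed = 5f;                        // world units per second

        void Update()
        {
            // Scale the per-frame movement by the time since the last frame.
            transform.Translate(moveDirection * speed * Time.deltaTime);
        }
    }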

I first noticed this when I put out an alpha build of my game and somebody played it on a 4K TV (3840x2160); I target 1920x1080 by default. Anyway, everything in the game appears to move at the expected speed except for two objects that use the line above, and the difference only shows up when testing at different screen resolutions.

At my target resolution and anything higher, the speed is as expected or even faster. At anything lower, for example 800x600 or 1024x768, the speeds are slower.

Is this to be expected? It seems unusual; my understanding was that anything multiplied by Time.deltaTime should move at the same speed on all systems, i.e. cover a constant distance per second.

I found threads like this one that seem to confirm it’s a known issue:

Time.deltaTime Not Constant: VSync CameraFollow and Jitter (page 9)

I suppose I’d have to update Unity. Has nobody dealt with this issue on their project? Did you test against different machines, monitors, refresh rates, and screen resolutions and get the expected results using Time.deltaTime?

I’m having the exact same issue right now. From what I can tell, Time.deltaTime measures the time between two frames. When you increase the resolution, the frame rate drops, so there is more time between frames, and when that larger value is multiplied by speed the movement becomes faster. Decreasing the resolution does the opposite: you gain frames instead of dropping them, Time.deltaTime gets smaller, and speed is multiplied by a much smaller decimal, resulting in slower movement. As of right now I have not found a solution, but I am looking for one and will reply again once I have.

If you post your code, we might be able to help. Multiplying position changes by delta time should make them frame rate independent, unless you’re doing it wrong somehow. But it’s hard to tell without seeing the code.
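To spell out the arithmetic with illustrative numbers: at 60 FPS, Time.deltaTime is about 1/60 s, so an object with speed 5 moves 5 * 1/60 ≈ 0.083 units per frame, or 5 units per second. At 30 FPS it moves 5 * 1/30 ≈ 0.167 units per frame, which is still 5 units per second. If the per-second speed still changes with resolution, something other than the frame rate is scaling the movement.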

I’m having the same issue. How do I get around it? What do I do? For example, using Time.deltaTime in a mouse look script causes this: in fullscreen (in the editor) it runs slower, and in a tiny Game window it runs faster. How does that work? The same goes for multiplying speed over time, but in that case I can just not use Time.deltaTime, which works fine; it ends up fast but at least consistent. (Unity 2019)

As the Unity dev said a year ago: without code examples, no one can help.

These questions should go in scripting anyway: https://forum.unity.com/forums/scripting.12/

    mouse.x = Input.GetAxis("Mouse X") * sensitivity * Time.deltaTime;
    mouse.y = Input.GetAxis("Mouse Y") * sensitivity * Time.deltaTime;

    xRotation -= mouse.y;
    xRotation = Mathf.Clamp(xRotation, -90f, 90f);

    transform.localRotation = Quaternion.Euler(xRotation, transform.localRotation.y, transform.localRotation.z);
    playerBody.transform.Rotate(Vector3.up * mouse.x);

This runs in LateUpdate. Using Time.fixedDeltaTime instead does make the difference go away, though.

You shouldn’t be multiplying mouse input by delta time. It’s already a per-frame distance, not a velocity, so multiplying it by time gives you a garbage result.
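For reference, here’s a sketch of the same snippet with the deltaTime multiply removed (names like sensitivity and playerBody are taken from the code above; the zeroed Euler components assume the camera only pitches while playerBody handles yaw):

    using UnityEngine;

    public class MouseLook : MonoBehaviour
    {
        public float sensitivity = 2f;
        public Transform playerBody;   // the object that yaws left/right

        float xRotation;

        void LateUpdate()
        {
            // GetAxis("Mouse X/Y") is already a per-frame distance,
            // so it is not scaled by Time.deltaTime here.
            float mouseX = Input.GetAxis("Mouse X") * sensitivity;
            float mouseY = Input.GetAxis("Mouse Y") * sensitivity;

            xRotation -= mouseY;
            xRotation = Mathf.Clamp(xRotation, -90f, 90f);

            // Pitch the camera; yaw the body.
            transform.localRotation = Quaternion.Euler(xRotation, 0f, 0f);
            playerBody.Rotate(Vector3.up * mouseX);
        }
    }

(Time.fixedDeltaTime only “fixes” it because it’s a constant, so it just rescales the sensitivity instead of varying with frame rate.)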
