Time.deltaTime Not Constant: VSync CameraFollow and Jitter

I’m running with VSync on at a constant 60fps, which works out to a dt of 16.667ms (blank project with just a cube).

When I call Time.deltaTime it varies wildly. Time.deltaTime can be as low as 2ms and as high as 31ms. The deltaTime “averages” out to be ~16.67ms.

Why does Time.deltaTime vary wildly? Shouldn’t it be a constant 16.667ms assuming the Update() loop has minimal computations and lots of idle time?

My understanding is the Update loop runs as fast as possible (and will have variable Time.deltaTime) but if VSync is enabled then it should sync the Update rate to the display (60fps). Assuming the Update loop isn’t overloaded and it can complete in < 16.667ms then shouldn’t Time.deltaTime always be a constant 16.667ms?

This question relates to camera jitter (a camera following the player). Because Time.deltaTime varies so much, the camera jitters.

Thanks in advance.


In your camera moving code, multiply Time.deltaTime by the vector for one second of travel to get a scaled down movement vector that matches the current time step. That will eliminate your jitters even when the framerate varies.
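For example, a minimal sketch of that in a follow script might look like this (the class and field names are illustrative, not from the original poster's project):

using UnityEngine;

// Minimal follow-camera sketch: the "one second of travel" vector is
// scaled by Time.deltaTime so each frame moves only the fraction of
// travel that matches the current time step.
public class CameraFollowSketch : MonoBehaviour
{
    public Transform target;          // the player to follow
    public float unitsPerSecond = 5f; // travel for one full second

    void LateUpdate()
    {
        // Direction toward the target, scaled to one second of travel...
        Vector3 oneSecondOfTravel = (target.position - transform.position).normalized * unitsPerSecond;
        // ...then scaled down by this frame's delta time.
        transform.position += oneSecondOfTravel * Time.deltaTime;
    }
}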


Thanks for the tip but I don’t think that helps. The issue is that my monitor updates at 60Hz (16.667ms) and my Update() loop gets called at 60Hz (16.667ms) on average… but some Time.deltaTimes are 1ms and some are over 30ms.

So if I multiply anything by Time.deltaTime the actual delta motion will vary wildly (because I am multiplying anywhere between 1ms and 30ms). This causes jitter. If I multiply by a constant 20ms then I will get smooth motion even though my Time.deltaTime varies wildly from its average of 20ms.

I’ve also noticed that the Time.deltaTime varies more in the editor than in the build. The build is a bit smoother.

But with a blank project, no graphics or code load, and VSync on, there should be lots of idle time, and I would think Time.deltaTime would be very close to the ideal 16.667ms… why does it vary so much?

It’s worth emphasizing that my monitor framerate is not varying (it’s 60Hz) and my Update framerate is also always 60Hz if you take a 1sec average… it’s just the Time.deltaTime that varies drastically. This seems so strange.


How are you determining Time.deltaTime? If you’re using Debug.Log every frame, that causes lag. Better to collect data over a period of time and then output it all at once. I don’t get any wild variances here:

0.01665795
0.01681304
0.01697999
0.016348
0.016451
0.01687998
0.01671803
0.01662397
0.01652002
0.01681399
0.01668197
0.01667905
0.01664799
0.01576495
0.01757705
0.01605797
0.01727098
0.01665604
0.01665896
0.01667005
0.01631999
0.01700497
0.01698202
0.01636398
0.01667804
0.01665395
0.01652002
0.01682001
0.01666498
0.016675
0.01666099
0.01635301
0.01698601
0.016662
0.01660103
0.01657498

But yes it’s interesting that it varies at all.
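A sketch of that buffer-then-log approach (the class name and sample count here are illustrative):

using System.Text;
using UnityEngine;

// Sketch: buffer Time.deltaTime samples and print them with a single
// Debug.Log call, so per-frame logging doesn't distort the measurements.
public class DeltaTimeSampler : MonoBehaviour
{
    const int SampleCount = 300;
    readonly float[] samples = new float[SampleCount];
    int index;

    void Update()
    {
        if (index < SampleCount)
        {
            samples[index++] = Time.deltaTime;
        }
        else if (index == SampleCount)
        {
            index++; // ensure we log only once
            var sb = new StringBuilder();
            for (int i = 0; i < SampleCount; i++)
                sb.AppendLine(samples[i].ToString("0.00000000"));
            Debug.Log(sb.ToString());
        }
    }
}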

–Eric


You need to assume that the frame rate is going to vary some even when you try to lock the frame rate. Make sure your code compensates the movements based on the deltaTime.

I don’t know for sure why the frame rate in Unity varies, but my hunch is that Unity’s code for locking the frame rate actually limits the frame rate to just below the desired frame rate instead of correctly locking it with the vsync. The way vertical sync works, the hardware will experience a brief 30 FPS frame rate if the game frame rate dips to 59 FPS, assuming the monitor vsync is 60 FPS.

By definition, vsync forces the hardware to sync the frame with the monitor. If the game is slightly slower than the monitor, the hardware will sync at the next opportunity, which quantizes the effective rate to 60 FPS, 30 FPS, or 15 FPS. That is one reason I like to simply disable vsync.

When you get a 31ms frame, that is probably where Unity was running 59 FPS and the hardware dropped it back to 30 FPS. The 2ms frame was probably Unity’s attempt to compensate for the 31ms frame.

No, it happens even with an empty scene. Unity does run vsync correctly (it waits for the monitor sync, that’s what WaitForTargetFPS in the profiler is), and there are no dropped frames. In my case the variance is always within a 60Hz cycle, so I expect that the exact moment in which Time.deltaTime is measured each frame is not done at a set time for whatever reason.

–Eric

I know this thread is old, but I would like to know the answer. Why is deltaTime not constant with vsync? Even though the fluctuations may look small, the human brain is very good at seeing whether movement is smooth or not, so even something like a 2ms difference is noticeable.

Is there anyone who knows? Or maybe someone could ask someone from Unity. I think deltaTime is probably the most often used variable, so we should know what’s going on.

I can’t explain the 2ms, but I can possibly help you with the camera jitter, assuming you’ve made some kind of blending camera:

https://twitter.com/davidhampson/status/823652498334420993

I appreciate the response but I’ll admit I don’t understand that code. What is the intent?

float scale = 1.0f - (float)System.Math.Pow(0.95, Time.deltaTime * 60.0f); // Framerate independent code, good!
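If it helps, this is roughly how a scale like that tends to be used in a damped (blending) camera; a hedged sketch with illustrative field names, not the exact code behind that tweet:

using UnityEngine;

// Sketch of a damped follow camera with a framerate-independent blend
// factor: each 1/60s the camera keeps 95% of the remaining gap to the
// target, regardless of the actual frame time.
public class DampedFollowSketch : MonoBehaviour
{
    public Transform target;            // object to follow (illustrative)
    public float keepPerStep = 0.95f;   // fraction of the gap kept per 1/60s

    void LateUpdate()
    {
        float scale = 1.0f - (float)System.Math.Pow(keepPerStep, Time.deltaTime * 60.0f);
        transform.position = Vector3.Lerp(transform.position, target.position, scale);
    }
}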

For context my game is set up top-down w/ a camera locked to the rigidbody networked player.

So getting things to appear smooth has been very difficult. I have 4 interacting deltaTimes.
1: FixedUpdate: 0.02s (50Hz) constant rigidbody physics engine
2: Update: Varies between 0.002s and 0.031s. Average of 0.0167s (60Hz)
3: Monitor framerate: Constant 0.0167s (60Hz)
4: NetworkSendInterval: Let’s say 0.10s (10Hz) average, with some variation.

So there are several issues that cause jitter.
Jitter Cause #1: Rigidbodies: Because FixedUpdate does not match Update you will get jitter. Using 50Hz FixedUpdate and 60Hz Update as an example, you will get jitter sequences of 6 frames: in 5 of 6 frames the rigidbody moves faster than actual, and in 1 of 6 frames there is no motion.
Frame1: Movement at 60/50 (120% speed)
Frame2: Movement at 60/50 (120% speed)
Frame3: Movement at 60/50 (120% speed)
Frame4: Movement at 60/50 (120% speed)
Frame5: Movement at 60/50 (120% speed)
Frame6: Movement at 0 (0%) speed
Frame7: Movement at 60/50 (120% speed)
Frame8: Movement at 60/50 (120% speed)
Frame9: Movement at 60/50 (120% speed)
Frame10: Movement at 60/50 (120% speed)
Frame11: Movement at 60/50 (120% speed)
Frame12: Movement at 0 (0%) speed

Jitter Cause #2: Non-Rigidbodies: Anything that is not a rigidbody that I move manually (transform.position += velocity * Time.deltaTime) will be subject to variations in Time.deltaTime. Even though the monitor framerate is exactly 60Hz (16.67ms) and my code is not overloaded (no dropped frames), my Time.deltaTime varies wildly between 2ms and 31ms (even though the average is 16.67ms). This causes very significant jitter. This is the original post in this thread. I don’t understand the cause of the Time.deltaTime variance.

Jitter Cause #3: Non-Owned Networked Objects: If I snap the position of the object to what was received in the network packet then you will see jitter. So care has to be taken how to apply the position data (i.e. LERP, add delay, etc.).

To address issue #1 I detach the mesh & renderer from the rigidbody, lag it slightly (by up to 1 frame), and interpolate. This seems to work really well, and adds virtually none of the delay a smoothing filter would. The camera is then attached to the mesh & renderer (not the rigidbody). See in the video how things move smoothly even though the camera is following a rigidbody driven at the FixedUpdate rate.
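A rough sketch of that kind of detached-visual interpolation, assuming the visual object reads the rigidbody's last two fixed-step positions (names and details are illustrative, not the poster's actual code):

using UnityEngine;

// Sketch: a visual-only transform (the detached mesh & renderer) trails
// the rigidbody by up to one physics step, interpolating between the
// positions recorded at the last two FixedUpdates.
public class InterpolatedVisualSketch : MonoBehaviour
{
    public Rigidbody source;   // the physics body being followed
    Vector3 previousPosition;
    Vector3 currentPosition;

    void Start()
    {
        previousPosition = currentPosition = source.position;
    }

    void FixedUpdate()
    {
        // Record the body's position at each physics step.
        previousPosition = currentPosition;
        currentPosition = source.position;
    }

    void Update()
    {
        // Fraction of the current physics step that has elapsed this frame.
        float t = Mathf.Clamp01((Time.time - Time.fixedTime) / Time.fixedDeltaTime);
        transform.position = Vector3.Lerp(previousPosition, currentPosition, t);
    }
}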

To address issue #3 I instantly snap the rigidbody to the data received, but again I detach the mesh & renderer from the rigidbody and LERP it (like a 1st-order filter). This adds noticeable lag/delay but I think it is acceptable. I send commanded position/angle/velocity (in addition to current position/velocity) across the network and perform physics locally in between network updates, which seems to help a lot.

To address issue #2 I’m at a loss (hence this post). With a Time.deltaTime varying from 2ms to 31ms, using transform.position += velocity * Time.deltaTime generates jitter. So if I am moving an object manually every Update() loop, then what deltaTime do I multiply by? That’s my hang-up.

Thanks in advance.

… In that case the code isn’t relevant. I was referring to a damped camera.

Could that be part of the problem? You should attach the camera to the GameObject’s transform, not the rigidbody. The rigidbody will move at physics steps, whereas the transform will be interpolated.

I don’t think this statement is true: many games use a FixedUpdate of 1/50. Unity should interpolate the transform for smooth gameplay. Try simplifying the case (e.g. a box, moved by rigidbody gravity) if you aren’t convinced Unity is ‘smooth’ out of the box.

Have you tried using the Profiler? Maybe you have significant garbage collection going on on the longer frames?

Makes sense. I’m not entirely sure how to combat this, but there might be some generic networking wisdom you could apply here, or maybe even some UNET tutorials? Sorry not to be of more help here.

Again, I think you might be reinventing the wheel here, because I think this is what Unity does by default.
If you try making a simple example with a cube acting due to physics gravity you should be able to convince yourself of this.

Yep this all sounds like logical netplay stuff.

You should always use Time.deltaTime, but the real question is why is it fluctuating? I suspect occasional big garbage collections. Try making a simple test in an empty scene, I think you should find it is more consistent.
If it does turn out to be Garbage Collection, try looking at https://unity3d.com/learn/tutorials/topics/performance-optimization/optimizing-garbage-collection-unity-games


Ah, it looks like you are right. If you turn on Rigidbody interpolation it interpolates the 1/50 FixedUpdate motion to the smooth 1/60 Update rate and removes those jitters. I did reinvent the wheel…
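For anyone reading later, that setting can be enabled in the inspector (Interpolate) or from code, e.g.:

using UnityEngine;

// Enables Unity's built-in rigidbody interpolation so the rendered
// transform is smoothed between 50Hz physics steps.
public class EnableInterpolation : MonoBehaviour
{
    void Awake()
    {
        GetComponent<Rigidbody>().interpolation = RigidbodyInterpolation.Interpolate;
    }
}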

Regarding Time.deltaTime jitters, try this script in a blank new project and see what min/max Time.deltaTime values it records. For me most timesteps are close to 16.67ms, but periodically (let it run a few minutes) I get outliers (the worst I’ve seen are 5ms and 34ms) which show up visually as intermittent jitter.

using UnityEngine;

public class MeasureDeltaTime : MonoBehaviour
{
    private float deltaTimeRunningAve;
    private float filterRatio = 0.01f; //1st order filter
    private float deltaTimeMin;
    private float deltaTimeMax;
    private void Start()
    {
        Application.runInBackground = true;
        Invoke("ResetMinMax", 1f); //clear any startup initial outliers
    }
    private void Update()
    {
        //Debug.Log(transform.position.ToString("0.000"));
        float deltaTimeMS = Time.deltaTime * 1000f;
        deltaTimeRunningAve = (1f- filterRatio) * deltaTimeRunningAve + filterRatio * deltaTimeMS;
        deltaTimeMin = Mathf.Min(deltaTimeMin, deltaTimeMS);
        deltaTimeMax = Mathf.Max(deltaTimeMax, deltaTimeMS);
    }
    private void OnGUI()
    {
        GUI.Label(new Rect(10, 40, 200, 30), "DeltaTime: " + (Time.deltaTime * 1000f).ToString("0.00") + "ms");
        GUI.Label(new Rect(10, 70, 200, 30), "DeltaTimeAve: " + deltaTimeRunningAve.ToString("0.00") + "ms");
        GUI.Label(new Rect(10, 100, 200, 30), "DeltaTimeMin: " + deltaTimeMin.ToString("0.00") + "ms");
        GUI.Label(new Rect(10, 130, 200, 30), "DeltaTimeMax: " + deltaTimeMax.ToString("0.00") + "ms");
        if (GUI.Button(new Rect(10, 160, 200, 30), "Reset Min Max DeltaTime"))
        {
            ResetMinMax();
        }
    }
    private void ResetMinMax()
    {
        deltaTimeRunningAve = Time.deltaTime*1000f;
        deltaTimeMin = 555.555f;
        deltaTimeMax = -555.555f;
    }
}

Quick update, since I haven’t quite finished my investigations yet.

I’ve been writing a program to try and visualise these small ‘deltaTime’ variations, as well as the bigger spikes, to see if I can determine the cause.

The smaller variations (between 16.5ms and 17.0ms) are actually only about 2%, so they won’t make a huge amount of difference. To confirm this I did a small test: one white bar moving by Time.deltaTime and one using a fixed 1/60 interval. They both moved about as smoothly as each other, so I don’t think this is a huge concern.
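The comparison was roughly along these lines (a simplified sketch of the idea, not the actual test program):

using UnityEngine;

// Sketch of the side-by-side test: one object steps by the measured
// Time.deltaTime, the other by an idealised fixed 1/60s interval, both at
// the same speed, so any visible difference comes from the delta variance.
public class DeltaVsFixedSketch : MonoBehaviour
{
    public Transform deltaDriven;   // illustrative references to the two bars
    public Transform fixedDriven;
    public float speed = 5f;

    void Update()
    {
        deltaDriven.position += Vector3.right * speed * Time.deltaTime;
        fixedDriven.position += Vector3.right * speed * (1f / 60f);
    }
}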

The spikes though are a bit strange, I haven’t quite got to the bottom of them yet. I’m not sure if it’s GC, my laptop throttling power or something else. More news when I have it.

Are you getting spikes every now and then too? What’s your lowest/highest?

The big spikes are pretty rare for me. I get maybe 1 big spike every minute (by big I mean >4ms from the 16.67ms average) otherwise everything is close to 16.67ms. This leads to a little jitter every minute or so… not the end of the world, but would be nice to fix if you find out a way.

Thanks for looking into this!

I investigated this some time ago and came to the conclusion that a 2D square will always jitter, even if you move it without any delta (like 10 pixels to the right every frame with vsync on). It jitters because of the way screens are refreshed: from top to bottom, line by line, left to right. So the whole square does not change position at the same time; it moves line by line. Sure, the refresh is really fast, but our eyes/brains can notice this as a lack of smoothness/jitter/tearing, especially when you are paying attention to it. And it’s much easier to notice when you look at a 2D square on a solid background. If you try to notice this in a complicated 3D environment, it will not be so noticeable (if at all).

The jitter is less noticeable when you move the square from top to bottom, again because of the way the monitor refreshes. When you move the square from left to right and look at it in very slow motion, you can notice that at some point half of your square has already moved to the right while the other half is still waiting for the refresh to finish. If you move it from top to bottom and the square is a single color, the jitter is less noticeable because the same color is refreshed over the same color, and there is never a situation where half of your square has already moved down and half has not (remember, the refresh goes left to right, line by line). The jitter will only be noticeable at the top and bottom edges of the square.

Anyway, let’s get back to the delta. Sure, 2% is not a lot, but you have to remember that most of the time you will move things with some speed, and if it is high enough, the variation scales up. Our brain is really good at noticing errors in smoothness.

If your object moves at a speed of 500 pixels per second and the delta varies from 15.7ms to 17.5ms (as in the samples from one of the posts above), then your object will move:
0.0157 * 500 = 7.85 pixels per frame
0.0175 * 500 = 8.75 pixels per frame

(0.0175 - 0.0166) / 0.0166 * 100 = 5.42%

The fact that the square can only be moved by whole pixels makes things even more complicated. One pixel of difference due to the delta variation is
1 / 7 * 100 = 14.29%

I think something like that may be noticeable, even if only sometimes. We should try to avoid this if possible. On top of that, I just want to know what is going on under the hood. We are using deltaTime all the time, so it’s worth knowing what is going on with it and why. I hope Dave will provide us with some answers. Thank you for your help!

Just turn on vsync, and you prevent anything like that from being able to happen. (Since any changes to the screen happen between refreshes in that case.) Also, monitors don’t really work like that any more; you’re thinking of old CRTs.

–Eric

As you quoted, I said in my post that I have been testing with vsync ON. I am not talking about the famous vsync-off tearing effect, but about the subtle tearing visible even with vsync on. And the cause of the tearing I am talking about is the refresh process.

If the LCD does not refresh the screen line by line, as you can see in this video:
https://www.youtube.com/watch?v=nCHgmCxGEzY

then please tell me how the LCD does refresh.
Also, explain my observation of jitter with vsync on when moving the square 10 pixels per frame.

There’s no tearing effect with vsync on, and I don’t get any jitter either. Different LCDs refresh in different ways; for example, some only refresh the part of the screen that changed.

–Eric

I’ve never seen this myself, but even if so, this isn’t going to be specifically a Unity problem, is it? There’s nothing an engine could potentially do about it.

I think there must be something wrong with the maths there, because 500 pixels per second at 60Hz is 8.333 pixels per frame. The variation of 2% would be 0.1667 pixels.

The test program I wrote was designed to check whether this error of +/-2% would be noticeable in a game. At the moment it doesn’t look like it will be. Not that there isn’t technically a problem here, but I don’t think it’s too noticeable. As I say, I’m more worried about the occasional larger time glitches at this point. I’ve found that for some reason Unity takes longer in the WaitForTargetFPS call than the 16.6667ms it should do. I haven’t quite figured out why yet. It could be something to do with power saving on my laptop, so I will give it a try on another machine later.

I suspect the actual delta time variance increases quite a bit depending on how many threaded features you use in Unity; that is, there should be more spikes. It would be absurd if there weren’t, even if the workload is reasonably constant.

In the old days, one would rightfully expect there to be minimal difference. We have much longer pipelines in engines now, and rendering probably takes place much later, after a heck of a lot of noise and multi-threaded code…

I guess triple buffering was one solution nobody uses any more. It’s kind of a pain though, as I’m noticing console games at 30fps tend to be a fair bit smoother than Unity games at 30fps. I wonder what strategies exist beyond motion blur to smooth this out?

All testing was done with a blank new project and a blank scene (with just a cube). No scripts (other than the one that measures Time.deltaTime). I’m using Windows 10, which may have unknown background processes.

The computer is pretty new: i7 @ 4GHz, 16GB RAM, GeForce 1070, SSD only (no hard disk drive).

I do notice that if I let Unity “run in background” and then minimize/expand applications this does lead to spikes. But all the spikes reported above occur without losing Unity application focus.

I agree. The small variations are not noticeable from what I’ve seen. Only the large spikes.