How to use vSyncCount to get a desired frame rate?

QualitySettings.vSyncCount sets how many vertical syncs correspond to one rendered Unity frame. That's great. So how do I use this to get an update rate close to 60 fps, given the wide variation in monitor refresh rates?

I thought I could use Screen.mainWindowDisplayInfo.refreshRate.value to get the correct refresh rate (i.e., what fps I would get with vSyncCount == 1). But no; in actual testing, I have at least one user (on Windows) where this value correctly displays 75 for one monitor and 165 for the other, as he moves the Unity window from one screen to the other... yet the fps I get from vSyncCount = 1 is always 54. I never see 165 frames per second if I use vSyncCount at all. And if I set vSyncCount according to the reported refresh rate (165), then I set it much too high and my resulting fps is much too low.
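For reference, here's roughly how I'm reading that value (a minimal sketch, using the Unity 2021.2+ DisplayInfo API):

```csharp
using UnityEngine;

public class RefreshRateLogger : MonoBehaviour
{
    void Update()
    {
        // DisplayInfo for whichever display the main window currently occupies.
        DisplayInfo info = Screen.mainWindowDisplayInfo;
        // refreshRate is a ratio (numerator/denominator); .value converts
        // it to a double, e.g. 75.0 or 165.0 in my user's case.
        Debug.Log($"Reported refresh rate: {info.refreshRate.value} Hz");
    }
}
```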

So that leaves just trying to set it "experimentally" I guess — measuring the actual Time.unscaledDeltaTime over some frames, perhaps when vSyncCount = 1, and going from there. But I tried that, and it's surprisingly tricky to get right; I have to take care not to measure right before or after changing the vSyncCount, or my estimate will be wildly off.
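Here's a sketch of the kind of measurement I mean (names are mine; the key point is skipping a few frames after changing vSyncCount before sampling, or the numbers are garbage):

```csharp
using System.Collections;
using UnityEngine;

public class VSyncProbe : MonoBehaviour
{
    IEnumerator Start()
    {
        QualitySettings.vSyncCount = 1;
        // Skip a handful of frames so the swap chain settles after the
        // vSyncCount change; sampling immediately gives wildly-off values.
        for (int i = 0; i < 10; i++) yield return null;

        // Average unscaled frame time over a measurement window.
        const int samples = 60;
        float total = 0f;
        for (int i = 0; i < samples; i++)
        {
            yield return null;
            total += Time.unscaledDeltaTime;
        }
        float fps = samples / total;
        Debug.Log($"Measured ~{fps:F1} fps with vSyncCount = 1");
    }
}
```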

All this feels like working way harder than Unity usually requires us to work. Is there some simple way to discover, without measuring, the frame rate that vSyncCount=1 will currently give me, for wherever the Unity window happens to be?

And why the heck would I be getting 54 fps when vSync=1, on a system where the monitors refresh at 75 and 165 Hz?

You're correct that Screen.mainWindowDisplayInfo.refreshRate should give you the expected maximum frame rate when vSyncCount is set to 1. However, it doesn't always work that way:

  1. If the machine isn't powerful enough to reach that frame rate, the actual frame rate will be whatever the machine can sustain. I've also seen this happen when the monitor is plugged into the wrong GPU (the one on the motherboard) on a desktop machine;
  2. If there are any software frame rate limiters on the system, they too can reduce actual frame rate (that includes any power management features on devices like laptops);
  3. If VSync is forced disabled in the driver, setting vSyncCount to 1 will make the engine think VSync is enabled but it will not be tied to the refresh rate of the display and may run faster than the maximum refresh rate.
  4. Variable refresh rate displays can result in uneven frames and that is actually how they're designed to work (and look best that way).

It's probably either 1 or 2 from the list above.

There is really no good way to do this; some machines will just not be able to pull it off. Yes, you could try manipulating it with different vSyncCount values, but that comes at a cost (increased input latency for values higher than 1). And since vSyncCount is generally limited to the range 1-4, a monitor whose refresh rate falls outside those boundaries (say, a 30 Hz or a 360 Hz monitor) would force you to disable vSync to hit your target frame rate, which results in tearing and, on a lower-refresh-rate monitor, means not every frame is visible on the display. Ideally, you should expose vSyncCount as a setting in your game so the player can choose what's best for their situation (perhaps they want a lower frame rate to preserve battery, or perhaps they are very sensitive to motion sickness and need 120 fps to be able to play).
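A minimal sketch of exposing it as a setting (the dropdown hookup and PlayerPrefs key are just illustrative assumptions):

```csharp
using UnityEngine;

public class VSyncSetting : MonoBehaviour
{
    // Called from a settings-menu control: 0 = off, 1-4 = sync interval.
    public void OnVSyncChanged(int value)
    {
        QualitySettings.vSyncCount = Mathf.Clamp(value, 0, 4);
        PlayerPrefs.SetInt("vSyncCount", QualitySettings.vSyncCount);
    }

    void Awake()
    {
        // Restore the player's saved choice, defaulting to vsync on.
        QualitySettings.vSyncCount = PlayerPrefs.GetInt("vSyncCount", 1);
    }
}
```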

I would definitely not try doing that. There are a lot of things that can go wrong: maybe you're measuring during a temporary period when the machine is busier than usual, causing the frame rate to drop. Maybe the player is still tweaking their settings during your measurement (such as which display the game is on), so your measurement will soon become stale. Getting it wrong might mean part of your player base is unable to enjoy the game at all.

Let me ask you this: what prompted you to try to achieve this?


I want to add another plausible reason to the list:

  5. It's the way the frame rate is measured. Typically an average over multiple frames is taken and displayed; otherwise the fps number would fluctuate too fast for our eyes to read.

If all 10 samples in the averaging window come in at 60 Hz, the counter displays 60 fps. But if 8 samples read 60 and 2 read 30 (a missed vsync), the displayed value is (8*60 + 2*30)/10 = 54 fps for that particular window, despite vsync being enabled.
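A sketch of the kind of counter I'm describing, one that averages instantaneous 1/deltaTime readings over a small window (names and window size are arbitrary):

```csharp
using UnityEngine;

public class FpsCounter : MonoBehaviour
{
    const int WindowSize = 10;
    readonly float[] samples = new float[WindowSize];
    int index;

    void Update()
    {
        // Record the instantaneous rate; one missed vsync halves it
        // (e.g. 30 instead of 60 on a 60 Hz display).
        samples[index] = 1f / Time.unscaledDeltaTime;
        index = (index + 1) % WindowSize;

        float sum = 0f;
        foreach (float s in samples) sum += s;
        // Eight 60s and two 30s average to (8*60 + 2*30)/10 = 54.
        float displayed = sum / WindowSize;
    }
}
```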

OK, thanks guys. I think I'm probably going to give up on vsyncing, and just set targetFrameRate to 60.
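In case it helps anyone else, this is all I mean (note that targetFrameRate is ignored on desktop platforms while vSyncCount is greater than 0, so vsync has to be off first):

```csharp
using UnityEngine;

public class FrameRateCap : MonoBehaviour
{
    void Awake()
    {
        // targetFrameRate has no effect while vsync is on, so disable it.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 60;
    }
}
```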

Since you wondered: this is for Mini Micro, which is a programming environment aimed in part at beginners. I don't want them to have to deal with a variable frame rate; I want them to be able to write update loops that look like player.x += 2 and have it work very close to exactly the same on anybody's machine.

Mini Micro already includes some action games (Platform Demo, Flappy Bat, Mochi Bounce, etc.), and we've also made some additional animation programs specifically to try to cause visible tearing. So we're going to see how much a fixed 60fps update loop causes a problem in practice. But if it's acceptable, then we'll go with it, because trying to use vsync and still get close to 60 fps just looks too risky.

targetFrameRate is a suggestion at best too. There are several scenarios where it will get ignored.

Why don't you implement some kind of interpolation mechanism instead, rather than passing the value directly into the engine/transform component? That's how Rigidbodies are able to work at any frame rate despite updating at a fixed time step.
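A rough sketch of what I mean, with the simulation stepped at a fixed 60 Hz and rendering blending between the two most recent states (the mover and velocity are made up for illustration):

```csharp
using UnityEngine;

public class InterpolatedMover : MonoBehaviour
{
    // Game logic advances in fixed 60 Hz steps; rendering interpolates
    // between the two most recent simulated positions.
    const float Step = 1f / 60f;
    float accumulator;
    Vector3 previous, current;
    public Vector3 velocity = new Vector3(2f, 0f, 0f);

    void Start() { previous = current = transform.position; }

    void Update()
    {
        accumulator += Time.unscaledDeltaTime;
        while (accumulator >= Step)
        {
            previous = current;
            current += velocity * Step;   // the "player.x += 2" style step
            accumulator -= Step;
        }
        // Blend by how far we are into the next fixed step.
        float t = accumulator / Step;
        transform.position = Vector3.Lerp(previous, current, t);
    }
}
```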
