Actual refresh rate?

My Application.targetFrameRate is -1 and QualitySettings.vSyncCount is 0. But Screen.currentResolution.refreshRate is 60. Does that mean my game will run at 60 fps, or unlimited?

As someone who never specifically touched these settings in Unity, generally:
Your target frame rate is the (limiter for the) frame rate of your game, in frames per second.
Your screen refresh rate is how often your monitor can display a new image, in Hz.
Vsync is a technique that limits your frame rate on the software side to the refresh rate of your screen.

So with the given settings your game might run at 300 FPS, but your monitor can only display 60 of those frames.
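For reference, the settings from the question look roughly like this in code (a minimal sketch; the component name is made up):

```csharp
using UnityEngine;

public class FrameRateSettings : MonoBehaviour
{
    void Start()
    {
        // 0 = don't sync presentation to the monitor's refresh rate
        QualitySettings.vSyncCount = 0;

        // -1 = platform default, i.e. no explicit FPS cap on desktop
        Application.targetFrameRate = -1;

        // Read-only: the refresh rate of the current display mode (e.g. 60 Hz)
        Debug.Log(Screen.currentResolution.refreshRate + " Hz");
    }
}
```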


Does Screen.currentResolution.refreshRate only report the monitor's refresh rate? Ok, now I think I shouldn't change the refresh rate through Screen.SetResolution.

You can't. It's read-only. See the documentation for reference:

The only way for you to change the resolution and/or refresh rate is through the SetResolution function. But even then you can't freely edit it; you are basically limited to the supported resolutions and refresh rates of the monitor, which can be read through the Screen.resolutions property. You can read up on how SetResolution handles being passed different values in the documentation as well, if you are interested.
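A minimal sketch of that mechanism (listing the supported modes via Screen.resolutions, then applying one via SetResolution; the class name is made up):

```csharp
using UnityEngine;

public class ResolutionSwitcher : MonoBehaviour
{
    void Start()
    {
        // Every resolution + refresh rate combination the monitor supports
        foreach (Resolution res in Screen.resolutions)
            Debug.Log(res.width + "x" + res.height + " @ " + res.refreshRate + " Hz");
    }

    // Apply one of the supported modes (e.g. one picked by the player)
    public void Apply(Resolution res)
    {
        Screen.SetResolution(res.width, res.height,
                             FullScreenMode.FullScreenWindow, res.refreshRate);
    }
}
```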

Generally speaking you are right though, you probably don't want to (programmatically) edit that value. The above mechanisms serve little more purpose than to enable you to display a list of available options to the player, who can then select a resolution + refresh rate they want to use.
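For example, a settings menu could surface those options like this (a sketch assuming a UnityEngine.UI.Dropdown wired up in the inspector; the field and class names are made up):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.UI;

public class ResolutionMenu : MonoBehaviour
{
    // Hypothetical: a Dropdown assigned in the inspector
    public Dropdown resolutionDropdown;

    void Start()
    {
        // Build one label per supported display mode
        var labels = new List<string>();
        foreach (Resolution res in Screen.resolutions)
            labels.Add($"{res.width} x {res.height} @ {res.refreshRate} Hz");

        resolutionDropdown.ClearOptions();
        resolutionDropdown.AddOptions(labels);
        resolutionDropdown.onValueChanged.AddListener(OnSelected);
    }

    void OnSelected(int index)
    {
        // The dropdown indices map 1:1 onto Screen.resolutions
        Resolution res = Screen.resolutions[index];
        Screen.SetResolution(res.width, res.height,
                             Screen.fullScreenMode, res.refreshRate);
    }
}
```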

Is there any specific reason why you asked this question, or did you just need a clarification on the terms involved? :slight_smile:

The actual refresh rate of your game depends on a couple of factors. First and foremost, how heavy your game's per-frame load is. If a frame takes 100 ms to complete you will get at most 10 fps; you can't get it any faster in that case. In addition, there are other things that can limit the refresh rate. If v-sync is enabled, the actual update of the screen is synced with the monitor refresh rate. That does not mean it always renders at the update frequency of the monitor; it just means that before the current frame is shown, the system waits for the "v-blank" of the monitor. So when the monitor is about to render the next frame, Unity swaps the front and back buffers so the new frame can be displayed.

The point of v-sync is / was to avoid screen tearing, so the image is not updated in the middle of an update cycle of the monitor. When v-sync is on you will never get a frame rate higher than the monitor update rate. In the optimal case you get one frame per monitor update. Though if your game load is too high, for example if a frame took 33 ms on average to complete, you would only get a frame rate of 30 fps; a new image would be shown only on every second v-sync.
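The arithmetic behind those numbers can be sketched in plain C# (no Unity needed): with v-sync on, a frame is only presented at a v-blank, so it effectively occupies a whole number of refresh intervals:

```csharp
using System;

class VsyncMath
{
    // Effective FPS with v-sync: the frame waits for the next v-blank,
    // so it spans a whole number of monitor refresh intervals.
    static double EffectiveFps(double frameMs, double monitorHz)
    {
        double intervalMs = 1000.0 / monitorHz;            // ~16.67 ms at 60 Hz
        double intervalsSpanned = Math.Ceiling(frameMs / intervalMs);
        return monitorHz / intervalsSpanned;
    }

    static void Main()
    {
        Console.WriteLine(EffectiveFps(10, 60));   // fits in one interval -> 60 fps
        Console.WriteLine(EffectiveFps(33, 60));   // needs two intervals  -> 30 fps
        Console.WriteLine(EffectiveFps(100, 60));  // needs six intervals  -> 10 fps
    }
}
```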
