Got a new computer and an additional monitor, now Screen.resolutions returns the same refreshRate for all resolutions

Hey there, I got a new computer 2 weeks ago, and I just noticed something annoying about Screen.resolutions.

Before, I had 2 monitors, and while I can’t be sure I ever tested on both, at least on the one I worked on, Screen.resolutions listed the same resolution (width and height) multiple times, once per refresh rate my monitor supported (like 1650 x 1080 60 Hz, 1650 x 1080 75 Hz, 1650 x 1080 144 Hz, and so on).

Now, I have a 3rd monitor and a completely new computer, and I only get one instance of each resolution (width and height), and they all carry the refresh rate that the “main monitor” is set to (170 Hz in my case).

How can I fix this?

On top of meaning the player can’t choose the refresh rate they want, it also means that looking up the index of Screen.currentResolution in the Screen.resolutions array returns -1 whenever that monitor’s refresh rate differs from the main one’s (in my case, 1650 x 1080 144 Hz has no match, because the only 1650 x 1080 entry in the array is 170 Hz).
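In the meantime, the best workaround I can think of for the index lookup is to match on width and height only and, among the matches, take the entry with the closest refresh rate. A quick sketch (FindClosestResolutionIndex is just a helper name I made up):

int FindClosestResolutionIndex(Resolution current, Resolution[] available) {
  // Match width/height exactly; among matches, prefer the refresh
  // rate closest to the current one. Returns -1 only if the size
  // itself is missing from the array.
  int bestIndex = -1;
  int bestDelta = int.MaxValue;
  for (int i = 0; i < available.Length; i++) {
    if (available[i].width != current.width || available[i].height != current.height)
      continue;
    int delta = Mathf.Abs(available[i].refreshRate - current.refreshRate);
    if (delta < bestDelta) {
      bestDelta = delta;
      bestIndex = i;
    }
  }
  return bestIndex;
}

That way the settings menu at least lands on the right width/height instead of falling back to -1, even while the refresh rates are wrong.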

Thanks for the help.

Just to make things clearer, I’ve added some screenshots:

  • These are the main monitor’s settings (you can see multiple refresh rates, with 170 Hz selected):

  • This is the monitor I launch play mode on (the one the editor is on):

This is the code to show all the resolutions:

Resolution[] resolutions = Screen.resolutions;

foreach (Resolution resolution in resolutions) {
  print("width: " + resolution.width + " / height: " + resolution.height + " / rr: " + resolution.refreshRate);
}

And finally, the prints:

As you can see, the array contains all the width/height combinations of the monitor I’m playing on (so, NOT the main monitor), but the refresh rate (rr) is always 170 Hz (the one set on the main monitor).
The red circle shows the current resolution of the screen I’m playing on (Screen.currentResolution.ToString()), which is indeed correct.
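To make the symptom even easier to see, the same loop can group the entries by width × height and list every refresh rate reported for each size (this needs using System.Linq at the top). Given the prints above, every group should come out with that single 170 Hz value:

var groups = Screen.resolutions.GroupBy(r => (r.width, r.height));

foreach (var group in groups) {
  string rates = string.Join(", ", group.Select(r => r.refreshRate + "Hz"));
  print(group.Key.width + " x " + group.Key.height + ": " + rates);
}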

Why does that happen? Why does it always use the refresh rate selected on the main monitor? Is this a bug in Unity? Why would it work on my previous computer and not on this one? Are there settings I need to change somewhere in the editor?

I also noticed that, in a finished game made with Unity, the resolution setting in the options menu does list all the refresh rates available for the monitor I play on. Though the Unity version used there isn’t the same as mine (2017 something, against 2021.3.20f1 for me).

Please help, I couldn’t find anything on the matter anywhere :confused:

PS: Of course, if I set the other monitor (the one with a max of 144 Hz) as the main monitor, then all the prints show rr: 144Hz. So clearly, something must be using the main monitor’s selected refresh rate by default (and for some reason, only that one value, not the other supported rates).
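Until I find the real cause, my fallback idea for the settings menu is to stop trusting the reported refresh rates entirely: deduplicate the array by width/height, only offer sizes to the player, and apply them without forcing a rate, so the display keeps whatever it is currently running at. A sketch (again with using System.Linq):

Resolution[] GetDistinctSizes() {
  // Ignore the (suspect) refresh rates and keep one entry per size.
  return Screen.resolutions
    .GroupBy(r => (r.width, r.height))
    .Select(g => g.First())
    .ToArray();
}

void Apply(Resolution chosen) {
  // Omitting the refresh rate argument lets the display keep its current rate.
  Screen.SetResolution(chosen.width, chosen.height, Screen.fullScreenMode);
}

It’s obviously not a fix, just a way to keep the menu usable while the refresh rates are wrong.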