My game is set up to use Vsync “Every V Blank”. When the game is running normally this works correctly and the framerate sits right around 60 fps. However, when I minimize the game, its framerate shoots way up and my GPU usage jumps to 100%.
Application.targetFrameRate is ignored (as documented) while vsync is active, so that’s no help here.
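(To spell out what I mean, and assuming I’m reading the docs right, targetFrameRate only takes effect once vsync is off, i.e. something like:

QualitySettings.vSyncCount = 0;   // 0 = don't wait for vblank
Application.targetFrameRate = 30; // only honored now that vsync is off

but with vSyncCount at 1 for “Every V Blank”, the cap is simply ignored.)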
For reference, I’m running Unity 5.2.2, but I see no relevant items in the changelogs since then.
Has anyone else noticed this? Are there any good solutions/workarounds? And/or is Unity aware and working on a fix? It’s really not OK for my minimized application to be using our players’ entire GPU, but right now I’m not seeing any way to solve this.
Thanks in advance for any suggestions!
You can always force a slower framerate manually: just call Thread.Sleep() with a small duration whenever Time.deltaTime drops below a certain threshold.
void Update()
{
    // If the last frame was faster than the 70 fps threshold,
    // sleep off the time remaining until the 60 fps target.
    if (Time.deltaTime < 1f / 70f)
    {
        int ms = Mathf.FloorToInt((1f / 60f - Time.deltaTime) * 1000f);
        if (ms > 0)
            System.Threading.Thread.Sleep(ms);
    }
}
Thread.Sleep will reduce your CPU load, but it is not very accurate: how long it actually sleeps depends on the OS scheduler and the current CPU load. I just calculate the remaining milliseconds to the target framerate of 60 and round that number down, since in most cases Thread.Sleep will take a bit longer than the requested time.
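If you want to see that inaccuracy for yourself, a quick throwaway check like this (the 10 ms value is arbitrary) logs the difference between the requested and the actual sleep time:

void Start()
{
    // Request a 10 ms sleep and measure how long it really took.
    var sw = System.Diagnostics.Stopwatch.StartNew();
    System.Threading.Thread.Sleep(10);
    sw.Stop();
    // Usually prints somewhat more than 10 ms; the overshoot depends
    // on the OS timer resolution and the current scheduler load.
    Debug.Log("Requested 10 ms, slept " + sw.Elapsed.TotalMilliseconds + " ms");
}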
Sleeping in Update like this can make your frame rate jerky, so make sure you set the threshold “low enough”. In the example I used a threshold of 70 fps.
ps: The reason for this problem is simple: since the graphics context is no longer visible, it isn’t redrawn at all, so there is no vsync to wait for.
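An alternative along the same lines is to turn vsync off and let Application.targetFrameRate take over while the window doesn’t have focus, then restore vsync when it returns. A rough sketch (the 10 fps background cap is an arbitrary choice, and I haven’t verified that OnApplicationFocus fires on minimize on every platform):

int savedVSyncCount;

void OnApplicationFocus(bool hasFocus)
{
    if (hasFocus)
    {
        // Window visible again: restore the original vsync setting.
        QualitySettings.vSyncCount = savedVSyncCount;
        Application.targetFrameRate = -1; // -1 = platform default
    }
    else
    {
        // Window minimized / in background: disable vsync so that
        // targetFrameRate is honored, then cap the framerate low.
        savedVSyncCount = QualitySettings.vSyncCount;
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 10;
    }
}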