I am working on a script to control targetFrameRate in one of my games. Initially I was trying to solve the problem of dynamically choosing the optimal frame rate across multiple platforms. To do this I have the notion of a controller that eventually settles to the lowest valid frame rate above a pre-determined minimum. The idea is to reduce power consumption without dropping below the minimum the game requires.
Now I find myself asking some questions. Am I re-inventing the wheel? If so, what is the wheel I should use? If not, what is a good approach to control the frame rate? My approach is to set targetFrameRate to a value and move it up or down based on how well the current frame rate is holding the set point. Is this a reasonable approach? Can you think of a better way to do it?
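To make the question concrete, here is roughly what that set-point controller could look like in Unity. This is only a sketch: the class name `FrameRateController`, the 95%/99% bands, the 5 fps step, and the 2-second evaluation window are all illustrative assumptions, not anything from a real library.

```csharp
using UnityEngine;

// Sketch of a set-point controller: lower the target when the achieved
// frame rate can't hold it, raise it again when there is headroom.
// All names and thresholds are illustrative.
public class FrameRateController : MonoBehaviour
{
    public int minFrameRate = 30;    // the pre-determined minimum
    public int maxFrameRate = 60;
    public int step = 5;
    public float evaluateEvery = 2f; // seconds between adjustments

    float accumulated;
    int frames;

    void Start()
    {
        QualitySettings.vSyncCount = 0; // targetFrameRate is ignored while vsync is on
        Application.targetFrameRate = maxFrameRate;
    }

    void Update()
    {
        accumulated += Time.unscaledDeltaTime;
        frames++;
        if (accumulated < evaluateEvery) return;

        float achieved = frames / accumulated; // average fps over the window
        int target = Application.targetFrameRate;

        // Holding less than ~95% of the set point: settle downward.
        if (achieved < target * 0.95f && target - step >= minFrameRate)
            Application.targetFrameRate = target - step;
        // Comfortably holding the set point: probe one step upward.
        else if (achieved > target * 0.99f && target + step <= maxFrameRate)
            Application.targetFrameRate = target + step;

        accumulated = 0f;
        frames = 0;
    }
}
```

Averaging over a window rather than reacting to single frames is what keeps the controller from bouncing on every one-off spike.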
There is none. You can’t really control the framerate. It’s like chasing a carrot.
Unlike with networking, you can't even predict where that carrot will be next frame, because there's no rhyme or reason to it thanks to all those background processes. The better solution is to lock all players to a minimum fps (e.g. 30) and design the game around that.
Or run a performance test and auto-pick the optimal quality level.
You can also try to adjust the quality when the current framerate drops below a threshold, typically for a number of consecutive frames. This is what some games do: switch to a lower quality setting on the fly, for example by simply lowering draw and LOD distances. This is not a guarantee of a consistent fps, however.
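The consecutive-frames check above could be sketched like this — the class name, the 30 fps threshold, and the 30-frame debounce are illustrative assumptions:

```csharp
using UnityEngine;

// Sketch: drop one quality level after N consecutive frames below a
// threshold. Threshold and frame count are illustrative.
public class QualityStepdown : MonoBehaviour
{
    public float minAcceptableFps = 30f;
    public int consecutiveFrames = 30; // how many slow frames before reacting

    int slowFrames;

    void Update()
    {
        bool slow = Time.unscaledDeltaTime > 1f / minAcceptableFps;
        slowFrames = slow ? slowFrames + 1 : 0;

        if (slowFrames >= consecutiveFrames && QualitySettings.GetQualityLevel() > 0)
        {
            // Passing false avoids expensive changes (e.g. anti-aliasing
            // reinitialisation) mid-play.
            QualitySettings.DecreaseLevel(false);
            slowFrames = 0;
        }
    }
}
```

Requiring many consecutive slow frames is the debounce that stops a single GC spike or loading hitch from permanently lowering the quality level.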
Then the game periodically checks whether the quality could be increased again, but here's the tricky part: how are you going to determine that without rendering a frame? Still, some games apparently manage to achieve this.
Possibly they're managing an internal list of things the machine can render sufficiently fast, and proactively rendering anything above that threshold at lower LODs, for instance. So if there are far more enemies visible than usual, more of them will be of lower quality (or perhaps trading off background fidelity).
It's also probable that the quality setting remains lowered for the remainder of the given level or section of play. Any in-game event that briefly removes player control could be used to test whether a higher quality is possible again. But you don't want to test higher quality every other second, because that would annoy the hell out of players.
I wasted a lot of time trying to optimise and automate this. In the end I gave up and just added a setting that lets people drop the framerate if they have issues with the default (which for my simple mobile game is 60 fps).
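For what it's worth, that "just let the player choose" approach only needs a few lines. A minimal sketch — the preference key and the 60 fps default are my assumptions:

```csharp
using UnityEngine;

// Sketch: persist a player-chosen frame rate cap in PlayerPrefs and
// apply it at startup. Key name and default are illustrative.
public static class FrameRateSetting
{
    const string Key = "frameRateCap"; // hypothetical preference key

    // Call once at startup, e.g. from a bootstrap MonoBehaviour's Awake().
    public static void Apply()
    {
        Application.targetFrameRate = PlayerPrefs.GetInt(Key, 60);
    }

    // Call from the options menu when the player picks a new cap.
    public static void Set(int fps)
    {
        PlayerPrefs.SetInt(Key, fps);
        PlayerPrefs.Save();
        Application.targetFrameRate = fps;
    }
}
```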
edit –
By the way, if you're writing for Android, I wasted a bunch more time before I worked out that I had to turn off Project Settings → Player → Android → Resolution and Presentation → Optimized Frame Pacing. I don't remember exactly why, but this setting made my frame rates misbehave badly on devices.
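If you want that change applied consistently (e.g. from a build script rather than a hand-edited checkbox), the same setting is exposed to editor scripts. A sketch, assuming a Unity version where `PlayerSettings.Android.optimizedFramePacing` is available:

```csharp
#if UNITY_EDITOR
using UnityEditor;

// Editor-side sketch: disable Optimized Frame Pacing programmatically
// so the setting can't silently come back in a fresh checkout.
public static class DisableFramePacing
{
    [MenuItem("Build/Disable Optimized Frame Pacing")]
    static void Disable()
    {
        PlayerSettings.Android.optimizedFramePacing = false;
    }
}
#endif
```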
Interesting discussion. It seems there are two ways to affect Time.deltaTime. 1) Fix the amount of work done per frame and adjust Application.targetFrameRate until Time.deltaTime becomes stable. You may find different values of Application.targetFrameRate on different platforms, and it may vary between scenes. This is easy. 2) Fix Application.targetFrameRate and adjust the workload done per frame until Time.deltaTime ends up where you want it. This is advanced.
I was asking about #1, but is #1 even useful? Is it a better user experience to use a lower, more stable Application.targetFrameRate than to oscillate between two values (assuming vsync)? I guess that was my assumption, but now I don't remember why I thought oscillating would be bad in the first place.
Assuming vsync is on, a stable 30 fps will feel smoother than fluctuating between 30 and 60 fps, because the small stutter is comparably more noticeable when you get to see really smooth 60 fps every now and then.
With vsync off this matters much less, unless you hate tearing (I do) and don't use a variable refresh rate (G-Sync) monitor.
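One detail worth spelling out, since it trips people up: in Unity these two mechanisms are mutually exclusive — when `QualitySettings.vSyncCount` is greater than zero (on platforms that honor it), `Application.targetFrameRate` is ignored and the cap comes from the display refresh rate divided by `vSyncCount`. A minimal sketch of the two modes:

```csharp
using UnityEngine;

// Sketch of the two ways to cap frame rate in Unity; pick one, not both.
public static class FrameCap
{
    // Sync to every Nth vblank: 1 -> 60 fps on a 60 Hz display, 2 -> 30 fps.
    // No tearing; targetFrameRate is ignored in this mode.
    public static void CapWithVsync(int everyNthVblank)
    {
        QualitySettings.vSyncCount = everyNthVblank;
    }

    // Free-running cap: exact control over the target, but tearing is possible.
    public static void CapWithoutVsync(int fps)
    {
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = fps;
    }
}
```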
Based on this discussion, here is the frame rate limiter script I made for my game. It is supposed to solve the problem of automatically choosing a stable frame rate for a game across varying platforms, for both fixed and oscillating loads. Try it out. What do you think?