I am looking for information on how to auto-detect an initial Quality Setting for our game.
In the Unity Bootcamp demo there is a GameQualitySettings script that chooses the initial quality setting based on SystemInfo.graphicsShaderLevel. If that is unknown, it falls back to the number of CPUs and the amount of VRAM instead.
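Roughly, that heuristic looks like the sketch below. This is only my reconstruction of the idea; the thresholds, the mapping to quality indices, and the AutoQuality class name are my own assumptions, not the actual Bootcamp values.

```csharp
using UnityEngine;

// Sketch of a hardware-based heuristic: pick a quality level from the
// shader model, and fall back to CPU count and VRAM if it is unknown.
// All cut-off values below are illustrative guesses.
public class AutoQuality : MonoBehaviour
{
    void Awake()
    {
        int level;
        int shaderLevel = SystemInfo.graphicsShaderLevel; // e.g. 30 = SM3.0, 50 = SM5.0

        if (shaderLevel > 0)
        {
            // Map the shader model to a quality index (0 = lowest).
            if (shaderLevel >= 50)      level = 5;
            else if (shaderLevel >= 40) level = 4;
            else if (shaderLevel >= 30) level = 3;
            else                        level = 1;
        }
        else
        {
            // Shader level unknown: fall back to CPU count and VRAM.
            int cpus = SystemInfo.processorCount;
            int vram = SystemInfo.graphicsMemorySize; // in MB
            level = (cpus >= 4 && vram >= 1024) ? 4 :
                    (cpus >= 2 && vram >= 512)  ? 2 : 1;
        }

        QualitySettings.SetQualityLevel(
            Mathf.Clamp(level, 0, QualitySettings.names.Length - 1), true);
    }
}
```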
Another way I thought about doing it is to measure the FPS on the loading screen and base the initial quality setting on that.
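A minimal version of that loading-screen probe might look like this; the sample window, the FPS cut-offs, and the LoadingScreenFpsProbe name are placeholders I made up for illustration.

```csharp
using System.Collections;
using UnityEngine;

// Sample the frame rate for a few seconds while the loading screen is up,
// then pick a starting quality level from the average FPS.
public class LoadingScreenFpsProbe : MonoBehaviour
{
    public float sampleSeconds = 3f;

    IEnumerator Start()
    {
        int frames = 0;
        float elapsed = 0f;

        while (elapsed < sampleSeconds)
        {
            elapsed += Time.unscaledDeltaTime;
            frames++;
            yield return null; // wait one frame
        }

        float fps = frames / elapsed;

        // Rough mapping from measured FPS to a quality index.
        int level = fps > 120f ? 5 : fps > 60f ? 3 : 1;
        QualitySettings.SetQualityLevel(level, true);
    }
}
```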
I am not sure whether either of these approaches, or a combination of them, would be sufficient to choose an appropriate quality setting.
That would work fine in the game itself, dynamically cutting back eye candy when the FPS drops. However, a loading screen is usually quite bare and wouldn't give you a realistic indication of in-game performance.
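For the in-game version, I am picturing something along these lines; the FPS band, the check interval, and the DynamicQuality name are placeholder assumptions rather than anything from a shipped script.

```csharp
using UnityEngine;

// Watch a smoothed FPS value while playing and step the quality level
// down (or back up) when it stays outside a target band.
public class DynamicQuality : MonoBehaviour
{
    public float lowFps = 25f;
    public float highFps = 70f;
    public float checkInterval = 5f;

    float smoothedDelta;
    float timer;

    void Update()
    {
        // Exponentially smooth the frame time to ignore single-frame spikes.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.05f);
        timer += Time.unscaledDeltaTime;

        if (timer < checkInterval || smoothedDelta <= 0f)
            return;

        timer = 0f;
        float fps = 1f / smoothedDelta;

        if (fps < lowFps)
            QualitySettings.DecreaseLevel(true);   // cut back eye candy
        else if (fps > highFps)
            QualitySettings.IncreaseLevel(true);   // room to spare, step back up
    }
}
```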
I remember seeing a similar script based on FPS. I can't remember where though; perhaps the 3rd person platform tutorial?
I agree. For this I want a screen that increases the poly count over a period of time, so that I can determine how much the computer can handle by watching the FPS.
To do this, I am thinking of spawning spheres on successive Update calls, which would raise my tri and vert counts and gradually bring down the FPS. By looking at the measured FPS along with some hardware specs, I could then determine an appropriate initial quality setting.
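As a rough sketch, something like the following is what I have in mind; the frame-time budget, the batch size, and the mapping from sphere count to quality level in SphereStressTest are just placeholder assumptions.

```csharp
using UnityEngine;

// Keep spawning spheres each frame to push triangle/vertex counts up,
// and stop as soon as the frame time crosses a budget. The number of
// spheres reached is then mapped to a quality level.
public class SphereStressTest : MonoBehaviour
{
    public float frameBudgetMs = 33f; // roughly 30 FPS
    public int spheresPerFrame = 20;

    int spawned;
    bool finished;

    void Update()
    {
        if (finished)
            return;

        // Skip the first few frames, which often spike after a scene load.
        if (Time.frameCount < 10)
            return;

        // Stop once a frame takes longer than the budget.
        if (Time.unscaledDeltaTime * 1000f > frameBudgetMs && spawned > 0)
        {
            finished = true;

            // Rough mapping from how many spheres we managed to a quality index.
            int level = spawned > 4000 ? 5 : spawned > 2000 ? 3 : 1;
            QualitySettings.SetQualityLevel(level, true);
            return;
        }

        // Spawn another batch of spheres to raise the tri/vert count.
        for (int i = 0; i < spheresPerFrame; i++)
        {
            var sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            sphere.transform.position = Random.insideUnitSphere * 20f;
            sphere.transform.parent = transform; // keep the hierarchy tidy
            spawned++;
        }
    }
}
```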
Are there any suggestions for how this could be done more effectively?