Set game quality level depending on the device's capabilities (via code)

I came across this post on Stack Overflow while trying to make a script that detects the device's capabilities and sets the quality settings (and other settings) accordingly.

I'm using this script to determine the quality level:

public static void AutoChooseQualityLevel()
{
    // Rough hardware survey: shader model, video memory (MB) and CPU core count.
    var shaderLevel = SystemInfo.graphicsShaderLevel;   // e.g. 30 = Shader Model 3.0
    var vRam = SystemInfo.graphicsMemorySize;
    var cpuCount = SystemInfo.processorCount;

    // Guess a rough fill-rate score from the shader model...
    float fillrate;
    if (shaderLevel < 10)
        fillrate = 1000;
    else if (shaderLevel < 20)
        fillrate = 1300;
    else if (shaderLevel < 30)
        fillrate = 2000;
    else
        fillrate = 3000;

    // ...then scale it up or down based on CPU cores and video memory.
    if (cpuCount >= 6)
        fillrate *= 3;
    else if (cpuCount >= 3)
        fillrate *= 2;
    if (vRam >= 512)
        fillrate *= 2;
    else if (vRam <= 128)
        fillrate /= 2;

    // Estimate roughly how many megapixels per second are needed to hit the
    // target frame rate at the current resolution (plus a small constant overhead).
    var resX = Screen.currentResolution.width;
    var resY = Screen.currentResolution.height;
    const float targetFps = 60.0f;
    var fillNeed = (resX * resY + 400f * 300f) * (targetFps / 1000000.0f);

    // Relative cost assumed for each quality level, from lowest to highest.
    var levelMultiplier = new float[] { 5.0f, 30.0f, 80.0f, 130.0f, 200.0f, 320.0f };

    // Pick the highest quality level whose estimated cost still fits the budget.
    var maxQuality = QualitySettings.names.Length - 1;
    var level = 0;
    while (level < maxQuality && fillrate > fillNeed * levelMultiplier[level + 1])
        ++level;

    var quality = QualitySettings.names[level];
    qualityLevel = quality; // qualityLevel is a string field defined elsewhere in my class
}

But it's not working properly: it gave me "Medium" on a Samsung S9 and "Ultra" on a very old Samsung device (I don't know its exact model). It also gave me "Low" on my MacBook Pro.

Plus, I don't quite understand the concept the code uses.

Can anyone help make this function better? Or is there another way to determine the quality level?

Interesting approach, but it conflates CPU count with shader level with RAM… that's weird. I have no idea where the original author even got those numbers, or what reasoning would lead them to such a heuristic.

Also, do you even need to do this in your game? If you’re not having performance issues and you’re doing this, you are completely wasting your time and inviting more bugs for yourself in the future.

If you actually ARE having a problem, a far better approach is to attach the Profiler (Window → Analysis → Profiler) and see what is slowing you down on lower-end devices, then address that directly. Far less work and far more likely to actually produce good outcomes. The above script is sort of like saying "Well, I hope this makes it better," which isn't really engineering; it's more like witchcraft. :slight_smile:
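
If it helps, you can also wrap the code you suspect in your own named markers so it shows up clearly when you profile on the device. Just a sketch (the component and marker names are made up):

using Unity.Profiling;
using UnityEngine;

// Sketch only: wrap a suspected hot path in a ProfilerMarker so it appears
// as a named entry in the Profiler window when profiling on-device.
public class ExpensiveVfxDriver : MonoBehaviour   // made-up component name
{
    static readonly ProfilerMarker s_VfxMarker = new ProfilerMarker("MyGame.ExpensiveVfx");

    void Update()
    {
        using (s_VfxMarker.Auto())
        {
            // ...the code you suspect is slow goes here...
        }
    }
}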

@Kurt-Dekker I totally agree with you on that. However, yes, I am having performance issues on lower-end devices, because I'm using some fairly expensive Shader Graphs in my game and I want to disable them or substitute cheaper ones when the device really is low-end. I don't want to sacrifice the eye candy just to make the game run exactly the same on older devices: I want it to run more beautifully on modern devices, and at a steady FPS with less eye-candy VFX on lower-end devices (without asking the user to choose the quality themselves).

Note: I already know what's causing the FPS drop in the game, and I've already made a cheaper shader that I want to use if the device is a lower-end one.
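
Roughly what I have in mind is something like this (just a sketch; the fields and the threshold are placeholders):

using UnityEngine;

// Sketch: pick the cheap or the fancy material once at startup, based on the
// quality level chosen for this device. All names and values are placeholders.
public class ShaderQualitySwitcher : MonoBehaviour
{
    [SerializeField] Renderer targetRenderer;   // renderer using the expensive Shader Graph
    [SerializeField] Material fancyMaterial;    // expensive Shader Graph material
    [SerializeField] Material cheapMaterial;    // cheaper replacement shader
    [SerializeField] int minLevelForFancy = 3;  // lowest quality level that still gets the fancy version

    void Start()
    {
        var level = QualitySettings.GetQualityLevel();
        targetRenderer.sharedMaterial = level >= minLevelForFancy ? fancyMaterial : cheapMaterial;
    }
}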

I was considering a different approach for an auto-quality level. Pick a target FPS; when the measured FPS stays below 80% of the target for a significant period of time, drop the quality level, and when it stays above 120% of the target, raise it.
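
Something along these lines, just as a sketch (the target, thresholds and sampling window are arbitrary):

using UnityEngine;

// Sketch of the idea above: sample the average FPS over a window and step the
// quality level down when we're well under target, up when we're well over it.
public class AdaptiveQuality : MonoBehaviour
{
    [SerializeField] float targetFps = 60f;
    [SerializeField] float sampleWindow = 5f;   // seconds per measurement

    float elapsed;
    int frames;

    void Update()
    {
        elapsed += Time.unscaledDeltaTime;
        frames++;
        if (elapsed < sampleWindow)
            return;

        var avgFps = frames / elapsed;
        var level = QualitySettings.GetQualityLevel();

        if (avgFps < targetFps * 0.8f && level > 0)
            QualitySettings.SetQualityLevel(level - 1, true);   // true = apply expensive changes too
        else if (avgFps > targetFps * 1.2f && level < QualitySettings.names.Length - 1)
            QualitySettings.SetQualityLevel(level + 1, true);

        elapsed = 0f;
        frames = 0;
    }
}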

Interesting…