How to change Graphic Tier?

I want to change the Graphics Tier via script but don’t know how. Please help.

No idea what that means. Are you speaking of the quality settings manager?

Start with the documentation, then if you have a problem…

How to report your problem productively in the Unity3D forums:

http://plbm.com/?p=220

This is the bare minimum of information to report:

  • what you want
  • what you tried
  • what you expected to happen
  • what actually happened, log output, variable values, and especially any errors you see
  • links to documentation you used to cross-check your work (CRITICAL!!!)

The purpose of YOU providing links is to make our job easier, while simultaneously showing us that you actually put effort into the process. If you haven’t put effort into finding the documentation, why should we bother putting effort into replying?

If you post a code snippet, ALWAYS USE CODE TAGS:

How to use code tags: https://discussions.unity.com/t/481379

  • Do not TALK about code without posting it.
  • Do NOT post unformatted code.
  • Do NOT retype code. Use copy/paste properly using code tags.
  • Do NOT post screenshots of code.
  • Do NOT post photographs of code.
  • ONLY post the relevant code, and then refer to it in your discussion.

You don’t change the graphics tier. The graphics tier is automatically determined by Unity according to the device the application is running on, so you have to test on various devices to know how you want to handle each tier. More information on how tiers are determined can be found here: Unity - Manual: Graphics tiers

One of the things I added to my standard logger output is to log the graphics tier when the app starts. So, here is an example of how all of my log files start:

[2023-07-18 14:29:24.801] Switchboard - Version 1.0 (Debug)
Platform: WindowsPlayer
OS: Windows 10 (10.0.19045) 64bit
Device: PX60 6QD (Micro-Star International Co., Ltd.)
CPU: Intel(R) Core™ i7-6700HQ CPU @ 2.60GHz
CPU Speed: 2.59GHz
CPU Cores: 8
RAM: 16GB
GPU: NVIDIA GeForce GTX 950M
VRAM: 2GB
Graphics API: Direct3D 11.0 [level 11.0]
Shader Model: 5.0
Graphics Tier: Tier3
Screen Resolution: 1920 x 1080 @ 60Hz
Display Mode: Windowed
Render Mode: MultiThreaded
Quality Setting: Default
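
Most of the values in that log come straight from Unity’s `SystemInfo`, `Graphics`, and `QualitySettings` APIs. Here is a minimal sketch of how such a startup logger might look (the class name is mine, not from the original logger, and the formatting is simplified):

```csharp
using UnityEngine;

// Hypothetical startup logger: logs hardware info and the graphics tier
// that Unity selected for this device.
public class StartupInfoLogger : MonoBehaviour
{
    void Awake()
    {
        Debug.Log($"Platform: {Application.platform}");
        Debug.Log($"OS: {SystemInfo.operatingSystem}");
        Debug.Log($"Device: {SystemInfo.deviceModel}");
        Debug.Log($"CPU: {SystemInfo.processorType}");
        Debug.Log($"CPU Cores: {SystemInfo.processorCount}");
        Debug.Log($"RAM: {SystemInfo.systemMemorySize}MB");
        Debug.Log($"GPU: {SystemInfo.graphicsDeviceName}");
        Debug.Log($"VRAM: {SystemInfo.graphicsMemorySize}MB");
        Debug.Log($"Graphics API: {SystemInfo.graphicsDeviceVersion}");
        Debug.Log($"Graphics Tier: {Graphics.activeTier}");
        Debug.Log($"Quality Setting: {QualitySettings.names[QualitySettings.GetQualityLevel()]}");
    }
}
```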

Unity provides this setting so that you don’t have to write custom code that analyzes all of the other listed parameters yourself. It gives you the equivalent of a low/medium/high hardware level, making it easier to tailor graphics to the hardware without having to think about it too much or write all of the hardware detection and graphics tailoring code on your own. You don’t have to use the different tier settings if you don’t want to: you could change the quality settings programmatically based on your own custom calculations, or leave it up to the user to change their own quality settings, if that’s what you prefer.
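
For example, one way to drive the quality settings from the detected tier might look like this. The mapping of tiers to quality levels is an assumption for illustration; the actual indices depend on the Quality Settings list defined in your project:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Illustrative sketch: pick a quality level based on the graphics tier
// Unity detected for this device. The tier-to-level mapping below is a
// placeholder, not a recommendation.
public class TierBasedQuality : MonoBehaviour
{
    void Start()
    {
        int level;
        switch (Graphics.activeTier)
        {
            case GraphicsTier.Tier1: level = 0; break;                              // lowest quality
            case GraphicsTier.Tier2: level = 2; break;                              // a middle quality
            default:                 level = QualitySettings.names.Length - 1; break; // highest quality
        }
        QualitySettings.SetQualityLevel(level, applyExpensiveChanges: true);
        Debug.Log($"Tier {Graphics.activeTier} -> quality '{QualitySettings.names[level]}'");
    }
}
```

Passing `true` for `applyExpensiveChanges` makes Unity apply settings like anti-aliasing immediately, which is fine at startup but can cause a hitch if done mid-game.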