99% GPU load with background texture using OnGUI function

On behalf of my concerned video card :(,

When I build and run my game, my GPU monitor always registers a 95%-99% load (with an accompanying GPU temperature rise) whenever I am sitting on my Start Menu or Game Over screens - both feature a static background texture drawn from an OnGUI() function.

Doing a bit of digging, I’ve tried setting my backdrop texture to ‘None’ - this results in a significant GPU use drop (hovering around the 67% usage mark). Also, if I test my game using Unity’s Play mode, my GPU remains unconcerned. The GPU overload only occurs when running an actual build of my game.

A couple more details about this behaviour:

  • using an ATI Radeon 4800 series card with 1 GB of video RAM
  • GPU usage shoots up to 95%+ no matter what resolution I use, windowed or full-screen
  • backdrop texture’s resolution is 1024x577; Texture Type: GUI; Max Size: 1024; Format: Compressed
  • example code used for creating Start Menu GUI (using C#):
using UnityEngine;
using System.Collections;

// Make the script also execute in edit mode.
[ExecuteInEditMode]
public class GUI_StartMenu : MonoBehaviour
{
	public GUISkin gSkin;
	public Texture2D backdrop; // our backdrop image goes here
	private bool isLoading = false; // if true, we'll display the "Loading..." message

	void Awake()
	{
		Screen.lockCursor = false; // ensure we have a mouse cursor to click buttons with
	}

	void OnGUI()
	{
		if (gSkin)
			GUI.skin = gSkin;
		else
			Debug.Log("StartMenuGUI: GUI Skin object missing!");

		// Draw the backdrop stretched into a 2:1 rect matching the screen height.
		GUIStyle backgroundStyle = new GUIStyle();
		backgroundStyle.normal.background = backdrop;
		GUI.Label(new Rect((Screen.width - (Screen.height * 2)) * 0.75f, 0, Screen.height * 2, Screen.height), "", backgroundStyle);
		GUI.Label(new Rect((Screen.width / 2) - 197, 0, 400, 100), "Project MECHA", "mainMenuTitle");

		if (GUI.Button(new Rect((Screen.width / 2) - 70, Screen.height - 160, 140, 70), "Play"))
		{
			isLoading = true;
			Application.LoadLevel("Level01"); // load the game level
		}

		// Web player builds can't quit, so only show the Quit button elsewhere.
		bool isWebPlayer = (Application.platform == RuntimePlatform.OSXWebPlayer || Application.platform == RuntimePlatform.WindowsWebPlayer);
		if (!isWebPlayer)
		{
			if (GUI.Button(new Rect((Screen.width / 2) - 70, Screen.height - 80, 140, 70), "Quit"))
				Application.Quit();
		}

		if (isLoading)
			GUI.Label(new Rect((Screen.width / 2) - 197, (Screen.height / 2) - 60, 400, 70), "Loading...", "mainMenuTitle");
	}
}

Would anyone happen to know what might be causing such a crazy GPU load when using a background texture and OnGUI? I’m worried about my users’ video cards frying themselves while sitting on the game’s menus.

That’s expected; games naturally try to run as fast as possible unless told otherwise. Use Application.targetFrameRate for the “unless told otherwise” bit.
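For example, something along these lines from any script that runs at startup (just a sketch; the class name is arbitrary):

using UnityEngine;

// Sketch: cap the framerate so a static screen doesn't render flat-out.
// The default value of -1 means "run as fast as possible".
public class FrameRateCap : MonoBehaviour
{
	void Awake()
	{
		// Note: this cap is ignored while vsync is enabled in the quality settings.
		Application.targetFrameRate = 60;
	}
}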

–Eric

Hmm, strange that it would be running at full load only when a background GUI texture is used. In any case, to rectify this issue on my side, what would be a recommended value for Application.targetFrameRate? Would 60 (fps) be a good figure to reduce GPU load, or is there a higher value that most people tend to cap it at?

Personally I use 120, because that’s what my monitor runs at. 60fps only seems smooth if you’re not used to 120fps… You can also use vsync to lock to the screen’s refresh rate, though in certain cases that can be problematic if your game is visually demanding, depending on the hardware you’re targeting. I wish there was a way to turn that on and off programmatically, instead of having to rely on quality settings.

–Eric

@ Eric5h5
Thanks for the awesome advice! Using Application.targetFrameRate = 120 reduced my GPU load from 95%+ to 8%, as you would expect for a static screen. :smile:

I guess in my case, capping the framerate at 60 wouldn’t be noticeable either, as my scene is just displaying a static image. I can imagine a 60fps cap might be noticeable once we get into actual gameplay.
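In case it helps anyone else, here’s roughly how I’ve wired it up (just a sketch with illustrative names; pick whatever cap suits your menus):

using UnityEngine;

// Sketch: cap the framerate while the static menu scene is showing,
// then lift the cap again when the menu is torn down for gameplay.
public class MenuFrameCap : MonoBehaviour
{
	void Awake()
	{
		Application.targetFrameRate = 120; // static menu: no need to render flat-out
	}

	void OnDestroy()
	{
		Application.targetFrameRate = -1; // back to "as fast as possible" for gameplay
	}
}

OnDestroy fires when the menu scene’s objects are unloaded by Application.LoadLevel, so gameplay starts uncapped.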

Hmm, just out of curiosity, why would you want to handle vsync behaviour outside of player preferences? Most PC games tend not to force vsync on players (the sheer variety of graphics cards and drivers doesn’t always play nice with vsync). Forcing vsync tends to be a console dev thing.

I wouldn’t; that’s the point. There’s no way to enable/disable it outside of the quality settings, and it would be nice to be able to toggle it separately.

–Eric

Ah, I see. That’s pretty annoying from an end user’s perspective. :frowning: