Ways to reduce GPU Load/Temperature (Overheating on ATI)

Hi,

We’re trying to work out why the GPU usage of our application is so high. We don’t consider the application to be very intensive, but GPU-Z consistently shows us sitting at 100% GPU load. The application simply displays UI elements that float across the screen; it loads quite a few textures but only uses up to around 300 MB of VRAM at any one time.

We seem to have a problem where, if we export our application via one version of Unity (5.3.3 on Windows) and run it on our ATI machine, the GPU will overheat, and we can see this happen consistently within about 15 minutes (during which the application seems to force the system to 100% GPU load).

But, if we export our application via our Unity editor (5.3.3) on OSX, the GPU temperature always plateaus at around 98/99 degrees, even after running for at least 40 minutes.

So really this is two questions:

  1. Are there sure-fire ways to make a Unity application less heavy on the GPU? It seems like nothing we do really helps. Disabling V-sync and setting a target framerate of 30fps seems to yield the same results as setting a target framerate of 60fps (a rough sketch of the settings we’re trying follows this list).

  2. Does this issue with overheating + ATI GPUs + Unity 5.3.3 sound familiar at all?
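For reference, the settings we’ve been toggling look roughly like this (a minimal sketch; the component name is just for illustration). One thing worth noting, as far as I can tell from the Unity docs: Application.targetFrameRate is only honoured while vSyncCount is 0, which may be relevant to why the 30fps and 60fps targets behaved the same for us.

```csharp
using UnityEngine;

// Rough sketch of what we attach to a bootstrap object (name is illustrative only).
public class FrameRateSettings : MonoBehaviour
{
    void Awake()
    {
        // V-sync off, manual frame cap.
        // Application.targetFrameRate is ignored while vSyncCount > 0.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 30; // we also tried 60

        // Alternative we could try: let v-sync cap rendering at the monitor refresh rate.
        // QualitySettings.vSyncCount = 1;
    }
}
```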

Thanks

Typically any demanding game will overheat that card; it means the card needs replacing, or the drivers need resetting so that fan control / throttling works correctly. It’s not unique to Unity.

A computer program shouldn’t be able to overheat a GPU to the point that it crashes; anything a program does still falls under “normal use”, so the hardware / drivers are defective. Anything like FurMark would do the same (although Nvidia drivers detect FurMark now).

I would imagine a whole lot of games will crash a lot of GPUs with vsync off on simple scenes, because the GPU is allowed to run as fast as it possibly can = more heat… which is why vsync on is generally the default.

This might seem odd, but the GPU can often be more stable with more going on, because it stalls more while it waits for the CPU to feed it… that doesn’t change the fact that the GPU is failing / the fan is failing / the drivers are the problem (it should throttle before it fails).

If I understood you correctly - it’s not actually crashing yet, so it’s probably working as intended: vsync off plus minimal CPU time spent = worst-case heat scenario. Just make sure the application doesn’t run past the monitor refresh rate.
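In Unity terms that usually just means leaving v-sync on, or, if v-sync has to stay off, capping the target frame rate at the monitor’s refresh rate. An untested sketch just to illustrate the idea (the class name is made up):

```csharp
using UnityEngine;

// Illustrative only: stop the GPU rendering faster than the display can show.
public class RefreshRateCap : MonoBehaviour
{
    void Awake()
    {
        // Simplest option: v-sync locks rendering to the monitor refresh rate.
        QualitySettings.vSyncCount = 1;

        // If v-sync must stay off, cap at the reported refresh rate instead.
        // QualitySettings.vSyncCount = 0;
        // Application.targetFrameRate = Screen.currentResolution.refreshRate;
    }
}
```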

Hi, thanks for your quick reply!

What’s strange for us, though, is that if we export our application using the Windows Unity Editor (it may be a slightly different point release, I’ll have to check) and deploy that build on this machine, it overheats (it doesn’t overheat other machines with Nvidia cards, etc.). It overheats and shuts down every time, within a similar time range. But if we run either our own Unity application or the Robot Lab sample application, exported from the Unity editor on OSX, this doesn’t happen.

Thanks

Apologies for the re-post; it’s just in case anyone has any ideas after hearing the extra info:

We have a dual GPU setup (two identical GPUs, I’m 99% sure of this), and watching GPU-Z, one GPU always runs up to full usage (100% load, 100% fan, and the temperature rises consistently) while the other GPU never hits full capacity. I’d have hoped they would balance out to lighten the load, though I haven’t fully thought through why they would do this, or whether it’s even possible in the context of a single application.

Would it be possible in any way for Unity, at a lower level, to prevent the load from being shared properly across the two GPUs?

Thanks