Why does it take 5-10% GPU usage to render one image in play mode (both in the editor and in a build)? I have vsync on to cap my FPS at 165.
I have an RTX 2070, so rendering that image should take almost 0% GPU usage. It takes 3-5% even when there's nothing in the scene. Shouldn't it be about 0% if the scene is empty?
And the scene may not be as “empty” as it looks. Unless you specifically disabled it, there's a skybox, for instance.
And many rendering effects are fixed-cost (or have a minimum cost) regardless of what's in the scene. Blurring an empty picture takes just as long as blurring a complex one, for instance. There are also colour/depth buffers being checked and cleared every frame whether or not you did anything with them.
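To make that concrete, here's a minimal CPU-side sketch (illustrative only; a real GPU blur is a fullscreen pass, but it has the same property): the loop visits every pixel no matter what the image contains, so an all-black “empty” image costs exactly as much as a detailed one.

```cpp
#include <vector>
#include <cstddef>

// Horizontal 3-tap box blur. The work done is determined entirely by the
// image dimensions, not by its contents: same reads and writes per pixel
// whether the values are all zero or full of detail.
std::vector<float> boxBlur3x1(const std::vector<float>& src, int width, int height)
{
    std::vector<float> dst(src.size());
    for (int y = 0; y < height; ++y) {
        for (int x = 0; x < width; ++x) {
            // Clamp neighbours at the image edges.
            int left  = (x > 0)         ? x - 1 : x;
            int right = (x < width - 1) ? x + 1 : x;
            std::size_t row = static_cast<std::size_t>(y) * width;
            // Cost is O(width * height), independent of image content.
            dst[row + x] = (src[row + left] + src[row + x] + src[row + right]) / 3.0f;
        }
    }
    return dst;
}
```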
And there are some overheads just for running a rendering context. Your video driver is doing a bunch of stuff whether or not you send it any (visible) content to draw.
When I was playing around with Direct3D 12 in my own application, with only the swapchain present and no geometry or shaders loaded, I got my GPU up to 80-90%, at 1.3 million FPS.
But even with vsync on at 60 FPS, the GPU was at about 2%.
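For anyone curious, the difference comes down to the present call. Here's a sketch of just the loop, assuming a swapchain has already been created with DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING on a tearing-capable system (the `presentLoop` name and the surrounding setup are mine, not from any particular sample; window message handling and error checks omitted):

```cpp
#include <dxgi1_5.h>

void presentLoop(IDXGISwapChain1* swapChain, bool vsync)
{
    for (;;)
    {
        if (vsync)
        {
            // SyncInterval = 1: wait for the next vertical blank. The GPU
            // idles between frames, so reported usage stays low (the ~2% case).
            swapChain->Present(1, 0);
        }
        else
        {
            // SyncInterval = 0 + ALLOW_TEARING: present as fast as possible.
            // Even with nothing drawn, flipping/clearing buffers in a tight
            // loop saturates the GPU (the 80-90% at ~1.3 million FPS case).
            swapChain->Present(0, DXGI_PRESENT_ALLOW_TEARING);
        }
    }
}
```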
Does this “percentage GPU use” metric take the GPU's different power states into account? I don't think any of these tools do.
I run an app to force my GPU into its lowest power state; with my triple monitors it normally wouldn't go that low on its own, it sticks to a higher state. In that lowest power state it can run my game (at a usable framerate, btw) and Task Manager will record that as “100% utilization”, despite the GPU drawing about 20 W instead of the ~180 W it uses in its max power state.
If your GPU usage is reported as anything less than, say, 60-80%, I wouldn't trust it as a percentage of total capability. Treat it more as a general indicator that the GPU isn't the bottleneck. It also depends a little on when the driver decides to raise the power budget; say it needs to sit above 60% utilization to bump up a power state, and below 20% to drop down one.
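If you want to see this yourself on an NVIDIA card, you can read utilization, power draw, and performance state side by side through NVML (the library behind nvidia-smi). A minimal sketch, linking against nvml:

```cpp
#include <nvml.h>
#include <cstdio>

int main()
{
    if (nvmlInit_v2() != NVML_SUCCESS) return 1;

    nvmlDevice_t device;
    nvmlDeviceGetHandleByIndex_v2(0, &device);

    nvmlUtilization_t util;            // % of time the GPU was busy
    nvmlDeviceGetUtilizationRates(device, &util);

    unsigned int powerMilliwatts = 0;  // actual power draw right now
    nvmlDeviceGetPowerUsage(device, &powerMilliwatts);

    nvmlPstates_t pstate;              // P0 = max performance ... P15 = minimum
    nvmlDeviceGetPerformanceState(device, &pstate);

    // "100%" at P8 and ~20 W is a very different workload from
    // "100%" at P0 and ~180 W; the utilization number alone hides this.
    std::printf("util: %u%%  power: %.1f W  pstate: P%d\n",
                util.gpu, powerMilliwatts / 1000.0, static_cast<int>(pstate));

    nvmlShutdown();
    return 0;
}
```

Watching those three numbers together makes the point obvious: utilization is “percent of time busy at the current clocks”, not “percent of what the card could do flat out”.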
As mentioned, no, but more importantly, this line of reasoning doesn’t matter. Do not take these metrics into account until you’re actually seeing performance issues.
Indeed. For the OP, typically, none of the stuff involved in this is optimised for the case where it’s doing nothing. It’s optimised for mid- to high-load cases, which often essentially means doing “prep work” which has a fixed overhead. That overhead is noticeable when nothing else is going on, but becomes a net gain very quickly as you add load to the system.
“My car is using power even though it’s not moving yet!” Yeah, it is, and without it you can’t start moving.
At the extreme end, I've had people tell me that the performance results in my game are impossible, while they themselves were conducting intensive, “scientific” research comparing empty-scene performance results from one editor version to another.
I don’t know anything; I just push the buttons and follow basic guidelines you can read anywhere.
Measuring arbitrary stuff like that looks like a waste of time to me, assuming the goal was to finish a game and not to be some sort of pseudo-rendering engineer.
Not knocking the original question; I think it was genuine curiosity, which is always fantastic. I'm just making the point because I think digging into tiny details that don't matter can sometimes become procrastination, a way to avoid making the actual difficult decisions.