Windows store universal app not using GPU

Hi,
I deployed my game on my laptop and, unlike the desktop version, the NVIDIA tool does not show the game as using the dedicated GPU (hence low fps).

Is there something I should set in player settings or build settings or anything else?

Thanks to anyone who can help; the game is about to be released and we're facing this problem now.

Hey, are you on the .NET or IL2CPP scripting backend?

IL2CPP, should I switch to .NET?

No, using IL2CPP actually makes it much easier. There’s not a good way to achieve it on .NET.

Put this piece of code in one of the source files (I’d do it in Main.cpp, just above wWinMain) in the generated VS project:

extern "C"
{
    // Hint for the NVIDIA Optimus driver to prefer the dedicated GPU for this executable.
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    // Equivalent hint for AMD switchable graphics (PowerXpress) drivers.
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

That should make drivers default to dedicated GPU for your executable.


It works!! Can’t wait to test it on Xbox One!!!

Next time you’re in Italy, a free beer is on me :slight_smile:


I’m integrating Xbox Live services in my game, and the only way (that wouldn’t drive me crazy) seems to be switching to the .NET scripting backend.
I’m building now and about to check whether it will need the same trick to use the GPU. If so, would it be very hard to achieve?

Which Unity version are you on? Xbox Live should be as easy to use on IL2CPP as it is on .NET in Unity 5.6.

But yeah, on .NET you cannot export these symbols as the exe file isn’t built from C++ code. It will be more work to make it use the dedicated GPU.

Hey, is there a way to do the same trick for an Intel HD?

So the trick I gave you actually only makes sense on systems with both an Nvidia/AMD GPU and an integrated Intel GPU. By default, all applications use the “default GPU”, which is usually the low-power Intel one. When you said in your original post that the game was “not using GPU”, you probably meant that it wasn’t using the high-power Nvidia/AMD GPU and was instead running on the integrated Intel GPU. So if you want your game to run on the Intel GPU instead, don’t specify those symbols. Not running on any GPU at all isn’t actually a thing.
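If you want to verify which adapters Windows exposes to your build (and hence which one Direct3D will pick by default), a small DXGI sketch like the one below can list them. This is an illustrative, Windows-only example (link against dxgi.lib); adapter 0 is the one Direct3D uses unless the driver overrides the choice, e.g. because of the exported symbols above:

```cpp
// Sketch: enumerate GPUs with DXGI to see which adapters the system exposes.
#include <dxgi.h>
#include <cstdio>

int main()
{
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Adapter 0 is the system default; on an Optimus laptop this is
        // typically the integrated Intel GPU unless the driver overrides it.
        wprintf(L"Adapter %u: %s (dedicated VRAM: %llu MB)\n",
                i, desc.Description,
                (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}
```

Running this once with and once without the exported symbols makes it easy to see whether the driver hint actually changed which GPU ends up first in the list.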

:slight_smile: No, what I wish for now is better performance on my Surface Pro 3, which has only the Intel graphics adapter. Even though my original question was a mere deduction and I wasn’t really aware of what I was asking, you still gave me the correct answer. I was wondering if there’s a line of code to tell the Intel HD “Hey, go faster!”.

If only it was that easy :).

I’m also interested on this… (xbox solution).

— Edit:

I found this guide, but actually I haven’t tested it:

Why isn’t this a checkbox in the PlayerSettings?


Is there a way to make this work for the .NET backend?

It really seems like this should be something Unity does by default.


It’s stupid, but there is no way to do it on the .NET scripting backend, since this requires using C++ in the executable. The only other way to do it is to have driver vendors recognize your game and enable the dedicated GPU for it automatically. However, it’s not easy to contact them (usually you’d have to go through Microsoft, who would then contact the IHVs).

The reason Unity doesn’t do this by default is because most games made with Unity don’t need to use dedicated GPU, and using it when you don’t need it wastes battery life. I fully agree this should be a player setting.

Is this on some (internal) task-list or roadmap, to make sure a developer picks/gets this task eventually?

Yeah I just added this to our task list yesterday. I think we’ll be making this setting available for both standalone player and UWP.


Not much luck there eh? :smile:
A simple solution that could have been a single toggle in the settings, and yet, so many UI/UX redesigns later, it’s 2021 and nobody has bothered.

:smile: