I’m integrating Xbox Live services in my game, and the only way (that wouldn’t drive me crazy) seems to be switching to the .NET scripting backend.
I’m building now and about to check whether it will need a similar trick to use the GPU. If it does, would it be hard to achieve?
So the trick I gave you actually only makes sense on systems with both an Nvidia/AMD GPU and an integrated Intel GPU. By default, all applications use the “default GPU”, which is usually the low-power Intel one. When you said in your original post that the “game is not using GPU”, you probably meant that it wasn’t using the high-power Nvidia/AMD GPU, and was instead running on the integrated Intel GPU. So if you want your game to run on the Intel GPU instead, just don’t specify those symbols. Not running on any GPU at all isn’t actually a thing.
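For reference, the trick being discussed is almost certainly the well-known exported-symbol approach documented by Nvidia (Optimus) and AMD (PowerXpress). A minimal sketch of what it looks like in native C++ code, assuming a Windows build where you control the executable (how exactly you'd wire this into a Unity build is not covered in this thread):

```cpp
// Minimal sketch of the hybrid-GPU opt-in trick (Windows only).
// Nvidia Optimus and AMD PowerXpress drivers scan the game's .exe for these
// exported symbols; a non-zero value requests the high-performance discrete
// GPU instead of the default low-power integrated one.
// Note: the symbols must be exported from the executable itself, not a DLL.
extern "C" {
    __declspec(dllexport) unsigned long NvOptimusEnablement = 0x00000001;
    __declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
```

On a machine with only an integrated Intel GPU (like the Surface Pro 3 discussed below), these exports have no effect, since there is no second GPU to switch to.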
No, what I wish for now is better performance on my Surface Pro 3, which only has the Intel graphics adapter. Even though my original question was a mere deduction and I wasn’t really aware of what I was asking, you still gave me the correct answer. I was wondering whether there’s a line of code to tell the Intel HD GPU “Hey, go faster!”.
It’s stupid, but there is no way to do it with the .NET scripting backend, since it requires exporting symbols from C++ code in the executable. The only other way is to have the driver vendors recognize your game and enable the GPU for it automatically. However, contacting them isn’t easy (usually you’d have to go through Microsoft, who would then contact the IHVs).
The reason Unity doesn’t do this by default is that most games made with Unity don’t need the dedicated GPU, and using it when you don’t need it wastes battery life. I fully agree this should be a player setting.