How do I make a Windows build default to using nVidia instead of integrated graphics on a Windows laptop?

I am a developer. I’m building a game in Unity. My laptop runs Windows 8.1 and has an nVidia GT 750M and an integrated Intel HD 4600. When I build for Windows the game runs with the integrated graphics card, not the dedicated GPU, so quality and performance suffer.

I don’t care that this happens to me. I am a developer and I can find the end-user work-arounds to change the default graphics processor for this specific instance on my machine.

I do care that players of my game on laptops like mine will have a sub-optimal experience because they don’t know how, or don’t care, to change the graphics processor setting.

How do I make a Unity Windows executable default to using the faster GPU, or explicitly give that option to the end user?


The workaround, before someone suggests it: right-click on the executable and choose “Run with graphics processor > High-performance NVIDIA processor”.

Please don’t tell my end-users they have to do this or similar. There are other end-user solutions and workarounds listed within the links below, but I’m asking for a developer solution, if anyone can help.


Sorry, this question has been asked before. However, I have yet to find an answer for developers.

Here’s a selection of previous, similar questions (none with solutions for developers):

For completeness, here’s a list of others having had the same problem with other platforms:

I’ve found the documentation for how to direct the Optimus driver at runtime to use the High Performance Graphics:

Starting with the Release 302 drivers, application developers can direct the Optimus driver at runtime to use the High Performance Graphics to render any application – even those applications for which there is no existing application profile.

http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf

I haven’t attempted to write code to interface with this API yet. It really ought to be done by Unity itself, though, or at least be offered as a Windows/Desktop build option.
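For what it’s worth, the simplest technique in that PDF is exporting a global variable called NvOptimusEnablement from the application executable; the driver checks for it at launch and, when it is set to 0x00000001, prefers the high-performance GPU. Below is a sketch of what that looks like in native C++. Since the Unity player executable is generated by Unity, this isn’t something you can add from a C# script; framing it as a separate launcher/wrapper .exe is my own assumption, not something the document prescribes.

    // Sketch of the "Global Variable NvOptimusEnablement" method from
    // OptimusRenderingPolicies.pdf, as a bare-bones Win32 launcher.
    #include <windows.h>

    extern "C" {
        // Exported global the Optimus driver looks for at process launch;
        // 0x00000001 requests the high-performance (NVIDIA) GPU.
        __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
    }

    int WINAPI WinMain(HINSTANCE, HINSTANCE, LPSTR, int)
    {
        // A real launcher would spawn the game (or create its own rendering
        // context) here; the export above is all the driver needs to see.
        MessageBoxA(nullptr, "NvOptimusEnablement is exported from this exe.",
                    "Optimus demo", MB_OK);
        return 0;
    }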

If you look at the Optimus Rendering Policies document (also cited by @pengo) you will find that there are a few ways to do it. I have found that the simplest solution is number 4: Static Library Bindings.

All you need to do here is put one of the following libraries in your Plugins folder: vcamp110.dll, vcamp110d.dll, nvapi.dll, nvapi64.dll, opencl.dll, nvcuda.dll, or cudart*.*

Note: I don’t know if there is some licensing issue with distributing some of these libraries as part of your game, check it for yourself.

OpenCL.dll can be retrieved from C:\Windows\System32\

That is all you need.
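In case just having the file sitting in the Plugins folder turns out not to be enough on some setups, a more explicit variant of the same static-library-binding idea is a tiny native plugin that links nvapi and exposes a single call for your startup code, so an NVIDIA library is definitely pulled into the process. This is my own sketch, not part of the document or the trick above; the plugin name ForceNvidia and the function ForceDiscreteGpu are made up, and it assumes you have the NVAPI SDK (nvapi.h plus nvapi.lib/nvapi64.lib) on hand.

    // Hypothetical native Unity plugin (built as ForceNvidia.dll) that
    // statically links nvapi.lib / nvapi64.lib from the NVAPI SDK.
    #include "nvapi.h"

    extern "C" __declspec(dllexport) int ForceDiscreteGpu()
    {
        // NvAPI_Initialize brings up the NVIDIA driver interface and returns
        // NVAPI_OK on success; 0/1 keeps the managed-side signature trivial.
        return (NvAPI_Initialize() == NVAPI_OK) ? 0 : 1;
    }

On the managed side you would declare the function with DllImport and call it once from an Awake() in your first scene. Whether the driver then actually picks the discrete GPU still depends on its own heuristics, so treat this as an experiment alongside the DLL-drop trick rather than a guarantee.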

This isn’t a Unity build issue (unless you’ve intentionally lowered the quality settings); it’s either BIOS/EFI related to the laptop and/or driver related. Unity will use whatever video card the system hands to it.

Just taking a guess… it’s probably a Dell product(?), in which case you either have to find that magic key combo that switches that Optimus garbage to the GTX, update to the latest drivers, or figure out how to disable the Intel integrated card. Even though the HD 4600 isn’t the biggest kid on the block, it can handle most stuff well enough… unless you’re trying to run Crysis in ultra mode.

Have you tried testing the build on a desktop with a dedicated graphics card?