Using multiple video cards to run multiple instances of a Unity standalone player on Windows

I have an unusual use case. On Windows 10, we're able to run multiple instances of a Unity standalone application we're building simultaneously on the same PC (on the order of a dozen). The application renders content and records video (using AVPro Movie Capture), and that currently works. We want to generate these videos as fast as possible, so we would like to take advantage of multiple video cards on the same machine.

By default, would Windows/Unity just use one of the installed and enabled video cards, or would it automatically take advantage of the other card(s)? (I understand that SLI makes multiple video cards act as one more powerful video card…would that actually help when running 12 instances of the application?)

I’ve read the docs for the “-adapter N” command line argument, but using it does not seem to make a difference in my tests with two video cards in one machine. (My hope is that, in launching the 12 instances, I could start 6 of them on one “adapter” and the other 6 on the other “adapter”.) Does that make sense? And is “N” zero-based, so that “0” means the first adapter found, “1” the second, and so on?

Using Unity 5.0.1f1 currently. Any help would be greatly appreciated.

Hi,

The -adapter option actually refers to D3D9 adapters, which are the actual “outputs” of a graphics card (monitors, for example). To force a specific graphics card, use the “-gpu” flag: for example, launch one instance with “-gpu 0” and another instance with “-gpu 1”. By the way - this will only work if you’re using D3D11 or D3D12 as your graphics API in Player Settings.
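If it helps, here's a rough sketch of what that could look like when launching a batch of instances from a Windows batch file (untested; “MyApp.exe” is just a placeholder for your actual build name, and the flag has to be available in your Unity version):

```
@echo off
rem Sketch: split 12 standalone player instances across two graphics cards.
rem "MyApp.exe" is a placeholder for the real build; -gpu needs D3D11/D3D12.
set EXE=MyApp.exe

rem First six instances on the first GPU (indices appear to be zero-based)...
for /L %%i in (1,1,6) do start "" "%EXE%" -gpu 0

rem ...and the remaining six on the second GPU.
for /L %%i in (1,1,6) do start "" "%EXE%" -gpu 1
```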


Thanks! I was not aware of the “-gpu” command line argument. I’m using it now, but it’s not clear to me that I’m using it correctly. I’m using TechPowerUp GPU-Z to monitor GPU use (watching the “Memory Used” and “GPU Load” (%) values closely). In my tests, I generally see the first GPU being used even when I pass “-gpu 1”. I verified in File → Build Settings… → PC, Mac & Linux Standalone → Player Settings… that “Use Direct3D 11*” is checked. (I don’t see an option for D3D12…perhaps because I’m still on Unity 5.0.1f1?) Also, I’m generally running in windowed mode, not full-screen…does that make a difference?

I’m afraid the “-gpu” flag was only added in Unity 5.2 :(.
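For what it’s worth, once you’re on a version that has the flag, one way to double-check which card each instance actually got (instead of watching GPU-Z) is to give every instance its own log with -logFile and look at the graphics device line the D3D11 player normally writes at startup. A sketch, with the same placeholder exe name and assuming the log contains a “Renderer:” line (it usually does for D3D11 players):

```
rem Sketch: per-instance log, then check which device that instance initialized.
rem "MyApp.exe" is a placeholder; the "Renderer:" line is an assumption about the log format.
start "" "MyApp.exe" -gpu 1 -logFile "%TEMP%\instance07.log"

rem ...after the instance has started rendering:
findstr /C:"Renderer:" "%TEMP%\instance07.log"
```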


Ah, thanks

Thanks, a bit late for us though ;). In 4.x we managed to get around this using a DX11 wrapper DLL. Let’s see if we can get rid of that now :).

Hi. Do you know of a way to assign a GPU per Camera (or per targetDisplay) in a single Unity instance? That would be the ultimate solution for multi-display rendering (a CAVE system, for example) using only one computer. Thanks

There’s no way you can do that in Unity right now.

@Tautvydas-Zilys can we use the “-gpu 0” argument in Unity 2018.2.8f1?

It will allow you to select which GPU you want the game to run on, but you can’t use different GPUs per camera.