CPU core balance failing with batchmode

Hey guys,

I’ve noticed something really strange recently and set up a test project to demonstrate the problem:
Windows fails to balance processes across cores correctly when a Unity standalone build is launched with the “-batchmode” option.

The demo (on a machine with 4 CPU cores):

1) No batchmode

  • Unpack this zip file
  • Open the Windows Task Manager and select the “Performance” tab
  • Double-click the no_batchmode.bat file 4 times (this will launch 4 instances of the game)
  • Go back to the Task Manager; you should see something like this:

Which means that your game processes are correctly balanced across the cores.

2) Batchmode on

  • Now close the 4 running games and wait a few seconds until the cores are idle again
  • Double-click the batchmode.bat file 4 times (this will launch 4 instances of the game with the -batchmode parameter passed on the command line)
  • Go back to the Task Manager; you should see something like this:

Which means that all the game processes are running on the first core and don’t get balanced at all!
This results in poor performance (see the FPS values in the logs).
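For completeness, the two .bat files contain nothing more than a launch line each; the executable name below is just a placeholder for the actual build:

REM no_batchmode.bat
start test.exe

REM batchmode.bat
start test.exe -batchmode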

Please tell me I’m missing something and it’s not a bug in Unity or (worse) Windows.
Please.

I am not on a Windows machine right now, but can you try:
start test.exe -batchmode

If that does not work, you might have to set the specific core you want it to run on: http://www.howtogeek.com/howto/windows-vista/start-an-application-assigned-to-a-specific-cpu-in-windows-vista/

So, to run on core 0:
start /affinity 1 test.exe -batchmode

This article has a better description of the number that follows the /affinity switch: http://www.techrepublic.com/blog/window-on-windows/change-the-processor-affinity-setting-in-windows-7-to-gain-a-performance-edge/5322

From the article:
"The number that follows the start /affinity command is called the affinity mask and is defined as a hexadecimal number. However, the CPU core number can be calculated more easily using binary numbers. For instance, the command

C:\Windows\System32\cmd.exe /C start /affinity 3 dfrgui.exe

will launch Disk Defragmenter on both CPU 0 and CPU 1. If you convert 3 into a binary number you will get 0011. Under the affinity mask system, processors are numbered from the right to left beginning with 0 and since there are 1’s in the first two places, this indicates CPU 0 and CPU 1.

Suppose you have a Quad core processor. If so and you use an affinity mask of 4, that will convert into binary 0100, which indicates CPU 2. If you use an affinity mask of 9, that will convert into binary 1001, which indicates CPU 0 and CPU 3."
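In other words, the affinity mask is just a bitmask with one bit per core: bit n set means the process may run on CPU n, so the mask for a set of cores is the bitwise OR of 1 << core_index. A quick sketch of that arithmetic in C (mask_for_cores is only an illustrative helper, nothing Windows-specific):

#include <stdio.h>

/* Build an affinity mask from a list of core indices:
   bit n set in the mask means "may run on CPU n". */
unsigned long mask_for_cores(const int *cores, int count) {
    unsigned long mask = 0;
    for (int i = 0; i < count; i++)
        mask |= 1UL << cores[i];
    return mask;
}

int main(void) {
    int cpu0_and_1[] = {0, 1};  /* -> 3, binary 0011: CPU 0 and CPU 1 */
    int cpu2_only[]  = {2};     /* -> 4, binary 0100: CPU 2           */
    int cpu0_and_3[] = {0, 3};  /* -> 9, binary 1001: CPU 0 and CPU 3 */
    printf("%lu %lu %lu\n",
           mask_for_cores(cpu0_and_1, 2),
           mask_for_cores(cpu2_only, 1),
           mask_for_cores(cpu0_and_3, 2));  /* prints: 3 4 9 */
    return 0;
}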

Thank you for your answers, Issam.

I tried starting it with the “start” command, but I had the same results.

In fact, I’m aware of the core-affinity trick; it’s the solution I first chose for my game servers.
But I get CPU peaks every time a server loads a map, so pinning my servers to one core isn’t viable: every CPU peak impacts the other game servers sharing that core (yes, that’s a bit tricky to explain).

I want Windows to handle the core balancing so I can use the whole CPU capacity.
Several Windows administrators also firmly discouraged me from using core affinity, saying Windows should be the one handling all this…

Hey there,

That is really odd. I’ve had a chat with the devs about it; could you file a bug report so they can have a closer look at it?

Thanks

Yep, I’ll be working on it today.

There is no other option, because Unity still has not fixed the bug.
We already reported it back when batchmode first became usable as a proper mode; no idea which version that was, somewhere in the Unity 2.x days.

If you know that there is something else going on on the system too, the easiest approach is to declare one core a ‘loader core’ where the other non-gaming stuff also happens (I guess you have some kind of cron-job / server-master app running that starts and stops instances based on demand it gets from some backend). Newly launched instances get pinned to that core through the command line above; once they have finished loading, they switch their affinity to the core they are actually meant to use (you can pass that core on the command line, in a config, or whatever) through Win32, as in the sketch below.

It’s dirty and it’s extra work, but it at least works.
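For what it’s worth, here’s a minimal sketch of that affinity switch in plain C against the Win32 API; taking the target core from argv[1] is just an assumed convention (a config entry works just as well):

#include <windows.h>
#include <stdlib.h>

int main(int argc, char **argv) {
    /* Assumed convention: the launcher passes the target core index
       as the first command-line argument. */
    int core = (argc > 1) ? atoi(argv[1]) : 0;
    DWORD_PTR mask = (DWORD_PTR)1 << core;  /* one bit per allowed CPU */

    /* ...do the heavy loading here, still pinned to the loader core
       by the start /affinity line that launched us... */

    /* Loading done: move this process onto its dedicated core. */
    if (!SetProcessAffinityMask(GetCurrentProcess(), mask))
        return (int)GetLastError();

    /* ...run the game server from here on... */
    return 0;
}

From a Unity script the same call should be reachable via P/Invoke or a small native plugin.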

That’s what I had in mind as a last-resort solution, but it would mean “wasting” a core we could use to host games.

Good news! The dev I spoke to has already fixed it, so no need to report it. The fix will only be released in 4.0, though.

There will be a 3.5 fix too!!

That’s good news!
What’s the ETA for the next 3.5 release? :)

No ETA as of yet but I will try and keep you posted!

Thank you very much!

Thanks for the information, that’s great news.

Fixed in 3.5.6!!! \o/
http://unity3d.com/unity/whats-new/unity-3.5.6

Yup, WOOHOO! Sorry I didn’t post on here earlier!