Today I ran the automated performance test of my game with Unity 2018.2.0b1 and present the results below, just as I did during earlier beta cycles:
- Unity 2017.2 Performance Overview
- Unity 2017.3 Performance Overview
- Unity 2018.1 Performance Overview
With my project, 2018.2.0b1 is slower than 2018.1.0b9, but still faster than Unity 2017.x. In some cases, 2018.2.0b1 performs about 0.7ms slower than 2018.1.
I haven’t looked into what exactly is causing this performance difference yet.
How does the performance test work?
The (first-person) game I’m working on features an automated performance test. It works like this:
- The camera is placed at a defined position in the scene
- The camera makes a 360° rotation around the Y-axis within 20 seconds (slowly rotating around y)
- Every second, the test captures the average CPU frame time of the last second
The game runs just like it normally would, but the camera/player does not move and the AI is unable to see the player.
It measures the “baseline” of the game, so to speak. When an actual player plays the game, more is going on than the test covers, such as visual and sound effects from firing weapons, additional AI hunting the player and more navigation pathing.
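To make the procedure more concrete, here is a minimal sketch of how such a test could be implemented. It is not the actual code from my project; the class name PerformanceTestCamera and the use of Time.unscaledDeltaTime as the CPU frame-time source are assumptions.

```csharp
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of the automated test: rotate the camera 360° around
// the Y-axis over 20 seconds and record the average frame time once per second.
public class PerformanceTestCamera : MonoBehaviour
{
    const float TestDurationSeconds = 20f;               // one full 360° rotation
    readonly List<float> m_Samples = new List<float>();  // average frame time in ms, one entry per second

    IEnumerator Start()
    {
        float elapsed = 0f;      // time since the test started
        float secondTimer = 0f;  // time accumulated in the current one-second window
        int framesThisSecond = 0;

        while (elapsed < TestDurationSeconds)
        {
            yield return null;   // wait for the next frame

            float dt = Time.unscaledDeltaTime;
            elapsed += dt;
            secondTimer += dt;
            framesThisSecond++;

            // Rotate so that a full 360° turn around the Y-axis completes
            // after TestDurationSeconds.
            transform.Rotate(0f, 360f / TestDurationSeconds * dt, 0f);

            // Every second, store the average frame time of the last second.
            if (secondTimer >= 1f)
            {
                m_Samples.Add(secondTimer / framesThisSecond * 1000f); // milliseconds
                secondTimer = 0f;
                framesThisSecond = 0;
            }
        }

        Debug.Log(string.Format("Performance test finished, {0} samples captured.", m_Samples.Count));
    }
}
```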
I run this test in a Standalone Windows 64-bit (Mono) Player. The following Physics settings are applied (see the snippet after the list):
- Physics.autoSyncTransforms=false
- Physics.autoSimulate=true
- Physics2D.autoSyncTransforms=false
- Physics2D.autoSimulate=false
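A sketch of applying these settings from code, assuming the entries above correspond to the runtime API properties Physics.autoSimulation and Physics2D.autoSimulation (the "Auto Simulation" / "Auto Sync Transforms" checkboxes in the Physics and Physics 2D project settings):

```csharp
using UnityEngine;

// Hypothetical startup helper that applies the physics settings listed above.
public static class PerformanceTestPhysicsSetup
{
    [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.BeforeSceneLoad)]
    static void Apply()
    {
        Physics.autoSyncTransforms = false;    // don't sync Transform changes to the 3D physics engine automatically
        Physics.autoSimulation = true;         // 3D physics keeps stepping automatically
        Physics2D.autoSyncTransforms = false;  // same for 2D physics
        Physics2D.autoSimulation = false;      // 2D physics simulation is disabled
    }
}
```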
The y-axis represents the average CPU time in milliseconds, i.e. how long “one frame” took. The x-axis represents the sample points captured while the camera makes its 360° rotation. Each test runs for 20 seconds with one sample every second, thus 20 samples per test. Fewer milliseconds (vertical axis) indicate better performance.