Hello everybody, here we go again with another performance overview update
Today I ran the performance tests of my game with Unity 2021.1.0a2 and 2018.1.9f2. I present the results below, just as I did during older beta cycles.
Older Reports
According to my test, 2021.1.0a2 is slower than 2018.1, which is still the fastest Unity version with my project. I profiled all builds listed in the graphs below during the same session.
You can run the test using the project attached to the following bug-report:
(Case 1108597) Project to find performance regressions and bugs
How does the test work?
The indoor first-person shooter I’m working on features an automated performance test. It works like this:
Camera is placed at a defined position in the scene
Camera makes a full 360 degree rotation around the Y-axis within 20 seconds (a slow, steady rotation around Y)
Every second the test captures the average CPU frame time of the last second
The game runs just as it normally would, but the camera/player does not move and the enemy AI is unable to see the player.
The performance test basically measures the “baseline” of the game (rendering, shadows, realtime lighting, physics, audio, NavMesh, UI, game code). When an actual player plays the game, more things happen that the test doesn’t cover, such as visual and sound effects when firing weapons, additional AI hunting the player and more navigation pathing.
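For readers who want to build something similar, here is a minimal sketch of such a test loop as a MonoBehaviour on the camera. The class, fields and constants are my own illustration and not taken from the actual project; the real test presumably reads frame times from Unity’s profiling data rather than from delta times, so treat this only as an outline of the sampling logic.

using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: rotate the camera 360 degrees around the Y-axis over
// 20 seconds and record the average frame time of each elapsed second.
public class PerformanceTest : MonoBehaviour
{
    const float TestDuration = 20f;                      // seconds for the full rotation
    readonly List<float> samples = new List<float>();    // one value per second, in milliseconds

    float elapsedTotal;     // time since the test started
    float elapsedSecond;    // time accumulated within the current second
    int framesThisSecond;

    void Update()
    {
        // 360 degrees / 20 seconds = 18 degrees per second around Y.
        transform.Rotate(0f, (360f / TestDuration) * Time.deltaTime, 0f);

        elapsedTotal += Time.unscaledDeltaTime;
        elapsedSecond += Time.unscaledDeltaTime;
        framesThisSecond++;

        if (elapsedSecond >= 1f)
        {
            // Average frame time of the last second, in milliseconds.
            samples.Add(elapsedSecond / framesThisSecond * 1000f);
            elapsedSecond = 0f;
            framesThisSecond = 0;
        }

        if (elapsedTotal >= TestDuration)
            enabled = false;    // 20 samples captured, test finished
    }
}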
I run this test in a Standalone Windows 64bit (Mono) Player:
2018 = .NET3.5
2020 = .NET4.x
I’ve also enabled “Incremental GC” on 2019.
The following Physics settings are applied:
Physics.autoSyncTransforms=false
Physics.autoSimulate=true
Physics2D.autoSyncTransforms=false
Physics2D.autoSimulate=false
The resolution is set to fullscreen 320x240 (D3D11) to make sure the bottleneck isn’t the GPU. The game uses the built-in deferred renderer.
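To illustrate where these settings could live, here is a small setup sketch using the actual Unity property names (Physics.autoSimulation rather than the autoSimulate shorthand above). The script name is my own, and the graphics API (D3D11) is configured in the Player settings rather than in code.

using UnityEngine;

public class TestSetup : MonoBehaviour
{
    void Awake()
    {
        // Physics settings as listed above.
        Physics.autoSyncTransforms = false;
        Physics.autoSimulation = true;
        Physics2D.autoSyncTransforms = false;
        Physics2D.autoSimulation = false;

        // Tiny fullscreen resolution so the CPU, not the GPU, is the bottleneck.
        Screen.SetResolution(320, 240, true);
    }
}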
The y-axis represents the average CPU time in milliseconds, i.e. how long “one frame” took. The x-axis represents the sample points while the camera makes its 360° rotation. Each test runs for 20 seconds, with one sample every second, thus 20 samples for each test. Fewer milliseconds (vertical axis) indicate better performance.
I profile each build three times and the graphs below display the minimum sample value (best performance) of these three profiling runs. Profiling multiple times improves the overall robustness of the test: in the past I only profiled a build once, and things like OS activity sometimes affected the timings, which made the graph a bit more spiky. Taking the minimum value at each sample point means the graph represents the best performance of the three runs and reduces the error that OS activity might otherwise introduce.
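As a rough illustration of that aggregation step (my own sketch, not code from the project), taking the per-sample minimum across the three runs looks like this:

using System;

static class SampleAggregator
{
    // For every sample index, return the minimum (= best performance)
    // frame time across all profiling runs.
    public static float[] MinPerSample(float[][] runs)
    {
        int sampleCount = runs[0].Length;
        var result = new float[sampleCount];

        for (int i = 0; i < sampleCount; i++)
        {
            float min = float.MaxValue;
            foreach (float[] run in runs)
                min = Math.Min(min, run[i]);
            result[i] = min;
        }
        return result;
    }
}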
These issues are probably not reproducible if you’re running the game on high-end hardware. Please use a rig similar to the hardware I used to submit the bug-report.
I feel that for graphs like this to matter, we should see more near-term results on the same graph instead of just comparing against what was once fastest. 2018.1 is not a feasible platform for many to use, as it has no support now and hasn’t had any in years.
IMHO it would be nicer to see a comparison against 2018.4 LTS, 2019.4 LTS and the latest 2020.1 + 2020.2 beta on the same graph.
You manually change to .NET 4.x on 2020+? I think it defaults to .NET Standard 2.0 nowadays. I’d like the perf tests to use whatever defaults these engines set for us, as that’s basically what Unity “recommends” for us.
I’ll take the 1ms overhead …good to know that’s all it is - thanks @Peter77 - and looking forward to hearing how the test with .NET Standard 2.0 goes if you do one, as that’s what I use.
If “what was once fastest” isn’t the benchmark for performance, then what is?
I guess the option would be to accept a steady performance drop, year after year. Which means that if the benchmark isn’t the fastest release, it goes undetected that recent Unity releases might be significantly slower.
Because of that reason I should lower the bar and accept the performance drop?
I don’t know if this would do any of us a favor in the long run. It would probably encourage Unity Technologies to care even less about these things, “Because who cares?!”.
However, the point you made is excellent. It shows that Unity Technologies are unable to fix performance regressions in a timely manner, where “timely manner” really means a 2-3 year time frame so far.
I wouldn’t have to rely on an unsupported version in the first place if they just fixed such issues faster (or at all).
That’s actually what I’m trying to achieve with these performance overview threads, right? I want them to fix the regressions, so that I can use a new and supported Unity version as the benchmark instead!
I did this from time to time. The general trend is “every version other than 2018.1 is slower”: https://discussions.unity.com/t/780489 page-2#post-6103425
Hmm, not that I know of.
100%
All my “Performance Overview” posts are related to performance only. They serve to show Unity Technologies that when I open an existing project in a newer Unity version, it runs slower.
I’m fully aware that LTS releases should be the better option for stability, or that TECH releases could come with improved usability, but that’s not what the performance overview posts are about.
My point was more that, for me, it’s pointless to know that a version which practically was never production-ready is this much faster. I’d really love to see an actual comparison between major Unity releases. You already have all the data in spreadsheets, so it shouldn’t take much effort to put at least the previous LTS releases in the graph, no?
Ever since Unity introduced the TECH stream + LTS model, it has pretty much been the same as treating everything but the LTS versions as previews, since TECH streams have a super short “support” cycle. I personally couldn’t care less that some individual old TECH or beta release performed faster than the current version, but I’d really love to see the regression against LTS or some other “proper” releases. The only TECH/beta performance I really care about is the current one, as it shapes the upcoming LTS. But of course this is just my POV.
I do appreciate you doing these tests; I’m just trying to point out how they could give more data to the people watching these threads. Opening one of these threads now doesn’t really tell you whether there has been any regression against, say, the previous LTS, unless you open your previous threads and try to guesstimate the values from the graphs.
As an additional thought: since this benchmark uses the old built-in renderer, which hasn’t practically been developed in years, it would be nice if someone did a similar URP/HDRP benchmark to see how the performance has progressed there with the latest release/verified packages. I bet you’d see quite a lot more variance (and probably better performance on newer Unity versions too, not just regressions).
I have to agree that the graph would be more interesting/useful to see not only how it compares to the fastest previous version, but how it compares to recent versions as well.
It might be nice to see the fastest version, the most recent two LTS versions (e.g. 2018.4.x and 2019.4.x), and possibly the previous/current TECH release (e.g. 2020.1.x) if appropriate, all in one chart.
That will give us a better idea of the overall trajectory of the Editor’s speed over time. Is it getting closer to the ideal/fastest previous speed as each subsequent version is released? Or is it getting even slower than previous releases?
With only the latest benchmarks compared to the fastest, this context is missing and it is hard to know if Unity is improving or getting worse.
But I realize that in suggesting this, I’m not the one who has to do the extra work. So even if you decide to keep things how they are, I appreciate you making these posts! I certainly follow them with interest every time one crops up.
I don’t see the point in such a benchmark, to be honest.
Engines and games evolve and implement more features that make them run slower, while hardware manufacturers release faster hardware. This is how our industry has been since the first video games came out.
If you could somehow disable all the new features and benchmark it, that would actually be more meaningful. Maybe once Unity ships more parts of the engine as packages, this might be doable by disabling packages.
I really appreciate you taking the time to do this @Peter77
A few days ago, I upgraded my (Android) project from 2019.2 to 2019.4 LTS, just to see the performance drop by 30%-40% on my test devices! I went back to 2019.2 and the problem was solved, so definitely something is wrong with newer Unity versions.
I agree that it would be nice to see a comparison against 2018.4 LTS and 2019.4 LTS, to see if there is any performance improvement compared to the LTS versions.
So far I just didn’t think of this as a service for the community. By the way, I did this on multiple occasions on request: https://discussions.unity.com/t/780489 page-2#post-6103425
It’s a great idea, but the “benchmark” is actually a complete first person shooter game that runs on its own for these tests. Unfortunately I can’t make the project public.
The project uses deferred rendering, which isn’t available in URP at the time of writing. Plus, the last time I checked URP forward, it caused all sorts of light-flicker issues in this project. A few days ago, I tested URP for a different project and found various shadow issues. Took me less than a minute. Not sure how all this can actually pass QA, but that’s a story for another book.
I haven’t tried HDRP though.
In either case, I would need to recreate all the custom shaders, post-processing, etc., which is why I wrote it’s not that simple.
The game isn’t GPU bound either; I perform all tests at 320x240.
However, the rendering side of the game is also really simple. The last time I checked with the Profiler, the built-in renderer appeared slightly slower in newer Unity releases. The regression does not seem to come (entirely) from rendering.
I repeated the test with .NET Standard 2.0; it doesn’t make a difference performance-wise. It would have been so awesome if 2021 had suddenly outperformed 2018.
2021.2.0a4 with .NET Standard 2.0
Hello again. I just spent around 14 hours doing a lot of tests, and this is what I can say:
Profilers tend to show performance improvements with each newer Unity version, but when building to devices, we actually get a performance drop (tested with Android devices only).
Unity 2019 LTS is an FPS nightmare. If you are using that version, I recommend downgrading to 2019.2 or upgrading to 2020.1 and using URP if possible.
URP gives a huge GPU performance improvement (and when I say huge, I mean HUGE), but as the CPU time stays the same, FPS doesn’t increase (at least with my project).
Unity 2018.1 has really good performance, as @Peter77 shows in his graphs, but I somehow got better performance during the tests using the latest 2019.2 version, which is 2019.2.21. Keep in mind that this could be because I’ve been optimizing the game for the past 3 months using 2019.2.13 (so the game may be “shaped” for 2019.2). It would be cool if someone could test this with their project and see if they also get better performance with 2019.2.21. (And yes, I was using the built-in render pipeline for both the 2018 and 2019 tests.)
I always see these performance comparison topics; they are very good for showing the evolution of Unity.
Peter’s work is incredible, but it would be great to have a standardized test created by the community, something that everyone could run (because of the different hardware) and that could be reused across versions.
For example: create a project that could test each area of Unity (physics, animation, RAM usage, scripting, and average FPS). That would be cool… it doesn’t even have to be a real game, just a simulation of a possible game, with textures, animation and things happening… it would already be an excellent basis for comparison (especially if it includes the SRPs).
These tests are frankly Unity’s responsibility. If they want to pay people to design and run the tests for them (I mean, I’d be all for them hiring Peter77), then that’s fine, but otherwise Unity should be doing the testing.
When you go to a restaurant as a customer, is it your responsibility to organise taste testing sessions so the chef can improve his food?