While migrating our project (a fairly complex game) from Unity 2021.1.11f1 to 2021.2.0b6 we noticed that our average FPS went down significantly.
On our “heavy benchmark scenario” the average frame duration went up from 26.1ms to 31.6ms (FPS dropped from 38.3 to 31.7). Here’s a breakdown by percentile for a 30 second benchmark:
Percentile 95.0: 34.1 ms → 34.3 ms
Percentile 98.0: 34.3 ms → 49.9 ms (!)
Percentile 99.5: 34.4 ms → 50.1 ms
Percentile 99.9: 50.3 ms → 50.5 ms
So the 95th percentile stayed the same but there are constant spikes, about once every 1-2 seconds.
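For context, our benchmark harness boils down to something like the sketch below (class and method names are illustrative, not our actual code): record the unscaled frame time every frame for 30 seconds, then sort the samples and read off the percentiles.

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Minimal frame-time percentile benchmark sketch.
public class FrameTimeBenchmark : MonoBehaviour
{
    const float BenchmarkSeconds = 30f;
    readonly List<float> frameMs = new List<float>();
    float elapsed;

    void Update()
    {
        if (elapsed >= BenchmarkSeconds) return;

        elapsed += Time.unscaledDeltaTime;
        frameMs.Add(Time.unscaledDeltaTime * 1000f); // duration of the previous frame in ms

        if (elapsed >= BenchmarkSeconds) Report();
    }

    void Report()
    {
        var sorted = frameMs.OrderBy(ms => ms).ToList();
        Debug.Log($"Avg: {frameMs.Average():F1} ms, " +
                  $"P95: {Percentile(sorted, 95f):F1} ms, " +
                  $"P98: {Percentile(sorted, 98f):F1} ms, " +
                  $"P99.5: {Percentile(sorted, 99.5f):F1} ms, " +
                  $"P99.9: {Percentile(sorted, 99.9f):F1} ms");
    }

    // Nearest-rank percentile over an already sorted list.
    static float Percentile(List<float> sorted, float p)
    {
        int index = Mathf.Clamp(Mathf.CeilToInt(p / 100f * sorted.Count) - 1, 0, sorted.Count - 1);
        return sorted[index];
    }
}
```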
We would normally suspect GC but GC spikes are usually rarer and longer and the project is fairly well optimized with regards to allocations.
We are still investigating it but so far we could not figure out what is causing those spikes. The profiler didn’t help much, because of how the profiler itself affects performance. We’ll update this thread if we find out anything.
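In the meantime we're leaning on a cheap in-game spike logger instead of the full profiler, roughly like the sketch below (the 45 ms threshold is arbitrary). It logs any long frame together with whether a Gen0 GC collection happened since the previous frame, which at least lets us rule GC in or out for those spikes.

```csharp
using UnityEngine;

// Lightweight spike logger that can stay enabled in a build.
public class SpikeLogger : MonoBehaviour
{
    const float SpikeThresholdMs = 45f;
    int lastGen0;

    void Update()
    {
        float ms = Time.unscaledDeltaTime * 1000f;          // previous frame duration
        int gen0 = System.GC.CollectionCount(0);             // total Gen0 collections so far

        if (ms > SpikeThresholdMs)
            Debug.Log($"Spike: {ms:F1} ms at frame {Time.frameCount}, " +
                      $"GC ran since last frame: {gen0 != lastGen0}");

        lastGen0 = gen0;
    }
}
```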
Has anyone else seen degraded performance in 2021.2? Do you have any clues where we should look?
We did some additional benchmarking. The original benchmark had one caveat: we left VSync turned on, which skewed the data, so the new benchmarks were run with VSync turned off.
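For reference, the benchmark scenes now run a small bootstrap like this before measuring (the script name is illustrative), so VSync can't cap or quantize the frame times:

```csharp
using UnityEngine;

// Runs once at scene start to make sure nothing throttles the frame rate during the benchmark.
public class BenchmarkSetup : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;    // turn VSync off
        Application.targetFrameRate = -1;  // remove any frame rate cap
    }
}
```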
The general conclusion is that on three different Windows machines we see a degradation in average and median frame duration of about 1 ms. On a Macbook Pro we see an improvement of 3-4 ms, but on that same machine connected to an eGPU we see a degradation of 5 ms.
Below are the results for one of the Windows machines. Numbers after → are for 2021.2.0b6.
The results were very similar on all three Windows configurations even though they had very different CPUs and GPUs (including one Radeon).
And here are the results for the Macbook Pro with an external Radeon:
Disabling Depth Priming further improves the FPS on Windows. The difference between 2021.1.11f1 and 2021.2.0b6 is now only 0.3 - 0.6 ms (in favor of 2021.1.11f1).
The problem with disabling depth priming is that it results in an “A non-multisampled texture…” error being thrown every frame, described here:
So Depth Priming is very project/scene dependent when it comes to performance gains/loss.
I assume you do not have any SSAO (or post-processing effects dependent on scene depth) enabled, and/or you are using MSAA?
If so, forcing Depth Priming makes you pay the cost of a depth prepass and a resolve pass for the MSAA depth.
If your scene doesn’t have enough opaque overdraw and/or expensive shaders, you won’t be able to overcome the added cost of Depth Priming to see any gains, sadly.
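For anyone who wants to A/B this quickly: Depth Priming Mode is just a dropdown on the Universal Renderer Data asset, but an editor snippet along these lines works too. I'm assuming the public depthPrimingMode property that the URP 12 (2021.2) Universal Renderer exposes; if your URP version doesn't have it, use the inspector dropdown instead.

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Rendering.Universal;

// Editor-only sketch: toggles Depth Priming on the Universal Renderer Data asset
// currently selected in the Project window. Assumes the depthPrimingMode property
// available in URP 12+; otherwise change the setting in the inspector.
public static class DepthPrimingToggle
{
    [MenuItem("Tools/Toggle Depth Priming On Selected Renderer Data")]
    static void Toggle()
    {
        var data = Selection.activeObject as UniversalRendererData;
        if (data == null)
        {
            Debug.LogWarning("Select a Universal Renderer Data asset first.");
            return;
        }

        data.depthPrimingMode = data.depthPrimingMode == DepthPrimingMode.Disabled
            ? DepthPrimingMode.Forced
            : DepthPrimingMode.Disabled;

        EditorUtility.SetDirty(data);
        Debug.Log($"Depth priming mode is now {data.depthPrimingMode}");
    }
}
#endif
```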
2021.1.25f1 => 2021.2.7f1 upgrade: Drastic decline in performance (editor & builds). No change to project, just a direct upgrade and test.
This is a VR project.
2021.1.25f1 can handle the game at 144 Hz with plenty of headroom. 2021.2 can’t even handle 90 Hz.
Yeah, somehow my game is running like 50% slower in 2021.2.16f compared to 2021.1.24f. Very strange. I did upgrade to DX12 too, so I can’t be sure what it was like before, but it felt sluggish.
I have no idea how to get Unity to repro this problem. It only seems to affect certain machines at work. It runs fine on some lower-end laptops, but on others it just bogs down like crazy.
Can you think of any way to get Unity to see this?
Swapped back to DX11, which sped things up a lot, but still not like it was. Yeah, submit a bug report. I know a couple of guys too and I will message them, but nearly always they just say to submit a report, which never goes well for me.
I’m looking through the masses of HDRP options, thinking some might be on by default that weren’t before. Right now I’m still trying to speed it back up. Interesting how it runs on lower-end laptops; I’ve got a 2080 and an i9 here and it seems quite a bit slower. DX12 was a shock, super slow.
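If anyone else wants to script the DX12 → DX11 switch instead of clicking through Player Settings, an editor snippet like this should do it (the menu item name is made up; the PlayerSettings calls are the standard editor API, but double-check the build target matches your project):

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine.Rendering;

// Editor-only sketch: forces Direct3D11 to the front of the graphics API list
// for Windows standalone, the same change as in Player Settings > Other Settings.
public static class ForceDX11
{
    [MenuItem("Tools/Force DX11 On Windows")]
    static void Apply()
    {
        // Stop Unity from picking the platform default API automatically.
        PlayerSettings.SetUseDefaultGraphicsAPIs(BuildTarget.StandaloneWindows64, false);

        // Put Direct3D11 first so builds (and the editor, after a restart) use it.
        PlayerSettings.SetGraphicsAPIs(BuildTarget.StandaloneWindows64,
            new[] { GraphicsDeviceType.Direct3D11, GraphicsDeviceType.Direct3D12 });
    }
}
#endif
```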