Okay, so I just don't understand how Unity's FPS numbers work.
I'm trying to find out how much performance headroom I actually have before I start accumulating tech debt with shaders and high-poly meshes. But right out of the gate I've hit a wall with performance profiling.
The Unity Editor's Stats window says I should be getting a steady 85-120 FPS, which I wouldn't at all mind starting from, but as you can see the standalone build is running at 30 FPS. I've verified these numbers with the Windows Game Bar performance tool, the Advanced FPS asset, and the third-party RenderDoc, and they're all darn close to each other, and very far from what Unity displayed.
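(For reference, when I say I measured FPS in the build, I mean something like this minimal counter; it's only a sketch with an arbitrary smoothing factor, not what Advanced FPS actually does, and SimpleFpsCounter is just a name I made up:)

```csharp
using UnityEngine;

// Minimal sketch of an in-build FPS counter, independent of the Editor Stats window.
// The 0.05f smoothing factor is arbitrary; Advanced FPS and Game Bar do their own averaging.
public class SimpleFpsCounter : MonoBehaviour
{
    float smoothedDelta;

    void Update()
    {
        // Use unscaledDeltaTime so Time.timeScale doesn't skew the reading.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.05f);
    }

    void OnGUI()
    {
        float fps = smoothedDelta > 0f ? 1f / smoothedDelta : 0f;
        GUI.Label(new Rect(10, 10, 220, 24), $"{fps:0.} FPS ({smoothedDelta * 1000f:0.0} ms)");
    }
}
```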
My last two images show a default HDRP outdoor scene with and without lights, and unoptimized Expanse is running at the same FPS as the default HDRP scene (roughly 60 FPS).
And I don't get why: Expanse is surely far more expensive to run than Unity's default scene, yet Expanse with its default settings displays 180-200 FPS in the Unity Editor.
This is Unity 2020.3.11, and the latter three images were all captured from a virtually unmodified new project.
Does anyone know what might be going wrong here? Thank you in advance!
Here are the demonstrables:
Here's the scene I've spent all day trying to optimize, with Expanse sky, Crest Ocean, Unity fog, and Unity terrain. (30 FPS)
Here's Expanse in a scene by itself with its default settings, which are very high. (60 FPS)
Here's a basic outdoor HDRP scene with no light or volume. (61 FPS)
Here's a basic outdoor HDRP scene with the default light and volume. (50 FPS)
VSync Count was set to Every V Blank, but I changed it to Don't Sync and that had no effect.
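(I checked VSync first because a steady 30 FPS on a 60 Hz display looks like a half-refresh lock. For anyone who wants to rule it out in code rather than in Project Settings, this sketch is the equivalent; DisableVSync is just a name I made up, and note that vSyncCount applies per quality level:)

```csharp
using UnityEngine;

// Equivalent of switching VSync Count from "Every V Blank" to "Don't Sync".
// vSyncCount is stored per quality level, so every level in use needs it set.
public class DisableVSync : MonoBehaviour
{
    void Awake()
    {
        QualitySettings.vSyncCount = 0;      // 0 = Don't Sync, 1 = Every V Blank
        Application.targetFrameRate = -1;    // -1 = uncapped (platform default)
    }
}
```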




