Okay, anyway (I only showed that so you know how I'm doing it). But here's the pressing issue I've noticed.
Not sure if it's 100% accurate.
Okay, the image below shows exactly what I am talking about…
The Graphics stats Render Thread says 2651 FPS, but the real FPS shows 158. Is 158 really the real FPS?
Or am I doing this wrong somehow? I could understand if the real FPS were around 2200, just not 158; that seems rather low with VSync off compared to the 2651 in the editor.
So what exactly is making such a substantial difference? What makes the render thread number skyrocket so high if it isn't even realistic according to the real FPS?
I'm just confused as to why they are so far apart. And as mentioned, I know the render thread stat in the editor is different from the real FPS; I just don't understand why there's such a big gap between them.
I'd hazard a guess that the "Graphics" stat calculates "FPS" based on the amount of time it took to render the scene, i.e. 1.0 / (render function duration). That value will be unrealistic in almost all cases.
The real value would be based on the amount of time that passed between the last frame and this one. That is what you see in your Debug.Log output.
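For example, a minimal sketch of that deltaTime-based measurement (assuming you're logging once per frame from Update, as in the documentation example):

```csharp
using UnityEngine;

// Minimal sketch: measuring the "real" FPS from the time between frames.
// Time.deltaTime is the time since the last frame, so 1 / deltaTime is the
// frame rate the player actually experiences.
public class RealFpsLogger : MonoBehaviour
{
    void Update()
    {
        float realFps = 1.0f / Time.deltaTime;
        Debug.Log("Real FPS: " + realFps); // Debug.Log itself adds overhead
    }
}
```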
While 165 is not the 600–1200 FPS you could get writing your own application from scratch in C++, given the amount of work going on in Unity's renderer, coupled with a possible debug build, the overhead introduced by Debug.Log (it is VERY slow), and the editor UI, 165 frames per second is okay.
Yep, Debug.Log is super slow, but the FPS is probably calculated before Debug.Log does its slow thing. I think the editor FPS is calculated based on how much CPU time the editor is using, plus some guess at how high the FPS would be if no editor were present. It doesn't really work well, though. I'm doing the same as you, but with a UI text instead of Debug.Log: the editor says 600–800 FPS and the counter says 150–200 FPS.
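Something like this rough sketch of a UI-text counter (the smoothing factor and field names are my own assumptions, not the documentation example):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Rough sketch of an on-screen FPS counter; names and smoothing are assumptions.
// Exponential smoothing keeps the displayed number readable instead of flickering.
public class UiFpsCounter : MonoBehaviour
{
    public Text fpsText;               // assign a UI Text in the inspector
    float smoothedDelta = 1f / 60f;    // start from a sane guess to avoid dividing by zero

    void Update()
    {
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
        fpsText.text = Mathf.RoundToInt(1f / smoothedDelta) + " FPS";
    }
}
```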
@Tautvydas-Zilys was correct. I mean, I knew that doing FPS testing and such in the editor wasn't the greatest idea.
However, that example came directly from the documentation (granted, a variable name or two changed), and aside from that, he was correct: testing outside the editor gave a consistent 1110 frame rate on a UI text.
Granted, that's still nowhere near the render thread FPS, but better than 165, haha.
So I'm guessing 1110 is the true FPS?
Your real frame time will always be between max(main_thread_time, render_thread_time) and main_thread_time + render_thread_time, depending on how much the two threads overlap. At the beginning of each frame, the main thread always waits for the render thread to finish rendering the previous frame.
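To make that concrete with made-up numbers: if the main thread takes 4 ms and the render thread takes 6 ms, the real frame time lands somewhere between 6 ms (~167 FPS, full overlap) and 10 ms (100 FPS, no overlap):

```csharp
using UnityEngine;

// Worked example with assumed timings: 4 ms main thread, 6 ms render thread.
public static class FrameTimeBounds
{
    public static void Example()
    {
        float mainThreadMs   = 4f;
        float renderThreadMs = 6f;

        // Best case: the two threads fully overlap, so the slower one sets the pace.
        float bestCaseMs  = Mathf.Max(mainThreadMs, renderThreadMs); // 6 ms  -> ~167 FPS

        // Worst case: no overlap, so the two times add up back to back.
        float worstCaseMs = mainThreadMs + renderThreadMs;           // 10 ms -> 100 FPS

        Debug.Log($"Frame time between {bestCaseMs} ms and {worstCaseMs} ms");
    }
}
```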
Bear in mind that FPS is nonlinear with respect to changes in workload, i.e. a 10 FPS drop means very different things when you're running at 2000 FPS compared to when you're running at 30 FPS.
For that reason I always recommend measuring performance in mSPF (milliseconds per frame) instead of FPS. You might want to use FPS just to figure out your target mSPF at the beginning (i.e. 60 FPS = 16.6 ms/frame), but after that, stick to mSPF.
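A quick sketch of the conversion, with illustrative numbers, to show why the same 10 FPS drop looks so different in milliseconds:

```csharp
using UnityEngine;

// Sketch: converting FPS to milliseconds per frame (mSPF) makes the
// nonlinearity obvious. The numbers below are just illustrative.
public static class FrameBudget
{
    public static float MsPerFrame(float fps) => 1000f / fps;

    public static void Example()
    {
        // 2000 FPS -> 1990 FPS: frame time grows by ~0.0025 ms. Negligible.
        Debug.Log(MsPerFrame(1990f) - MsPerFrame(2000f));

        // 30 FPS -> 20 FPS: frame time grows by ~16.7 ms. Huge.
        Debug.Log(MsPerFrame(20f) - MsPerFrame(30f));

        // Target budget example: 60 FPS = 16.6 ms/frame.
        Debug.Log(MsPerFrame(60f));
    }
}
```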