FPS, is it that accurate?

Hey guys, yes, I know the editor FPS is different from the real FPS, but I just noticed something with a Debug.Log.

I have a Debug.Log showing the real FPS, calculated the way the Unity documentation does it. Here’s the code:

        ++frames;
        float timeNow = Time.realtimeSinceStartup;
        if (timeNow > lastInterval + updateInterval) {
            // Average FPS over the last updateInterval seconds.
            curFps = frames / (timeNow - lastInterval);
            frames = 0;
            lastInterval = timeNow;
        }

Anyway, I only showed that so you know how I’m calculating it. The pressing issue I’ve noticed is that I’m not sure it’s 100% accurate.

The image below shows exactly what I am talking about:

[Attached screenshot: FrameRate.jpg]

The Graphics stats say the render thread is at 2651 FPS, but the real FPS shows 158. Is that really the real FPS, 158?
Or am I doing something wrong? I could understand if the real FPS came out around 2200, but 158 seems rather low with VSync off compared to the 2651 shown in the editor.

So what exactly is causing such a substantial difference? What makes the render thread number skyrocket so high if it isn’t even remotely realistic compared to the real FPS?

I’m just confused as to why the two are so far apart. As mentioned, I know the editor’s render thread number is different from the real FPS; I just don’t understand why the gap is so big.

Looking forward to hearing your takes on this.


I’m guessing it’s the Debug.Log. Isn’t it super bad performance-wise?

I’d hazard a guess that the “Graphics” stat calculates its “FPS” from the amount of time it took to render the scene, i.e. 1.0 / “render function duration”. That value will be unrealistic in almost all cases.

The real value would be based on the amount of time that passed from the last frame to this one. That’s what you see in the Debug.Log.
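
To make that concrete, here’s a minimal sketch of a counter based on the time between frames, which is what that real number reflects (the class name and smoothing factor are mine, just for illustration):

    using UnityEngine;

    // Measures "real" FPS from the wall-clock time between consecutive frames,
    // which is what the Debug.Log approach in the first post is effectively reporting.
    public class DeltaFpsCounter : MonoBehaviour {
        float smoothedDelta = 1f / 60f;

        void Update() {
            // Exponentially smooth the per-frame delta so the displayed number is readable.
            smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.1f);
        }

        void OnGUI() {
            GUILayout.Label(string.Format("~{0:0} FPS ({1:0.0} ms/frame)",
                1f / smoothedDelta, smoothedDelta * 1000f));
        }
    }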

While 165 is not the 600-1200 FPS you can get when writing your own application from scratch in C++, given the amount of work going on in Unity’s renderer, coupled with a possible debug build, the overhead introduced by Debug.Log (it is VERY slow), and the editor UI, 165 frames per second is okay.


Yep, Debug.Log is super slow, but the FPS is probably calculated before Debug.Log does its slow thing. I think the editor FPS is calculated from how much CPU time the editor is using, plus some guess at how high the FPS would be if the editor weren’t present. It doesn’t really work well, though. I’m doing the same as you, but with a UI text instead of Debug.Log: the editor says 600-800 FPS and my counter says 150-200 FPS.
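
For reference, pushing the value to a UI text instead of logging it looks roughly like this (a sketch using the UnityEngine.UI.Text component; the label field, interval, and class name are mine, and the label would be assigned in the Inspector):

    using UnityEngine;
    using UnityEngine.UI;

    // Displays FPS on a UI Text so the counter itself doesn't pay the Debug.Log cost.
    public class UiFpsDisplay : MonoBehaviour {
        public Text label;                   // assigned in the Inspector
        public float updateInterval = 0.5f;

        int frames;
        float lastInterval;

        void Update() {
            frames++;
            float now = Time.realtimeSinceStartup;
            if (now > lastInterval + updateInterval) {
                label.text = string.Format("{0:0.0} FPS", frames / (now - lastInterval));
                frames = 0;
                lastInterval = now;
            }
        }
    }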

What do you see in the profiler?

Yes, you are. You’re measuring performance in the editor, which is useless and shouldn’t ever be done. Measure the built game instead.

The difference comes from the fact that the editor also needs to render other things besides the Game view.

Thanks for the replies guys!

@Tautvydas-Zilys was correct. I knew that testing FPS and such in the editor wasn’t the greatest idea; however, that example came directly from the documentation (granted, a variable name or two changed). Aside from that, he was right: testing outside the editor gave a consistent 1110 frame rate on a UI text.

Granted, that’s still nowhere near the render thread FPS, but better than 165, haha.
So I’m guessing 1110 is the true FPS?

Your real frame time will always be between max(main_thread_time, render_thread_time) and main_thread_time + render_thread_time, depending on how much overlap the two threads get. At the beginning of each frame, the main thread always waits for the render thread to finish rendering the previous frame.
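
As a quick illustration with made-up numbers (not taken from the screenshot):

    using UnityEngine;

    // Illustrative only: hypothetical per-frame timings, not numbers from this thread.
    public static class FrameTimeBoundsExample {
        public static void Log() {
            const float mainMs = 5f;    // main thread work per frame
            const float renderMs = 3f;  // render thread work per frame

            // Best case: the threads overlap completely, so a frame costs only the longer of the two.
            float bestCaseMs = Mathf.Max(mainMs, renderMs);  // 5 ms -> ~200 FPS
            // Worst case: no overlap at all, so the two costs simply add up.
            float worstCaseMs = mainMs + renderMs;           // 8 ms -> ~125 FPS

            Debug.Log(string.Format("Frame time lands between {0} ms and {1} ms", bestCaseMs, worstCaseMs));
        }
    }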

Glad to know about that. Thanks man.

You need to be clear about what you mean by “FPS”. There are two ways to conceptualize it in Unity:

  1. Your frame render time according to Unity’s renderer.
  2. Your number of MonoBehaviour.Updates per second.

IMO only updates matter, because that’s what you’re working with in practice.

In my current project, where I render only a console, I get around 2000 render FPS and around 300 update FPS.
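
Counting option 2 is essentially what the snippet in the first post already does; spelled out as a self-contained component (the class and field names are mine), it would look like:

    using UnityEngine;

    // Counts how many MonoBehaviour.Update calls happen per second of real time.
    public class UpdateRateCounter : MonoBehaviour {
        int updates;
        float windowStart;

        void Update() {
            updates++;
            float now = Time.realtimeSinceStartup;
            if (now - windowStart >= 1f) {
                // Logging only once per second keeps the Debug.Log overhead negligible.
                Debug.Log(string.Format("Updates per second: {0:0}", updates / (now - windowStart)));
                updates = 0;
                windowStart = now;
            }
        }
    }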

Bear in mind that FPS is nonlinear with respect to changes in workload, i.e. a 10 FPS drop means very different things when you’re running at 2000 FPS compared to when you’re running at 30 FPS.

For that reason I always recommend measuring performance in mSPF (milliseconds per frame) instead of FPS. You might want to use FPS just to figure out your target mSPF at the beginning (e.g. 60 FPS ≈ 16.7 ms/frame), but after that, stick to mSPF.
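
The conversion is just a reciprocal; a trivial helper (mine, not from any particular API) for reference:

    // Convert between frames-per-second and milliseconds-per-frame (put these in any class).
    // 60 FPS ~= 16.7 ms, 144 FPS ~= 6.9 ms, 30 FPS ~= 33.3 ms.
    static float MsPerFrame(float fps) { return 1000f / fps; }
    static float FpsFromMs(float ms)   { return 1000f / ms; }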
