Which frames-per-second measure is more accurate?

I’m using two FPS counters to try to get an accurate measure of my frame rate. One is the built-in readout from the Stats window and the other comes from a script. The script is the following:

using UnityEngine;
using System.Collections;

public class HUDFPS : MonoBehaviour
{

    // Attach this to a GUIText to make a frames/second indicator.
    //
    // It calculates frames/second over each updateInterval,
    // so the display does not keep changing wildly.
    //
    // It is also fairly accurate at very low FPS counts (<10).
    // We do this not by simply counting frames per interval, but
    // by accumulating FPS for each frame. This way we end up with
    // correct overall FPS even if the interval renders something like
    // 5.5 frames.

    public float updateInterval = 0.5F;

    private float accum = 0; // FPS accumulated over the interval
    private int frames = 0; // Frames drawn over the interval
    private float timeleft; // Left time for current interval

    void Start()
    {
        if (!guiText)
        {
            Debug.Log("UtilityFramesPerSecond needs a GUIText component!");
            enabled = false;
            return;
        }
        timeleft = updateInterval;
    }

    void Update()
    {
        timeleft -= Time.deltaTime;
        accum += Time.timeScale / Time.deltaTime;
        ++frames;

        // Interval ended - update GUI text and start new interval
        if (timeleft <= 0.0)
        {
            // display two fractional digits (f2 format)
            float fps = accum / frames;
            string format = System.String.Format("{0:F2} FPS", fps);
            guiText.text = format;

            // Check the lowest threshold first, otherwise the red
            // branch can never be reached.
            if (fps < 10)
                guiText.material.color = Color.red;
            else if (fps < 30)
                guiText.material.color = Color.yellow;
            else
                guiText.material.color = Color.green;
            //  DebugConsole.Log(format,level);
            timeleft = updateInterval;
            accum = 0.0F;
            frames = 0;
        }
    }
}
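
For comparison, here is a minimal sketch of a counter that simply divides the frames drawn by the real time elapsed over the interval, instead of averaging instantaneous FPS values (the class name SimpleFPS is just my own; the two approaches should agree closely unless frame times vary a lot within one interval):

using UnityEngine;

public class SimpleFPS : MonoBehaviour
{
    // Counts frames drawn and divides by the time that passed,
    // then logs the result once per interval.
    public float updateInterval = 0.5F;

    private int frames = 0;     // frames drawn since the interval started
    private float elapsed = 0F; // time accumulated over the interval

    void Update()
    {
        ++frames;
        elapsed += Time.deltaTime;

        if (elapsed >= updateInterval)
        {
            float fps = frames / elapsed;
            Debug.Log(string.Format("{0:F2} FPS", fps));
            frames = 0;
            elapsed = 0F;
        }
    }
}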

Before I implemented Aron Granberg’s A*, the two readings almost always matched (though the scripted one would rarely go above 60 FPS). Now there is a huge discrepancy between them. When I play the game it doesn’t feel like 17 FPS, so why would it report 17?
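
One possible explanation for the 60 FPS ceiling is vSync, which locks rendering to the monitor refresh rate. A minimal sketch of disabling it while profiling, using Unity’s standard QualitySettings and Application APIs (the component name here is just illustrative):

using UnityEngine;

public class UncapFrameRate : MonoBehaviour
{
    void Awake()
    {
        // Turn off vertical sync so rendering is not locked to the
        // monitor refresh rate while profiling.
        QualitySettings.vSyncCount = 0;

        // -1 removes the explicit frame-rate cap.
        Application.targetFrameRate = -1;
    }
}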

Is the Unity FPS counter very accurate and should I just trust that one?

[Attachment: Capture.JPG]

Running a frames-per-second counter script such as the one you posted in an actual build is the only way to get an accurate reading. Nothing measured in the editor is accurate, because the editor adds its own overhead.

–Eric

Thanks. I’ll be sure to trust my scripted FPS counter then. Appreciate it =)