How to calculate an average frame rate

I made a simple FPS monitor script. It displays the current frame rate and keeps track of the lowest and highest values. I plan on tracking the average FPS for each level. My question is: how do I calculate the average frame rate?

This is called a moving average. The cumulative type of moving average is probably what you want: it’s the average of all values up to the current one.

A brute force approach is to collect every FPS value since the beginning and continually average all of them. But with a bit of math you can simplify the formula so you don’t need to store the values at all: you just keep the current average and a count of how many samples have been averaged so far, and derive the new average from those.
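Concretely, the simplification is this: if CMA_n is the average of the first n samples, then folding in the next sample x_{n+1} gives

```
CMA_{n+1} = (n * CMA_n + x_{n+1}) / (n + 1)
          = CMA_n + (x_{n+1} - CMA_n) / (n + 1)
```

so the update only ever needs the previous average and the sample count, not the full history.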

Check the linked Wikipedia article for details.

Take note that you still need to compute the FPS yourself by other means; this only averages the FPS values you feed it.

int qty = 0;
float currentAvgFPS = 0;

float UpdateCumulativeMovingAverageFPS(float newFPS)
{
	++qty; // count this sample first, so we never divide by zero
	currentAvgFPS += (newFPS - currentAvgFPS) / qty;
	return currentAvgFPS;
}
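A minimal usage sketch, assuming this lives in a MonoBehaviour (the instantaneous FPS is just the reciprocal of the frame time):

```csharp
void Update()
{
    // 1/deltaTime is the instantaneous FPS for this frame
    float fpsThisFrame = 1f / Time.deltaTime;
    float average = UpdateCumulativeMovingAverageFPS(fpsThisFrame);
    // display "average" however your monitor script shows its other values
}
```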

This is how I do it. It is far simpler than anything else: no cumulative bookkeeping, just one float.

float avg = 0F; // declare this variable outside Update()

void Update()
{
    avg += ((Time.deltaTime / Time.timeScale) - avg) * 0.03f; // run this every frame
    float displayValue = 1F / avg; // display this value
}

The good thing about this is that past values gradually fade in importance, so it is neither too jittery nor too slow to react.
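This smoothing is an exponential moving average: with the 0.03f factor, each frame the weight of every old sample shrinks by a factor of (1 - 0.03) = 0.97, so older frames fade out geometrically:

```
weight of a sample after n frames = 0.97^n
0.97^23 ≈ 0.50   (half-life of roughly 23 frames)
0.97^76 ≈ 0.10   (mostly forgotten after ~76 frames)
```

A larger factor reacts faster but jitters more; a smaller one is smoother but lags behind real changes.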

An approximate calculation can be done with:

float fps = 1f / Time.deltaTime;

EDIT: Well, this is not approximate at all; it is the exact value. If you have studied oscillations and waves, deltaTime is T, the period: the time between two peaks of the wave, measured in seconds. FPS is the frequency f: the number of peaks in one second, measured in hertz (Hz). The relation with T is:

f = 1/T;
T = 1/f;

In our case, a peak is the start of a frame.
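For example, at a steady 60 FPS:

```
T = Time.deltaTime = 1/60 s ≈ 0.0167 s
f = 1/T = 60 Hz, i.e. 60 frames per second
```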