Calculate time per frame

Hello, I’m looking for a way to calculate time per frame. It’s not FPS, which is just the number of Updates per second; it’s the time it took to render this single frame. Time per frame should be expressed in ms, not fps, so 30 fps ≈ 33 ms per frame.

So, simply, I want to track when the main loop started to render this particular frame and when it finished.

EDIT: I should clarify that I want to calculate time per frame independent of targetFps. Otherwise deltaTime would suffice. But when targetFps is set to, for example, 30 fps, deltaTime will always be around 0.033 s, when in reality the frame could be rendered much faster. I want this stat to give me insight into how much lower my targetFps is than it could be.

EDIT2: As of right now I see that I can’t get it. What I want, in a nutshell (meta-code):

time_t last_start = 0, last_done = 0;
time_t last_frame_ms;

loop
{
	last_start = gettime();
	UNITY_LOOP(last_frame_ms: last_frame_ms);
	last_done = gettime();
	last_frame_ms = last_done - last_start;
}

You can use fps = 1 / Time.deltaTime.

Time.realtimeSinceStartup may be what you’re after. Time.realtimeSinceStartup gives you the exact time (in seconds) since your game was started. It ignores timescale and fixed framerate and comes directly from your system’s clock. This means that if you were to log Time.realtimeSinceStartup at the start of your Update method and then log it again at the end, you’d get two different values, because Time.realtimeSinceStartup is (like the name suggests) in real time.

I’m unsure whether Update is called more than once per frame if you limit your framerate, but if it is, all you’d need to do is store Time.realtimeSinceStartup in a field every Update cycle, like so:

public float timeLastFrame;

void Start()
{
	// Initialize our timeLastFrame variable
	timeLastFrame = Time.realtimeSinceStartup;
}

void Update()
{
	float realDeltaTime = Time.realtimeSinceStartup - timeLastFrame;
	timeLastFrame = Time.realtimeSinceStartup;

	// Do whatever you want with the real delta time here...
}

This doesn’t really solve the original problem, but it is a workaround. If you determine the min-spec device of the majority of your users (I’m guessing you are collecting device info), you could run the game with uncapped FPS as a local test, then use those results to set your FPS for the release version.

With newer versions of Unity, you can go to Window > Analysis > Profiler and check all the times in ms.

Did you find a solution for this problem, @genius-fx? I also want to find the render time for every frame.
I was thinking about using OnPreRender and OnPostRender, but you said you got weird results with them!?