Is there ANY way to know via the Unity API when an actual frame draw occurs?

PROBLEM: Unity has a minimum timestep! The default is 0.33333 seconds, i.e. 3 fps. ALL Update() etc. are called 3 times per second even if the screen doesn't actually update. The minimum timestep is handy, as it prevents many tricky errors when deltaTime > 1, but the following ALL run at the minimum timestep even though the screen CLEARLY does not render:

Update()

OnPreRender()

OnPostRender()

WaitForEndOfFrame()

Time.frameCount

So, how can one determine an actual frame draw, since these can all run more often than the screen actually redraws?

Does anyone have a proper source for knowing when an ACTUAL frame is rendered?

Example Problem 1: if something in code or graphics overwhelms your project for 2 whole seconds, Unity will still report that as having been 6 frame draws even though 0 actually occurred. That makes it very difficult to track these critical performance spikes and report them accurately.

Example Problem 2: if the Unity minimum timestep < the current frame rate, the reported frame rate will be inaccurate.

Looking for a way to determine actual frame draws.

Suggesting that folks code for smoother execution and higher FPS is NOT an answer to this. Thanks!

EXPLAINED VIA CHALLENGE:
Try to make your own FPS counter that correctly reports FPS when you leave the default minimum timestep at 0.33333 and set Application.targetFrameRate = 1 (or 2).

Instead you will always get 3, even though an example animation on screen (a spinning cube, etc.) will visibly redraw only once per second.

I simply want a way to report the correct, actual frame draws, so I am wondering what API can be checked to know this.
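To make the claim concrete, here is roughly the naive counter I mean (just a sketch; the class name NaiveFpsCounter and the exact setup values are only for illustration):

using UnityEngine;

// Sketch: a naive FPS counter based on Time.deltaTime.
// With Application.targetFrameRate = 1 it prints roughly 3, because deltaTime
// is capped at ~0.333 s, even though the screen visibly redraws only once per second.
public class NaiveFpsCounter : MonoBehaviour
{
    void Start()
    {
        QualitySettings.vSyncCount = 0;   // targetFrameRate is ignored while vSync is on
        Application.targetFrameRate = 1;
    }

    void Update()
    {
        float fps = 1f / Time.deltaTime;  // reports ~3, not the actual ~1
        Debug.Log("Naive FPS: " + fps);
    }
}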

What do you have as proof that there is a ‘minimum timestep’ and that Update is called?

How did you test to get to this conclusion?

Because as far as I know, Unity's scripting is still mostly single-threaded… If you run some code that takes 10 seconds, Unity won't call Update or render for those 10 seconds. And if you write while(true) without a break or return, Unity will loop forever and never call Update again…


Link to the execution order


There is, however, the fact that Time.deltaTime seems to be capped at 0.3333333, which I think is what you mean by your problem. But that doesn't mean that Update was called, nor that something was rendered. It just means that Time.deltaTime is wrong if a frame takes more than 0.3333333 seconds.

Run this example code, click once, and look at the console; the output should speak for itself.

using UnityEngine;

public class Example : MonoBehaviour
{
	public int count = 0; // how often Update was called, and therefore how often Unity renders
	public System.Diagnostics.Stopwatch watch;

	private void Start()
	{
		watch = new System.Diagnostics.Stopwatch();
		watch.Start();
	}

	private void Update()
	{
		// Measure the real time since the previous Update, independent of Time.deltaTime.
		watch.Stop();
		long millis = watch.ElapsedMilliseconds;
		watch.Restart();

		count++;
		Debug.Log(count + " : " + Time.deltaTime + " : " + millis);

		if (Input.GetMouseButtonDown(0))
		{
			Debug.Log("sleep");
			for (int i = 0; i < int.MaxValue; i++)
			{
				// busy-loop for a long time to stall this frame
			}
			Debug.Log("awoken");
		}
	}
}

So the answer depends on what you want to do. The count in the example above keeps track of how often Update was called, and therefore how often something is rendered (using OnPreRender instead of Update would bring you closer to the actual point where rendering happens). But my guess is that you want to track how long one frame takes; for that you can use a Stopwatch, as I did in the example, to get around the fact that Time.deltaTime is not correct in these edge cases where you run at fewer than 3 frames per second.
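For example, if you want an FPS number out of that measurement, something like this inside Update should work (a rough sketch that extends the example above; the variable names follow that example):

		// Sketch: derive FPS from the Stopwatch instead of Time.deltaTime.
		// 'millis' is the value measured in the example above.
		float realDelta = millis / 1000f;
		float realFps = realDelta > 0f ? 1f / realDelta : 0f;
		Debug.Log("Real FPS (Stopwatch): " + realFps);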

So, Try to make your own FPS counter that correctly states FPS when you
leave default minimumtimestep of .33333
and
Application.targetFrameRate = 1 (or 2)

instead you will always get 3.

You asked this before, and we gave recommendations that your game shouldn't be running at 3 fps in the first place. But apologies… we didn't give an actual answer, so anyway, here is how:

If you have something running at less than 3 frames per second, you're going to get inaccurate results because of the choppiness. Unity is single-threaded and will only register the deltaTime measurement when it gets around to the code that measures it.

FPS is the number of times per second that Unity gets around to running through all of the code.

If you use Time.time, which takes its measurement from the system clock, and count the passed frames yourself, then by sampling over a larger number of seconds and averaging your readings you will get a more accurate result.

using UnityEngine;

public class AveragedFpsCounter : MonoBehaviour
{
	public float sampletime = 10;
	public float fps = 0;
	int framecounter = 0;
	float timer = 0;

	void Update()
	{
		framecounter++;
		if (Time.time - timer > sampletime) // check whether the sample window (10 seconds) has passed
		{
			fps = framecounter / sampletime; // divide by the sample time to get frames per second
			print("FPS: " + fps);
			timer = Time.time;
			framecounter = 0;
		}
	}
}
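Note that Time.time itself is derived from the capped deltaTime, so it drifts from real time at these very low framerates. If that matters to you, a variant timed with Time.realtimeSinceStartup (which is not capped) might look like this; it is only a sketch, and the class and field names are just for illustration:

using UnityEngine;

// Sketch: the same averaging counter, but timed with Time.realtimeSinceStartup
// so the sample window is measured in real seconds even when deltaTime is capped.
public class RealtimeFpsCounter : MonoBehaviour
{
	public float sampleTime = 10f;
	public float fps = 0f;
	int frameCounter = 0;
	float timer = 0f;

	void Update()
	{
		frameCounter++;
		float now = Time.realtimeSinceStartup;
		if (now - timer > sampleTime)
		{
			fps = frameCounter / (now - timer); // frames per real second over the window
			Debug.Log("FPS: " + fps);
			timer = now;
			frameCounter = 0;
		}
	}
}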

OK, once and for all: Unity does not execute Update more often than the actual frame update of the screen. However, in your original question you didn't mention that you set targetFrameRate. The targetFrameRate setting artificially reduces the framerate, but the original statement is still true: Update is always called in sync with the actually rendered frames.

There are several issues you face here and things you're interpreting the wrong way. As we already mentioned in several comments, Time.deltaTime is capped at 0.33333 seconds. However, that is only the deltaTime value. If your framerate is lower than 3 fps, the deltaTime value will simply be wrong since it's capped. So yes, deltaTime will report at most a value of 0.33333 even when your actual framerate is 1 fps. Update is called only once per second and you get only one frame per second, but deltaTime will be 0.33333 instead of 1. This is true for any actual framerate lower than 3 fps.

You can read Time.unscaledDeltaTime, which is not affected by timeScale and is also not capped or forced to be greater than 0. It will report a delta time of about 1.0 if you have a framerate of 1 fps.
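So a minimal per-frame FPS readout that survives these low framerates could look like this (just a sketch to show the idea; the class name is mine):

using UnityEngine;

// Sketch: report FPS from Time.unscaledDeltaTime, which is not capped at 0.33333 s.
public class UnscaledFpsCounter : MonoBehaviour
{
    void Update()
    {
        float dt = Time.unscaledDeltaTime;
        if (dt > 0f)
        {
            Debug.Log("FPS: " + (1f / dt)); // ~1 at Application.targetFrameRate = 1
        }
    }
}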

Finally, note that (just as mentioned in the docs) the targetFrameRate setting has no effect when vSync is enabled. I used this test script and got exactly the behaviour I expected:

using UnityEngine;

public class TargetFramerateTest : MonoBehaviour
{
    Vector3 pos;
    void Start()
    {
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 1;
        pos = new Vector3(-10, -10, 0);
    }

    void Update()
    {
        Debug.Log("Time: " + Time.time + " RealTime: " + Time.realtimeSinceStartup +
            "

deltaTime: " + Time.deltaTime + " unscaled deltaTime: " + Time.unscaledDeltaTime);

        var go = GameObject.CreatePrimitive(PrimitiveType.Cube);
        go.transform.position = pos;
        go.transform.localScale = Vector3.one * 0.2f;
        pos += new Vector3(1, 1, 0);
    }
}

It prints one debug log in the console each second, so Update is clearly called once per second and we get one frame per second. This can also be confirmed by watching the spawned cubes: you get one cube per frame, so Update clearly runs once per frame. Your original statement is therefore not true, and you clearly haven't actually tested it. Again, deltaTime does not return the true delta time for framerates below 3 fps. Also keep in mind that Time.time is the game time, which is based on the Time.deltaTime value, so if you force a targetFrameRate of 1 fps your game time will run 3 times slower than real time.

We have no idea what you actually want to achieve since you haven't mentioned it. However, I can't see anything wrong here and can't say more about this case.