Unity Time.deltaTime is consistently inaccurate (reporting a larger number)


I have a prefab with the following script:

public class RefineryScript : MonoBehaviour {

    float timer = 0f;
    public float loopRate = 5f;

    // Update is called once per frame
    void Update () {
        timer += Time.deltaTime;
        if (timer > loopRate) {
            Debug.Log("Timer was " + timer);
            timer = 0f;
        }
    }
}

It is supposed to log every 5 seconds, but it logs every 3 seconds instead. See the actual log times in the picture.


I also tried implementing this in a coroutine that yields WaitForSecondsRealtime(5f); I still get a 3-second delay instead of 5.
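Here is roughly what the coroutine attempt looked like (reconstructed from memory, so the class and method names are just illustrative):

```csharp
using System.Collections;
using UnityEngine;

public class RefineryCoroutine : MonoBehaviour
{
    public float loopRate = 5f;

    void Start()
    {
        StartCoroutine(LogLoop());
    }

    IEnumerator LogLoop()
    {
        while (true)
        {
            // WaitForSeconds is scaled by Time.timeScale;
            // WaitForSecondsRealtime is supposed to wait in real time.
            yield return new WaitForSecondsRealtime(loopRate);
            Debug.Log("Tick at realtime " + Time.realtimeSinceStartup);
        }
    }
}
```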

Any ideas?

I counted how many frames it took the timer to reach 5 seconds: around 300 (302, 298). Dividing 5 by that number gives an average deltaTime of roughly 1/60 s.

This means that Time.deltaTime “thinks” the game is running at 60 fps. However, the Stats window shows around 111 fps.
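Those numbers are consistent with a timeScale above 1. A back-of-the-envelope sketch (the timeScale value is my guess, not something confirmed in the scene):

```csharp
// Plain C# arithmetic, no Unity required.
// Assumption: Time.timeScale was about 5/3 in the scene.
float timeScale = 5f / 3f;
float loopRate = 5f;

// Scaled deltaTime accumulates timeScale times faster than real time,
// so the timer reaches loopRate after loopRate / timeScale real seconds.
float realSeconds = loopRate / timeScale;   // 3 real seconds, matching the observed interval

// At ~111 fps that is roughly 333 frames, in the same ballpark as the
// ~300 frames counted above.
float frames = realSeconds * 111f;
```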

As per the comments below, the issue was Time.timeScale. Setting timeScale back to 1f, or using Time.unscaledDeltaTime, solved the problem.
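For anyone hitting the same issue, a minimal sketch of the corrected script using unscaledDeltaTime:

```csharp
using UnityEngine;

public class RefineryScript : MonoBehaviour
{
    float timer = 0f;
    public float loopRate = 5f;

    void Update()
    {
        // unscaledDeltaTime ignores Time.timeScale, so the timer
        // runs in real seconds even if the scene is sped up.
        timer += Time.unscaledDeltaTime;
        if (timer > loopRate)
        {
            Debug.Log("Timer was " + timer);
            timer = 0f;
        }
    }
}
```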