Sorry if I’m posting this in the wrong section; feel free to move it if so. I’ve run into something that confuses me. I have a day/night cycle driven by a timer that accumulates Time.deltaTime, and I noticed that the clock runs faster at higher frame rates than at lower ones. When I calculate the delta time myself, the problem disappears completely. Given that, I can’t understand why every tutorial that deals with time insists on using Time.deltaTime. I thought Time.deltaTime was exactly what you use to write framerate-independent code, or am I wrong?

Here’s an example you can try yourself. I ran it both in the editor and in a build; the build runs at a much higher frame rate than the editor, which made the difference obvious to me.
using System;
using UnityEngine;

public class TimeTest : MonoBehaviour
{
    private DateTime tp1;
    private DateTime tp2;
    private float deltaTime = 0f;
    private float timer1 = 0f;
    private float timer2 = 0f;

    void Start()
    {
        tp1 = DateTime.Now;
        tp2 = DateTime.Now;
    }

    void Update()
    {
        CalculateDeltaTime();
        timer1 += Time.deltaTime; // Runs slower at low frame rates, faster at high ones
        timer2 += deltaTime;      // Runs like a real clock no matter the frame rate
    }

    void OnGUI()
    {
        GUILayout.Label(timer1.ToString());
        GUILayout.Label(timer2.ToString());
    }

    void CalculateDeltaTime()
    {
        tp2 = DateTime.Now;
        // Ticks are 100 ns each, so divide by TimeSpan.TicksPerSecond (10,000,000) to get seconds.
        deltaTime = (float)((tp2.Ticks - tp1.Ticks) / (double)TimeSpan.TicksPerSecond);
        tp1 = tp2;
    }
}
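
For what it’s worth, the same manual-delta idea can also be written with System.Diagnostics.Stopwatch instead of DateTime.Now. This is just a sketch of the variant (StopwatchDeltaTime is a made-up name, not code from my project); Stopwatch is a monotonic timer, while DateTime.Now can jump if the system clock changes, so it’s a safer way to measure frame intervals:

using System.Diagnostics;
using UnityEngine;

public class StopwatchDeltaTime : MonoBehaviour
{
    private readonly Stopwatch stopwatch = new Stopwatch();
    private float deltaTime;

    void Start()
    {
        stopwatch.Start();
    }

    void Update()
    {
        // Seconds elapsed since the previous frame, measured with a monotonic clock.
        deltaTime = (float)stopwatch.Elapsed.TotalSeconds;
        stopwatch.Restart();
    }
}

It shows the same behavior as the DateTime version for me.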