# Time Clock Issues

Hello everyone. I don't know why, but for some reason I have two timers that both run off of Time.deltaTime, and one of them is faster than the other; they run at different rates.
Here's the code for each.

First Timer

```csharp
Time.timeScale = 1.0f;
seconds[R,0] += Time.deltaTime;
//Debug.LogWarning(seconds[R,0] + " and " + Time.deltaTime);
if (seconds[R,0] == 60)
{
    seconds[R,0] = 0;
    minutes[R,0]++;
}
if (minutes[R,0] == 60)
{
    minutes[R,0] = 0;
    hours[R,0]++;
}
```

Second Timer

```csharp
Time.timeScale = 1.0f;
seconds[R,1] += Time.deltaTime;
//Debug.LogWarning(seconds[R,1] + " and " + Time.deltaTime);
if (seconds[R,1] == 60)
{
    seconds[R,1] = 0;
    minutes[R,1]++;
}
if (minutes[R,1] == 60)
{
    minutes[R,1] = 0;
    hours[R,1]++;
}
```

The first timer works great. The second timer runs twice as fast; I found this out by dividing its increment by 2. I'm wondering what's causing this issue, or whether it's a bug.

This

```csharp
seconds[R,0] += Time.deltaTime;
if (seconds[R,0] == 60)
```

is actually a really serious bug. Never, ever compare floats with equality: because of how floating-point math works inside the processor, two values you expect to match are almost never exactly equal. Your accumulated seconds counter will step from something like 59.987 to 60.004 without ever being exactly 60, so the `== 60` branch simply never runs.
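A minimal sketch of the underlying problem (in Python, but the behavior is the same in C# floats): an accumulated floating-point sum almost never lands exactly on a round target, so an `==` check can silently never fire.

```python
# Classic example: three additions of 0.1 do not sum to exactly 0.3,
# because 0.1 has no exact binary representation.
t = 0.0
for _ in range(3):
    t += 0.1

print(t)         # 0.30000000000000004
print(t == 0.3)  # False
```

The same thing happens when accumulating `Time.deltaTime`: the running total skips past 60.0 without ever being exactly equal to it, which is why a threshold check (`>=`) is required.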

```csharp
seconds[R,0] += Time.deltaTime;
if (seconds[R,0] >= 60)
{
    seconds[R,0] -= 60;
    minutes[R,0]++;
}
if (minutes[R,0] >= 60)
{
    minutes[R,0] -= 60;
    hours[R,0]++;
}
```

(The `>=` and `-= 60` changes are probably not necessary for your minutes/hours checks, as those appear to be integers.)
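A small sketch of why subtracting 60 beats resetting to 0 (Python, with illustrative names): the fraction of a frame that overshoots the minute boundary is carried into the next minute instead of being thrown away, so the clock stays accurate over long runs.

```python
def tick(seconds, minutes, dt):
    """Advance a seconds/minutes clock by one frame of dt seconds."""
    seconds += dt
    if seconds >= 60.0:
        seconds -= 60.0   # carry the overshoot, don't discard it
        minutes += 1
    return seconds, minutes

# One frame straddles the minute boundary: 59.99 s + 0.02 s = 60.01 s.
seconds, minutes = tick(59.99, 0, 0.02)
# seconds is now roughly 0.01, not 0 — the 10 ms overshoot survives.
```

With `seconds = 0` instead, that leftover 10 ms would be lost on every rollover, and the clock would slowly fall behind real time.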