Hello! I’m working on a timer for my game that’s affected by Time.timeScale:
float diff = Time.deltaTime * Time.timeScale;
if (countDirection == CountDirection.Down) currentTime -= diff;
else currentTime += diff;
For testing, I also have input code that sets Time.timeScale to different values:
if (Input.GetKeyUp(KeyCode.O)) Time.timeScale = 2f;
else if (Input.GetKeyUp(KeyCode.P)) Time.timeScale = 0.5f;
else if (Input.GetKeyUp(KeyCode.I)) Time.timeScale = 1f;
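To give the full picture, here’s a minimal, self-contained version of the component the snippets above come from (the class name, the CountDirection enum definition, and the starting value are just placeholders for this post):

```csharp
using UnityEngine;

public class GameTimer : MonoBehaviour
{
    public enum CountDirection { Up, Down }

    public CountDirection countDirection = CountDirection.Down;
    public float currentTime = 60f;   // placeholder starting value

    void Update()
    {
        // Test hotkeys for changing the global time scale.
        if (Input.GetKeyUp(KeyCode.O)) Time.timeScale = 2f;
        else if (Input.GetKeyUp(KeyCode.P)) Time.timeScale = 0.5f;
        else if (Input.GetKeyUp(KeyCode.I)) Time.timeScale = 1f;

        // Advance or rewind the timer each frame.
        float diff = Time.deltaTime * Time.timeScale;
        if (countDirection == CountDirection.Down) currentTime -= diff;
        else currentTime += diff;
    }
}
```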
However, when Time.timeScale is 2f or 0.5f, the timer is off by the square of the scale: at 2f it counts roughly 4 times faster than real time, and at 0.5f roughly 4 times slower.
I confirmed this by timing it manually against my phone’s stopwatch.
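The same measurement could also be scripted instead of using a stopwatch. Here’s a rough sketch (the TimerSpeedProbe name and the 5-second logging window are just placeholders) that accumulates real time with Time.unscaledDeltaTime and logs how far currentTime has moved relative to it:

```csharp
using UnityEngine;

// Attach alongside the timer to measure how fast currentTime moves
// compared to real (unscaled) time.
public class TimerSpeedProbe : MonoBehaviour
{
    public GameTimer timer;   // the timer component above, assigned in the Inspector

    float realElapsed;        // real-world seconds since the last log
    float timerStart;         // timer value at the last log

    void Start()
    {
        timerStart = timer.currentTime;
    }

    void Update()
    {
        realElapsed += Time.unscaledDeltaTime;   // not affected by Time.timeScale

        if (realElapsed >= 5f)                   // log every ~5 real seconds
        {
            float timerMoved = Mathf.Abs(timer.currentTime - timerStart);
            Debug.Log($"timeScale={Time.timeScale}  timer moved {timerMoved:F2}s " +
                      $"in {realElapsed:F2} real seconds (ratio {timerMoved / realElapsed:F2})");
            realElapsed = 0f;
            timerStart = timer.currentTime;
        }
    }
}
```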
I’m not sure why this is happening. Does anyone have any thoughts?