Timing an Event Based on Framerate

Hi,

I have a simple question regarding making an in-game event happen after a specified time period. Say I want to throw a grenade and have it explode after a few seconds. I know that to respect the game's frame rate, I have to use deltaTime to measure the elapsed time, but when I do something like this:

if (grenadeThrowAction) {
    throwStarted = Time.deltaTime;
}

Then:

timeElapsed = Time.deltaTime - throwStarted;

The numbers I get back are extremely small. I can work around this by multiplying, but it seems slightly odd to multiply each number by a million when comparing against some threshold I’d like to set to determine how long before a grenade explodes. Am I on the right track, or is there a better way of doing this?

Use Time.time instead, it should work. Time.deltaTime = the time it took to render the previous frame. Time.time = the time since the application/game started.
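A minimal sketch of that suggestion, assuming a Unity MonoBehaviour; grenadeThrowAction, fuseSeconds, and the Debug.Log stand-in for the explosion are illustrative names, not from the thread:

using UnityEngine;

public class GrenadeTimer : MonoBehaviour
{
    public bool grenadeThrowAction; // placeholder for however the throw is detected
    public float fuseSeconds = 3f;  // illustrative fuse length
    float throwStarted;
    bool armed;

    void Update()
    {
        if (grenadeThrowAction)
        {
            throwStarted = Time.time; // record the moment of the throw, not deltaTime
            armed = true;
        }

        // Explode once enough clock time has passed since the throw.
        if (armed && Time.time - throwStarted >= fuseSeconds)
        {
            armed = false;
            Debug.Log("Boom"); // stand-in for the actual explosion
        }
    }
}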

Wouldn’t using Time.time disregard framerate though? I.e., an enemy on a faster machine would likely be able to travel farther than one on a slower machine before the grenade exploded. This seems like a downside, as the gameplay experience wouldn’t be consistent across machines.

You need to add the delta time every frame to get a timer:

if (grenadeThrowAction) {
    timeElapsed = 0; // reset the timer when the grenade is thrown
}

timeElapsed += Time.deltaTime; // accumulate the frame time once per frame

if (timeElapsed >= TargetTime) // compare with >=, not ==; the timer rarely lands exactly on the target
{
    // do stuff
}

Last two responses give me what I need. Thanks guys!

Although you mentioned that the answers are sufficient:

Nope. Time.time gives you the time since the game started, in seconds. Meaning if you catch Time.time in one frame as variable a and then again in the next frame as variable b, then b - a gives you the exact same result that Time.deltaTime would deliver.
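To see that equivalence concretely, here is a hypothetical logging snippet (at the default timeScale, and ignoring the very first frame):

using UnityEngine;

public class DeltaCheck : MonoBehaviour
{
    float previous; // Time.time sampled on the previous frame (variable a)

    void Update()
    {
        float current = Time.time; // variable b
        // Consecutive Time.time samples differ by exactly this frame's deltaTime.
        Debug.Log($"b - a = {current - previous}, deltaTime = {Time.deltaTime}");
        previous = current;
    }
}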

That means there are different approaches to your problem. You can either increase a variable by Time.deltaTime each frame and check whether a certain limit has been reached (which is what the other answers show), or you can record Time.time when the event starts, do the tiny math mentioned above each frame, and see whether that limit has been reached.

As usual: one problem, multiple solutions. :wink:

It doesn’t matter how you measure the elapsed time. Adding up deltas and differencing the total time will both result in the grenade exploding after a variable number of frames, depending on the frame rate. So, for instance, if you set it to explode after 1000 milliseconds and you are running at 10 FPS, it’ll explode after 10 frames. If it’s running at 30 FPS, it’ll explode after 30 frames.

Neither approach “respects the frame rate.” Whether or not “respecting the frame rate” matters also depends on how you write your code.

" … an enemy on a faster machine would likely be able to travel farther than on a slower machine before the grenade exploded."

If you move the enemy a fixed amount every frame regardless of delta, but then use time to trigger the grenade, then yes. A one-second timer would always explode one second in, whether ten or thirty frames have passed, even though the enemy would have moved different distances in those two scenarios. This is why you have to model both events and movement on the same approach, whether it be time-based or frame-based.

If you multiply the enemy’s speed by the delta every frame, it’ll move the same amount over the same period of time even if the frame rate fluctuates. If there are fewer frames, the enemy moves farther each frame to cover the same distance overall; if there are more frames, it moves less each frame to cover the same distance. Either way, it’ll be in roughly the same position when the grenade explodes. Just make sure all the movement in your game is multiplied by delta, and any time-based event system should work fine.
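For example, a rough sketch of both pieces together; the enemy transform, speed, and fuseSeconds are illustrative stand-ins:

using UnityEngine;

public class EnemyAndGrenade : MonoBehaviour
{
    public Transform enemy;        // hypothetical enemy to move
    public float speed = 2f;       // units per second, not units per frame
    public float fuseSeconds = 1f;
    float elapsed;
    bool exploded;

    void Update()
    {
        // Scaling by deltaTime keeps the distance covered per second constant,
        // whether the game runs at 10 FPS or 60 FPS.
        enemy.Translate(Vector3.forward * speed * Time.deltaTime);

        // The fuse uses the same clock, so the enemy ends up in roughly the
        // same place when the grenade goes off, regardless of frame rate.
        elapsed += Time.deltaTime;
        if (!exploded && elapsed >= fuseSeconds)
        {
            exploded = true;
            Debug.Log("Boom");
        }
    }
}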