I’m sure this is a very entry-level problem, but I’m having a lot of trouble getting my head around the classic Time.deltaTime, specifically in my current use case.
I understand, functionally, that it is the time between the last frame and the current one in real time (seconds).
I understand also that it’s necessary to make code behave consistently regardless of the framerate.
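To show what I mean, here’s the standard pattern as I understand it (a minimal sketch; speed and the class name are just made up):

```csharp
using UnityEngine;

// Minimal sketch of the pattern as I understand it: multiplying a
// per-second speed by Time.deltaTime gives the distance to cover in
// this frame, so total movement per second comes out the same at
// 30 FPS or 144 FPS.
public class ConstantMover : MonoBehaviour
{
    public float speed = 2f; // units per second (made-up value)

    void Update()
    {
        // Move right by (speed * this frame's duration) each frame.
        transform.position += Vector3.right * speed * Time.deltaTime;
    }
}
```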
I got as far as knowing it’s particularly useful as the time argument for .Lerp.
What I can’t get my head around is figuring out how to use it to get the right timings on things.
My specific use case is the following: I’m trying to (smoothly) move a child object to a specific position when triggered by the parent, and it needs to stop exactly when it has reached that position. I know I can - and probably will - do that by tracking the position and toggling a bool off once it reaches it, but I figured this was a good opportunity to try and grasp deltaTime properly.
For the record, my initial Lerp is below:
```csharp
transform.position = Vector2.Lerp(
    transform.position,
    new Vector2(
        transform.position.x + ((playerCollider.size.x - changePosAdjuster) * transform.localScale.x),
        (transform.position.y + playerCollider.size.y) - changePosAdjuster),
    Time.deltaTime);
```
Now, I need this to move at a speed such that it has exactly reached the target when the toggle is switched off. The toggle switches off at an event 8 frames into an animation. So:
- Animation starts.
- Movement script runs for 8 frames.
- Animation stops.
- Movement script stops.
- Movement script should have moved the child to the exact position described in that new Vector2 (sketched concretely below).
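To make that concrete, here’s roughly the shape of the frame-counting fallback I mentioned, with the animation event stood in for by a simple counter (a rough sketch; every name here is invented for illustration):

```csharp
using UnityEngine;

public class EightFrameMover : MonoBehaviour
{
    private Vector2 startPos;
    private Vector2 targetPos;
    private int framesElapsed;
    private bool moving;

    // The parent would call this when the animation starts.
    public void BeginMove(Vector2 target)
    {
        startPos = transform.position;
        targetPos = target;
        framesElapsed = 0;
        moving = true;
    }

    void Update()
    {
        if (!moving) return;

        framesElapsed++;
        // Interpolate between two FIXED endpoints: t steps through
        // 1/8, 2/8, ... 8/8, so the child lands exactly on targetPos
        // on the 8th frame, and then the bool toggles off.
        float t = framesElapsed / 8f;
        transform.position = Vector2.Lerp(startPos, targetPos, t);

        if (framesElapsed >= 8) moving = false;
    }
}
```

That version doesn’t need deltaTime at all, though, which is partly why I’m confused about where it fits.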
In my head, that meant I had to pass either Time.deltaTime * 8 or Time.deltaTime / 8 as the last argument, but both come out badly wrong.
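For completeness, the time-based shape I think I’m fumbling toward looks something like this (again a rough sketch, all names invented; at 60 FPS, 8 frames is roughly 0.133 seconds), though I’m not confident it’s right either:

```csharp
using UnityEngine;

public class TimedMover : MonoBehaviour
{
    public float duration = 8f / 60f; // seconds the move should take (assuming 60 FPS)
    private Vector2 startPos;
    private Vector2 targetPos;
    private float elapsed;
    private bool moving;

    public void BeginMove(Vector2 target)
    {
        startPos = transform.position;
        targetPos = target;
        elapsed = 0f;
        moving = true;
    }

    void Update()
    {
        if (!moving) return;

        // Accumulate real time; elapsed / duration runs from 0 to 1
        // over exactly `duration` seconds, whatever the framerate does.
        elapsed += Time.deltaTime;
        float t = Mathf.Clamp01(elapsed / duration);
        transform.position = Vector2.Lerp(startPos, targetPos, t);

        if (t >= 1f) moving = false;
    }
}
```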
What logical jump am I missing in terms of how deltaTime fits into all this?