This may seem like an easy question, but for some reason it made me stop and think for a moment. I’ve searched through previous questions, and none of them really explained exactly how this works. I was reading through the Unity 3D Scripting tutorial, and came across the input part:
To accept input, you simply have to get the Axis you're trying to move along, and the game will render your input onscreen in the form of, well, movement. It goes like this:
transform.Translate(Input.GetAxis("Horizontal"), 0, Input.GetAxis("Vertical"));
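For context, here's roughly how I picture that line sitting in a script; the class name and the surrounding Update() method are my own guess, not the tutorial's exact code:

using UnityEngine;

public class PlayerMove : MonoBehaviour
{
    void Update()
    {
        // Move along X and Z by the raw axis values, once per frame.
        transform.Translate(Input.GetAxis("Horizontal"), 0, Input.GetAxis("Vertical"));
    }
}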
That line makes perfect sense. But when the tutorial started talking about Time.deltaTime, this was added:
transform.Translate(Input.GetAxis("Horizontal") * Time.deltaTime * speed, 0, Input.GetAxis("Vertical") * Time.deltaTime * speed);
So, as far as I can tell, you're essentially getting the amount of time since the last frame was rendered, then multiplying it by the actual speed at which you want to move the GameObject. This threw me off. How could doing this help with framerate on faster computers? If anything, wouldn't it make the frames render too quickly?
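Here's how I'm reading that second version, again as my own sketch with a made-up speed field, not the tutorial's exact script:

using UnityEngine;

public class PlayerMove : MonoBehaviour
{
    // Hypothetical speed value; the tutorial presumably declares something similar.
    public float speed = 3.0f;

    void Update()
    {
        // Each axis value is scaled by how long the last frame took, then by speed.
        transform.Translate(Input.GetAxis("Horizontal") * Time.deltaTime * speed,
                            0,
                            Input.GetAxis("Vertical") * Time.deltaTime * speed);
    }
}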
Here’s an example:
- Player presses the Right Arrow (an input value of 1)
- 1 is multiplied by Time.deltaTime (let's say half a second, for a slower computer)
- 0.5 is then multiplied by the actual speed you want to move at
Maybe I’m looking at this from the wrong angle, but wouldn’t this mean that the time since the last frame rendered would be greatly increased for the next rendering?
speed = 3.0
0.5 * speed = 1.5
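To put my example into the same shape as the code (the two frame times below are just numbers I made up for comparison):

float speed = 3.0f;
float input = 1f;                              // Right Arrow held down

float slowDelta = 0.5f;                        // deltaTime on the slow computer
float slowMove  = input * slowDelta * speed;   // 1.5 units moved this frame

float fastDelta = 0.01f;                       // deltaTime on a fast computer
float fastMove  = input * fastDelta * speed;   // 0.03 units moved this frame

So per frame the fast machine barely moves at all, while the slow one jumps a big chunk at once.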
This would help slower computers, but I don't see how it wouldn't hurt faster ones. I've searched this entire site for help, but have only found questions about what deltaTime is and about inconsistencies when using it. I may just be overthinking this, so I'm sorry if this is a long question about something small, but I suppose it'll help anyone else who gets confused, right?