Basically I put this script on a GameObject and it moves left and right, but then I do transform.position = pos * Time.deltaTime; and it does not work as expected. The script works fine without it, so why did it behave the way it did?
The script without Time.deltaTime:
public class test : MonoBehaviour
{
    bool patrollingLeft = false;
    float x;
    Vector3 pos;

    void Start()
    {
        pos = transform.position;
    }

    void Update()
    {
        if (patrollingLeft)
        {
            x--;
            if (x == 0) patrollingLeft = false;
        }
        else
        {
            x++;
            if (x == 100) patrollingLeft = true;
        }
        pos.x = x;
        transform.position = pos;
    }
}
Without Time.deltaTime, x is effectively moving at 1 per frame, which makes it framerate dependent.
(It's 1 because you're using x as a position and incrementing by 1. Your increment has then become your rate.)
To create motion that’s not framerate dependent, you need the increment to be an amount per second.
This is where Time.deltaTime comes into play.
If the time between frames was 1 second, then Time.deltaTime would be 1. But normally, you’re going to be running at many frames per second, and so Time.deltaTime is going to be much smaller.
In the case of 60 FPS, it would be about 0.0167.
If your rate is meant to be 1 per second, then every frame it gets multiplied by 0.0167, moving a small amount per frame, but over the course of the whole second it adds up to about 1. You're getting the fraction of your rate based on how much time has passed.
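The arithmetic above can be sketched in code (assuming a perfectly steady 60 FPS for illustration):

```csharp
// At 60 FPS, Time.deltaTime is roughly 1/60 ≈ 0.0167 seconds.
float speed = 1f;                    // rate: 1 unit per second
float perFrame = speed * (1f / 60f); // ≈ 0.0167 units moved this frame
// Over 60 frames: 60 * 0.0167 ≈ 1 unit, i.e. the full rate over one second.
```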
So if you want to change your code to use Time.deltaTime, you need to apply it to the rate. Not to the position.
You would need to increase the increment to account for the different rate scale, give your new increment a var (like speed) and then do something like this:
x -= (speed * Time.deltaTime);
You'd then need to change your if (x == 0) to if (x <= 0), because x won't likely land exactly on zero.
Yeah, I understand most of what you said, but I do not understand the part where you said "you need to apply Time.deltaTime to the rate, not the position". What exactly do you mean?
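Putting those two changes together, a sketch of what the adjusted patrol script might look like (speed is a new variable introduced here; 10 units per second is an arbitrary example value):

```csharp
using UnityEngine;

public class test : MonoBehaviour
{
    bool patrollingLeft = false;
    float speed = 10f; // rate in units per second, not per frame
    float x;
    Vector3 pos;

    void Start()
    {
        pos = transform.position;
    }

    void Update()
    {
        if (patrollingLeft)
        {
            x -= speed * Time.deltaTime;       // deltaTime scales the rate...
            if (x <= 0f) patrollingLeft = false; // ...so compare with <=, not ==
        }
        else
        {
            x += speed * Time.deltaTime;
            if (x >= 100f) patrollingLeft = true;
        }
        pos.x = x;
        transform.position = pos; // assign the position directly; never multiply it by deltaTime
    }
}
```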
EDIT:
I get what you mean now. So you're saying right now I am doing 1 m/frame, and after Time.deltaTime
it's 1 m/sec, so I should do x -= (1 * Time.deltaTime), where 1 is the speed variable, which is what you did. So
you are basically saying the value being multiplied by Time.deltaTime should be a constant; instead I multiplied it by a changing var, which is bad. So x -= (1 * Time.deltaTime) applies the rate.
You don’t want to multiply your position (x) by Time.deltaTime. This will give strange results.
As it so happens, your results probably aren't as wonky as they could be, but only because of where your transform is patrolling (0-100). If you were patrolling from 200-300, you'd probably see the problem as a little more obvious when your transform jumps to something like x = 4 when starting.
Instead, Time.deltaTime should be multiplied by the desired movement speed per second. And that resulting number will be your current movement per frame to apply to x (even when framerate changes).
Now, your game will run Update many times per second (60 FPS, for example, means sixty frames every second).
If you just did, for example, with speed = 10:
position.x += speed;
a game running at 60 FPS would move the x position by 600 units every second (10 * 60), but if the game was running at 80 FPS, the x position would move by 800 units every second.
This is not the 10 units per second we want.
To fix this, we need to multiply our speed by Time.deltaTime, which is the time since the last Update was called. This will of course be less than 1 (unless you have less than 1 FPS…), which has the effect of scaling the per-frame movement down to a very small number.
position.x += speed * Time.deltaTime;
But over 1 second you will make x increase by 10. This is completely independent of FPS: it could take 200 Update calls (200 FPS) or just 10 Update calls (10 FPS).
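As a sanity check of those numbers, here is a plain C# sketch (no Unity needed) that simulates one second of Update calls at a fixed frame rate; the distance covered comes out the same either way:

```csharp
// Simulates one second of movement at a steady frame rate.
float Simulate(int fps, float speed)
{
    float deltaTime = 1f / fps; // fixed frame time, for illustration only
    float x = 0f;
    for (int frame = 0; frame < fps; frame++) // fps frames = one second
        x += speed * deltaTime;
    return x;
}
// Simulate(200, 10f) ≈ 10 units, and Simulate(10, 10f) ≈ 10 units too.
```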
Yeah dude, I totally get what you are saying: I was multiplying the position by Time.deltaTime, not the rate of change in position, and that is where I went wrong.