Thanks for your reply, Edy.
I’ve removed the multiplication by Time.fixedDeltaTime, so the line now looks like this:
_rigidBody.AddRelativeForce(Vector3.forward * finalThrust);
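For context, here’s roughly how that line sits in the script. This is just a sketch; the class name and the way finalThrust gets set are placeholders for what’s actually in my project:

using UnityEngine;

// Placeholder component name; in my project this is part of a bigger ship controller.
public class ShipThruster : MonoBehaviour
{
    [SerializeField] float finalThrust = 10f; // really worked out from player input elsewhere
    Rigidbody _rigidBody;

    void Awake()
    {
        _rigidBody = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        // Old line: _rigidBody.AddRelativeForce(Vector3.forward * finalThrust * Time.fixedDeltaTime);
        _rigidBody.AddRelativeForce(Vector3.forward * finalThrust);
    }
}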
Now the object’s velocity scales with the Fixed Timestep setting. If I set Fixed Timestep to 0.01, the object moves at half the speed it does when Fixed Timestep is set to 0.02.
This is exactly the opposite to what I’d expect.
I thought that “Fixed Timestep” controlled how often FixedUpdate is called, and that finalThrust would therefore be added to the object’s velocity every FixedUpdate. So the lower the Fixed Timestep, the more frequently force would be added to the object and, overall, the more force per second. Therefore, if you multiplied the force by Time.fixedDeltaTime (the more frequent the update, the lower the multiplier), the two would cancel out and you’d get a consistent force no matter how frequently FixedUpdate ran.
If Fixed Timestep is 0.02, that means FixedUpdate is called 50 times per second.
If the force is 10 and it’s added 50 times per second, you get 500 units of force per second.
When Fixed Timestep is set to 0.01, FixedUpdate is called 100 times per second.
If the force is 10 and it’s added 100 times per second, that should be 1000 units of force per second.
So (in my mind) the second one should be faster, but it’s not; it’s slower.
In my mind you should be able to multiply the force by the fixed timestep to end up with the same overall force per second, whatever the call frequency:
10 * 0.02 * 50 = 10
10 * 0.01 * 100 = 10
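To show what I mean, this is the kind of throwaway counter I’d drop on the object to check the calls-per-second side of this (class and field names are just placeholders for a quick test):

using UnityEngine;

// Counts FixedUpdate calls per real-time second and logs the Rigidbody’s speed,
// so I can see whether 0.01 really means 100 calls per second and more total force.
public class FixedUpdateCounter : MonoBehaviour
{
    Rigidbody _rigidBody;
    int _calls;
    float _timer;

    void Awake()
    {
        _rigidBody = GetComponent<Rigidbody>();
    }

    void FixedUpdate()
    {
        _calls++;
    }

    void Update()
    {
        _timer += Time.deltaTime;
        if (_timer >= 1f)
        {
            Debug.Log("FixedUpdate calls in the last second: " + _calls + ", speed: " + _rigidBody.velocity.magnitude);
            _calls = 0;
            _timer = 0f;
        }
    }
}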
But that’s not what I’m seeing. As I said, the higher the Fixed Timestep (lower frequency?!), the faster the object moves. In fact it only seems to cancel out when I divide the force by the fixed timestep, which makes no sense to me at all:
(10 / 0.02) * 50 = 25000
(10 / 0.01) * 100 = 100000
Yet somehow, this makes the object move at a consistent speed across different Fixed Timestep settings.
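To be concrete, the version that ends up looking consistent in my scene is something like this (same script as above with just the one line changed; I’m not saying this is the right way, it’s just what I’m observing):

// Dividing by the timestep instead of multiplying by it:
_rigidBody.AddRelativeForce(Vector3.forward * (finalThrust / Time.fixedDeltaTime));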
I must be making myself look very stupid here; I’m obviously misunderstanding something dumb. Any idea what it is?