I've seen some answers here that divide by Time.deltaTime, and the Unity documentation shows an example with Input.GetAxis("Horizontal") * Time.deltaTime.
At the same time, the documentation says "This is frame-rate independent; you do not need to be concerned about varying frame-rates when using this value."
So which one is it?
I need to calculate mouse speed, but I'm not sure whether it will be hardware-independent.
The documentation is correct on both counts; it isn't a matter of "which one is it". GetAxis("Horizontal") is keyboard/joystick input by default and has nothing to do with mouse movement. GetAxis("Mouse X") is mouse movement, and it is inherently frame-rate independent: it reports how far the mouse moved since the last frame, and if you move your mouse two inches, that takes you the same amount of time regardless of how fast your computer is. (If the frame rate affected your own physical movement speed, I would start to worry.) So you should absolutely not multiply mouse movement by Time.deltaTime, since that would make the code frame-rate dependent.
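To make that concrete, here's a minimal sketch. It assumes the default Input Manager axes "Mouse X" and "Horizontal"; the class name and the sensitivity/speed values are made up for illustration:

```csharp
using UnityEngine;

// Minimal sketch, not anyone's production code: shows where
// Time.deltaTime belongs and where it doesn't.
public class MovementExample : MonoBehaviour
{
    public float lookSensitivity = 2f; // hypothetical tuning value
    public float moveSpeed = 6f;       // units per second

    void Update()
    {
        // Mouse delta is already "distance moved this frame",
        // so it is NOT multiplied by Time.deltaTime.
        float yaw = Input.GetAxis("Mouse X") * lookSensitivity;
        transform.Rotate(0f, yaw, 0f);

        // To get mouse *speed* (distance per second), divide the
        // per-frame delta by the frame's duration instead.
        float mouseSpeed = Input.GetAxis("Mouse X") / Time.deltaTime;

        // A keyboard axis is a value held for the whole frame, so it
        // IS scaled by Time.deltaTime: moveSpeed units per second.
        float move = Input.GetAxis("Horizontal") * moveSpeed * Time.deltaTime;
        transform.Translate(move, 0f, 0f);
    }
}
```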
Time.deltaTime is the duration of the last frame, i.e. roughly 1 second divided by the frame rate. So no matter what frame rate you currently get, multiplying your speed by Time.deltaTime scales the per-frame movement so that it covers the same distance per second.
Let’s take some extreme values to show what I mean:
A very slow computer rendering 1 frame every 1.5 seconds: Time.deltaTime is 1.5, so with a speed of 6 you move 6 × 1.5 = 9 units in that frame.
A fast computer at 60 frames per second: 6 × (1/60) = 0.1 units, but that's just for 1 frame. Over the same 1.5 seconds that's 90 frames, so 90 × 0.1 = 9 units.
No matter how many frames you get, the movement covers the same distance in the same time, and is therefore frame-rate independent.
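Here's that arithmetic as a quick sketch (plain C# with the hypothetical frame times from above, no Unity required):

```csharp
// Distance covered in 1.5 seconds of wall-clock time at two frame
// rates, with movement scaled by deltaTime each frame.
float speed = 6f;

// Slow machine: a single frame that took 1.5 s.
float slowDistance = speed * 1.5f; // 6 * 1.5 = 9

// Fast machine: 90 frames of 1/60 s each (the same 1.5 s total).
float fastDistance = 0f;
for (int frame = 0; frame < 90; frame++)
    fastDistance += speed * (1f / 60f); // 6 * 0.01666... = 0.1 per frame

// Both come out to 9 (up to floating-point rounding):
System.Console.WriteLine($"{slowDistance} vs {fastDistance}");
```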
Hope that makes sense.
EDIT: for clarification, the mouse X axis is not of itself frame-rate independent, but multiplying the movement by Time.deltaTime makes it so.