Input.GetAxis(...): isn't it meant to be framerate independent?

Hey guys,

I’ve been working on a weapon system for a while as a learning exercise, and I’ve come across a small issue. The Script Reference for Input.GetAxis(…) says it is already framerate independent and does not need to be multiplied by Time.deltaTime. The issue I’m getting is that whether I multiply it by Time.deltaTime or not, it is still affected by the framerate.

This is a barebones version of my weapon sway function which is called from the Update function.

// Read this frame's mouse movement, scaled by the sway amount
mouse_x = Input.GetAxis("Mouse X") * moveAmount;
mouse_y = Input.GetAxis("Mouse Y") * moveAmount;

// Offset the sway target from the rest position; vertical sway is damped to 20%
Vector3 target = new Vector3(currentPosition.x + mouse_x, currentPosition.y + (mouse_y * 0.2f), currentPosition.z);

// Smooth the weapon toward the target each frame
transform.localPosition = Vector3.Lerp(transform.localPosition, target, smoothSpeed * Time.deltaTime);

I have also made a small demo showing the effect. (Press F1 - F4 to lock the framerate at different amounts)
https://dl.dropboxusercontent.com/u/108986232/Win32.zip

Script Reference page.

I’ve been scratching my head on this for a while and would greatly appreciate any help you guys can give me, or you can tell me I’m stupid and missing something. (Probably the case :smile:)

Cheers

  • Matt

Anybody able to help out with this?

I don’t really understand your problem. If you move your mouse 5 px per frame, that should be the value you get. If you mean the internal interpolation of GetAxis, then I can’t help, but you could just use GetAxisRaw and do your own interpolation.
But I suspect what you really mean is that the lerp looks wrong. That may be due to lerping from localPosition to target: it will slow down towards the end, and the fewer frames you have, the more noticeable it becomes.
If so, you could try replacing it with MoveTowards().
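For what it’s worth, the framerate dependence comes from the Lerp factor `smoothSpeed * Time.deltaTime`: applying that factor once per frame does not compose to the same result at different framerates. Here is a minimal sketch of the math in plain Python (outside Unity, so the names `simulate` and `smooth_speed` are made up for this demo; the factor `1 - exp(-smooth_speed * dt)` is one common framerate-independent alternative, not necessarily the only fix):

```python
import math

def lerp(a, b, t):
    """Linear interpolation, clamped like Unity's Vector3.Lerp."""
    return a + (b - a) * max(0.0, min(t, 1.0))

def simulate(dt, total_time, smooth_speed=8.0, exp_damp=False):
    """Run the per-frame smoothing step toward target 1.0 at a fixed frame time dt."""
    pos, target = 0.0, 1.0
    for _ in range(round(total_time / dt)):
        if exp_damp:
            # framerate-independent factor: same end result for any dt
            t = 1.0 - math.exp(-smooth_speed * dt)
        else:
            # the factor from the post: smoothSpeed * Time.deltaTime
            t = smooth_speed * dt
        pos = lerp(pos, target, t)
    return pos

# naive factor: 20 fps and 100 fps end up in different places after 0.1 s
at_20 = simulate(1 / 20, 0.1)
at_100 = simulate(1 / 100, 0.1)

# exponential factor: both framerates land on the same position
e_20 = simulate(1 / 20, 0.1, exp_damp=True)
e_100 = simulate(1 / 100, 0.1, exp_damp=True)
```

In Unity terms that would be something like `Vector3.Lerp(a, b, 1f - Mathf.Exp(-smoothSpeed * Time.deltaTime))`, or just `Vector3.MoveTowards` as suggested above, which moves at a constant speed instead of easing out.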