Hi there!
I'm running into an annoying issue, and I don't know whether it's something I've misunderstood or whether the implementation of the Input.GetAxisRaw function is broken.
How I understand the function:
Input.GetAxisRaw should return a value in the [-1, 1] range (I'm using an Xbox 360 controller), depending on the position of the analog stick (basically what Windows shows in the Game Controllers window), and this value should not be affected at all by the axis's Gravity or Sensitivity settings.
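A minimal test along these lines is enough to see the behaviour (the axis name "LeftStickX" is just an example name for an Input Manager axis mapped to the stick's X axis; rename it to match your own setup):

```csharp
using UnityEngine;

public class RawAxisTest : MonoBehaviour
{
    void Update()
    {
        // "LeftStickX" is an example Input Manager axis
        // (Type: Joystick Axis, Axis: X axis).
        float raw = Input.GetAxisRaw("LeftStickX");
        Debug.Log("GetAxisRaw: " + raw);
    }
}
```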
What happens:
If the Sensitivity is below 1, the returned value is in the [-Sensitivity, Sensitivity] range.
If the Sensitivity is above 1, the returned value is in the [-1, 1] range.
That seems completely broken to me, as I don't see how to get the real raw value of the analog stick without dividing the returned value by the Sensitivity.
In my opinion, Input.GetAxisRaw should return only the hardware value of the axis, without applying the Gravity, Dead or Sensitivity settings.
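For now, the only workaround I can think of is to hard-code a copy of the Sensitivity value and divide it back out, something like this sketch (assuming the Input Manager settings can't be read at runtime, and reusing the example "LeftStickX" axis name):

```csharp
using UnityEngine;

public class RawAxisWorkaround : MonoBehaviour
{
    // Hard-coded copy of the Sensitivity value set in the Input Manager,
    // since as far as I know those settings can't be read at runtime.
    const float sensitivity = 0.5f;

    void Update()
    {
        float scaled = Input.GetAxisRaw("LeftStickX"); // observed range: [-sensitivity, sensitivity]
        float raw = scaled / sensitivity;              // undo the scaling to recover the hardware value
        Debug.Log("Recovered raw value: " + raw);
    }
}
```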
I tested this on Unity 4.6.4f1 and 5.0.1f1, and the results are the same.
Thanks in advance!