Hi,
I'm making a game for Android where you control a car from above; by rotating the phone you rotate the car in the opposite direction, so that the car appears to always face the same direction in the real world. I'm using the gyroscope for this purpose.
Input.gyro.enabled = true;
Input.gyro.updateInterval = 0.01f;
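For completeness, the whole setup lives in my Start() method; the SystemInfo.supportsGyroscope check is just a guard I added, I'm not sure it's strictly necessary:

void Start ()
{
    // enable the gyroscope before reading from it
    if (SystemInfo.supportsGyroscope)
    {
        Input.gyro.enabled = true;
        Input.gyro.updateInterval = 0.01f;
    }
}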
The problem is that both approaches I've tried are really inaccurate.
First I tried adding the rotationRate to the current rotation:
player.transform.localEulerAngles += new Vector3 (0, Input.gyro.rotationRate.z, 0);
When I start the game, the player rotates faster than it should: when I turn my phone 100°, the player performs a full 360° turn.
Maybe I'm missing some reference to the framerate, so I tried computing a multiplier by dividing Time.deltaTime by the updateInterval:
player.transform.localEulerAngles += new Vector3 (0, Input.gyro.rotationRate.z, 0) * (Time.deltaTime / Input.gyro.updateInterval);
But that made the car rotate even faster and out of control, and if I swap updateInterval and deltaTime it barely moves at all.
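From what I've read, rotationRate might be reported in radians per second rather than degrees, so maybe the missing factor is a radians-to-degrees conversion combined with Time.deltaTime. This is just an untested guess at what the Update() code would look like (the Rad2Deg assumption is mine):

void Update ()
{
    // assumption: rotationRate is in radians/second, so convert it to
    // degrees and integrate it over the duration of the last frame
    float degreesPerSecond = Input.gyro.rotationRate.z * Mathf.Rad2Deg;
    player.transform.localEulerAngles += new Vector3 (0, degreesPerSecond * Time.deltaTime, 0);
}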
My other approach was setting the rotation directly from the attitude of the device:
player.transform.localEulerAngles = new Vector3 (0, Input.gyro.attitude.eulerAngles.z, 0);
That works for a short moment, but then the results get incredibly inaccurate, with the car skipping more than 10° on fast movements.
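If the attitude route is the right one after all, I wondered whether blending towards the device yaw instead of snapping to it would hide those skips; a rough sketch (the 0.2f blend factor is a number I made up and would need tuning):

void Update ()
{
    // sketch: ease towards the device yaw instead of assigning it directly
    Quaternion target = Quaternion.Euler (0, Input.gyro.attitude.eulerAngles.z, 0);
    player.transform.localRotation = Quaternion.Slerp (player.transform.localRotation, target, 0.2f);
}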
I don't quite know which of these is the best solution, but since adding the rotationRate proved to be the smoother option I would like to try it that way; I just don't know what to multiply it by to get my desired result.
Thanks in advance for any help,
FlomoN