I am currently making a “roll-a-ball” game with fully functional tilt controls, which work when the device is lying flat on a surface. After much searching and trial and error with many different methods, I have decided to ask for help. My tilt control script is as follows:
public class Tilt : MonoBehaviour
{
    public float g = 100f;

    void Start()
    {
        // Boost the gravity multiplier so the ball responds strongly to tilt.
        g = g + 110f;
    }

    void Update()
    {
        // Remap device axes to world axes for a device lying flat:
        // device X -> world X, device Z -> world Y (up/down), device Y -> world Z.
        Physics.gravity = new Vector3(
            Input.acceleration.x,
            Input.acceleration.z,
            Input.acceleration.y
        ) * g;
    }
}
I need to ‘zero’ the accelerometer to whatever position the user is holding the device in at level load, so that a starting device tilt of 45 degrees registers as 0 degrees of tilt in the game world, and so on.
I understand that I need to take the current accelerometer reading and subtract it somehow, but I would appreciate someone spelling out exactly what to do before I go bald. Thanks…
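For context, the subtraction idea I have in mind would look something like the sketch below: record `Input.acceleration` once in `Start()` as the zero reference, then subtract it from each frame's reading before applying gravity. The class name `CalibratedTilt` and the field `zeroOffset` are just illustrative names, and I gather that plain vector subtraction only behaves well for modest tilts; for large starting angles a rotation-based correction (e.g. `Quaternion.FromToRotation` from the stored reading to straight down, applied to each new reading) is said to be more robust. I have not verified this works:

```csharp
using UnityEngine;

public class CalibratedTilt : MonoBehaviour
{
    public float g = 210f;

    // Accelerometer reading captured at level load; treated as "flat".
    private Vector3 zeroOffset;

    void Start()
    {
        // Remember how the device is held right now.
        zeroOffset = Input.acceleration;
    }

    void Update()
    {
        // Subtract the stored offset so the starting orientation reads as zero tilt.
        Vector3 tilt = Input.acceleration - zeroOffset;

        // Same axis remap as before: device Z drives world Y, device Y drives world Z.
        Physics.gravity = new Vector3(tilt.x, tilt.z, tilt.y) * g;
    }
}
```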