How to set tilt offset?

I am currently making a “roll-a-ball” game with fully functional tilt controls, which work relative to the device lying flat on a surface. After much searching and trial and error with many different methods, I have decided to ask for help… my tilt control script is as follows:

public class Tilt : MonoBehaviour
{
    public float g = 100f;

    void Start()
    {
        g = g + 110; // bump the gravity multiplier to 210 at load
    }

    void Update()
    {
        // Map device tilt to world gravity: the device’s z axis (out of the
        // screen) drives world y, so lying flat means gravity points down.
        Physics.gravity = new Vector3(
            Input.acceleration.x,
            Input.acceleration.z,
            Input.acceleration.y
        ) * g;
    }
}

I need to ‘zero’ the accelerometer to whatever position the user is holding the device in at level load, so that a starting device tilt of 45 degrees registers as 0 degrees of tilt in the game world, and so on.
I understand that I need to take the current accelerometer reading and subtract it, or something along those lines, but I need someone to make it pretty clear exactly what to do before I go bald :stuck_out_tongue: Thanks…
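For reference, the naive version I keep coming back to is just capturing the reading in Start and subtracting it every frame. A rough, untested sketch (the class and field names are just mine):

using UnityEngine;

public class TiltZeroed : MonoBehaviour
{
    public float g = 210f;         // same effective multiplier as above (100 + 110)
    private Vector3 zero;          // accelerometer reading captured at level load

    void Start()
    {
        zero = Input.acceleration; // treat the current pose as “flat”
    }

    void Update()
    {
        // Offset the live reading by the captured pose before mapping it.
        Vector3 a = Input.acceleration - zero;
        Physics.gravity = new Vector3(a.x, a.z, a.y) * g;
    }
}

The obvious flaw is that at the starting pose the offset reading is (0, 0, 0), so Physics.gravity is zero and the ball just floats, which makes me suspect plain subtraction isn’t the whole story.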

I’m sure this has been done a million times over. Thanks in advance!

This is where Google comes in handy…
http://answers.unity3d.com/questions/289184/gyro-quaternion-offset.html
http://answers.unity3d.com/questions/927515/accelerometer-calibration-2.html
http://answers.unity3d.com/questions/703782/unable-to-recalibrate-accelerometer-to-the-initial.html

etc.
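If I remember right, the gist of those threads is: rather than subtracting, take one accelerometer snapshot, build a Quaternion.FromToRotation from the flat reference (0, 0, -1) to that snapshot, and rotate every later reading by its inverse. A rough sketch adapted to the gravity mapping in your script (untested, names are mine):

using UnityEngine;

public class CalibratedTilt : MonoBehaviour
{
    public float g = 210f;
    private Quaternion calibration;

    void Start()
    {
        // Whatever tilt the device has right now becomes the new “flat”.
        // Lying flat face-up reads roughly (0, 0, -1), so build the rotation
        // that takes flat to the current pose, and store its inverse.
        Quaternion rotate = Quaternion.FromToRotation(new Vector3(0f, 0f, -1f), Input.acceleration);
        calibration = Quaternion.Inverse(rotate);
    }

    void Update()
    {
        // Re-express the live reading relative to the calibrated pose,
        // then apply the same axis mapping as before.
        Vector3 a = calibration * Input.acceleration;
        Physics.gravity = new Vector3(a.x, a.z, a.y) * g;
    }
}

Held at the starting pose, the corrected reading comes back as (0, 0, -1), the same as lying flat, so gravity keeps its full downward magnitude instead of collapsing to zero the way plain subtraction does.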
