I am making a little tilt marble labyrinth game as my first Unity project, and I am trying to get the camera to rotate to simulate the act of tilting the game table. The camera is rotated to (x = 90, y = 0, z = 0), so it looks directly down at the center of the game board beneath it.
Rotating around the x axis works as expected with vertical input, but rotating around the z axis simply spins the camera rather than tilting it on its relative z axis – and I understand why: since the camera is already pitched to 90 degrees, a z rotation spins it around the axis it is looking along instead of tilting it.
My problem is that I haven't figured out how to solve this. I want to be able to rotate around both the x and z axes, given that x = 90 is center. I've been trying for hours to come up with the math that would solve this, but have come up short. I must ask for your help!
Here is an example of my code that produces the issue I described:
Quaternion target = Quaternion.Euler(tiltAroundX + 90f, 0f, tiltAroundZ);
cameraController.transform.rotation = Quaternion.Lerp(cameraController.transform.rotation, target, tiltForce);
The cameraController is simply the component script on the camera object. tiltAroundX and tiltAroundZ are angles derived from accelerometer input, and tiltForce is just a factor to control tilt speed. Pretty basic, but here I struggle.
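For completeness, here is a minimal sketch of the whole script as I currently have it. The class name, the accelerometer scaling, and the exact field setup are just how I've wired things up for illustration; the two lines above are the part that misbehaves:

```csharp
using UnityEngine;

// Hypothetical wrapper script; attached to a controller object in the scene.
public class TableTilt : MonoBehaviour
{
    public Transform cameraController; // the camera, rotated (90, 0, 0) at rest
    public float tiltForce = 0.1f;     // interpolation factor passed to Lerp
    public float maxTilt = 10f;        // illustrative scaling of accelerometer input

    void Update()
    {
        // Map accelerometer input to tilt angles in degrees
        // (the exact mapping is not the issue, just how I read the input)
        float tiltAroundX = Input.acceleration.y * maxTilt;
        float tiltAroundZ = -Input.acceleration.x * maxTilt;

        // Offset x by 90 so the camera keeps looking down at the board,
        // then blend toward the target each frame
        Quaternion target = Quaternion.Euler(tiltAroundX + 90f, 0f, tiltAroundZ);
        cameraController.rotation = Quaternion.Lerp(cameraController.rotation, target, tiltForce);
    }
}
```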
Any help would be greatly appreciated, thank you!