I want to align the airplane's forward (local z) axis with a target vector over time. I don't want to use slerp; instead I apply a pitch (rotation around x) and a yaw (rotation around y) every frame.
The pitch and yaw differences to target are calculated as follows:
float yawDiff = Mathf.Atan2(targetZLocal.x, targetZLocal.z) * Mathf.Rad2Deg;
float pitchDiff = Mathf.Atan2(targetZLocal.y, Mathf.Sqrt(targetZLocal.x * targetZLocal.x + targetZLocal.z * targetZLocal.z)) * Mathf.Rad2Deg;
Basically, pitchDiff is the elevation angle of the target vector above the airplane's local x-z plane. A pure yaw (rotation around the local y axis) leaves the y axis fixed, so this angle should stay the same no matter how the airplane yaws.
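For reference, here is the same computation redone in plain Python (no Unity), which confirms that both target vectors below start with the identical pitchDiff and differ only in yawDiff:

```python
import math

def diffs(target_local):
    """Yaw/pitch differences to a target in the plane's local frame,
    mirroring the Atan2-based formulas above."""
    x, y, z = target_local
    yaw_diff = math.degrees(math.atan2(x, z))
    pitch_diff = math.degrees(math.atan2(y, math.hypot(x, z)))
    return yaw_diff, pitch_diff

print(diffs((0, 1, math.sqrt(2))))  # ≈ (0.0, 35.26439)
print(diffs((1, 1, 1)))             # ≈ (45.0, 35.26439)
```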
The rotation is performed like this every frame:
transform.Rotate(pitchValue, yawValue, 0);
The issue I'm experiencing is the difference in the pitchDiff value after the first frame for two cases of the target vector: (0, 1, √2) (white line on image) and (1, 1, 1) (black line). The second is the first rotated by 45° around the y axis.
In the first case only pitch is needed, so the plane is rotated in the first frame by (-0.06, 0, 0), and in the second case yaw is also needed, so the plane is rotated by (-0.06, 0.06, 0).
The starting pitchDiff in both cases is 35.26439. In the first case, where no yaw is applied, pitchDiff in the next frame comes out to 35.20439; in the second case, with yaw, it comes out to 35.22191. That doesn't seem like much for a single frame, but it adds up enough to throw off the alignment algorithm I'm trying to implement.
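In case it helps narrow things down, here is a minimal Python reproduction of a single frame. It assumes my understanding of Unity's convention is right: `Rotate(pitch, yaw, 0)` with `Space.Self` composes z, then x, then y around the local axes, so the frame rotation is R = Ry(yaw) · Rx(pitch), and the target in the new local frame is Rᵀ · target (undo yaw first, then undo pitch), in a left-handed, y-up coordinate system:

```python
import math

def pitch_diff(local):
    """Same formula as pitchDiff above: elevation over the local x-z plane."""
    x, y, z = local
    return math.degrees(math.atan2(y, math.hypot(x, z)))

def target_in_local_frame(target, pitch, yaw):
    """Target vector in the plane's frame after Rotate(pitch, yaw, 0),
    assuming frame rotation R = Ry(yaw) * Rx(pitch); local = R^T * target."""
    x, y, z = target
    a = math.radians(-yaw)    # undo yaw (+yaw turns forward toward +x)
    x, z = x * math.cos(a) + z * math.sin(a), -x * math.sin(a) + z * math.cos(a)
    b = math.radians(-pitch)  # undo pitch (+pitch tips forward toward -y)
    y, z = y * math.cos(b) - z * math.sin(b), y * math.sin(b) + z * math.cos(b)
    return x, y, z

# Case 1: pitch only
print(pitch_diff(target_in_local_frame((0, 1, math.sqrt(2)), -0.06, 0.0)))  # ≈ 35.20439
# Case 2: pitch and yaw combined
print(pitch_diff(target_in_local_frame((1, 1, 1), -0.06, 0.06)))            # ≈ 35.22191
```

This reproduces both values above outside Unity, so the effect seems to come from how the pitch and yaw rotations are composed rather than from anything Unity-specific.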
Why does this happen if yaw does not affect the angle between the target vector and the x-z plane? A difference in the second decimal place seems too big to be numerical error.
