**Question**: I have two transforms in my scene. The first one is my camera, the second one is a spaceship. Both of these can move and rotate freely (6dof) in the scene. I need to get the pitch/yaw difference between the two transform.rotations, i.e. the angles at which I’m looking at the ship.

**Examples**: If the spaceship is facing straight away from us and we’re right behind its tail, the yaw should be 180 deg and the pitch 0 deg. Or if the ship is facing left (relative to the camera) and we’re seeing it top-down (i.e. we see only its ‘roof’), both yaw and pitch would be 90 deg.

**Attempt**: I tried the following after reading this:

```csharp
int CalculateYaw()
{
    // get a vector from the ship to the camera
    Vector3 shipToCamera = Camera.main.transform.position - transform.position;
    // get a numeric angle for each vector, on the X-Z plane (relative to world forward)
    float angleA = Mathf.Atan2(transform.forward.x, transform.forward.z) * Mathf.Rad2Deg;
    float angleB = Mathf.Atan2(shipToCamera.x, shipToCamera.z) * Mathf.Rad2Deg;
    // return the signed difference in these angles
    return (int)Mathf.DeltaAngle(angleA, angleB);
}
// same for pitch along the Y-Z plane
```
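For reference, the pitch variant was presumably the same code with the Atan2 taken over the y and z components. The behaviour that produces can be reproduced outside Unity; below is a small Python stand-in for the camera-at-0/0/0, ship-at-0/0/30 setup (`delta_angle` and `pitch_like_question` are hypothetical names; `delta_angle` mimics Mathf.DeltaAngle):

```python
import math

def delta_angle(a, b):
    """Smallest signed difference b - a in degrees (mimics Mathf.DeltaAngle)."""
    d = (b - a) % 360.0
    return d - 360.0 if d > 180.0 else d

def pitch_like_question(forward, ship_to_camera):
    """Pitch computed the same way as the yaw code above, but on the
    world Y-Z plane: Atan2 over (y, z) of each world-space vector."""
    angle_a = math.degrees(math.atan2(forward[1], forward[2]))
    angle_b = math.degrees(math.atan2(ship_to_camera[1], ship_to_camera[2]))
    return delta_angle(angle_a, angle_b)

# Camera at 0/0/0, ship at 0/0/30, spinning about its up (Y) axis.
ship_to_camera = (0.0, 0.0, -30.0)          # camera position minus ship position
for spin in (0, 89, 91, 180, 269, 271):
    a = math.radians(spin)
    forward = (math.sin(a), 0.0, math.cos(a))  # ship forward after spinning
    print(spin, round(pitch_like_question(forward, ship_to_camera)))
```

Running this prints 180 for every spin angle where the ship faces away from the camera and 0 where it faces toward it, jumping at the 90-degree crossings: the y components are always zero, so the whole result is driven by the sign of z.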

But the code doesn’t work as I expect. When the camera is at 0/0/0 and the spaceship at 0/0/30 in the scene and I let the spaceship rotate about its up axis, the yaw behaves as it should (ever incrementing), but the pitch jumps between 0 (where it should stay) and 180: it reads 180 whenever the ship is facing away from the camera and 0 when it’s facing toward me. I suspect this is because the angles are measured relative to the world forward when they should be relative to the spaceship’s transform.forward.
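One way to test that suspicion is to express the ship-to-camera direction in the ship’s local frame first (in Unity, transform.InverseTransformDirection(shipToCamera) would do this) and only then take the Atan2s. A minimal Python sketch of the idea, assuming the Unity axis convention (x = right, y = up, z = forward) and, for brevity, a ship that only yaws rather than the full 6dof rotation (all function names here are hypothetical):

```python
import math

def world_to_ship_local(direction, ship_yaw_deg):
    """Inverse-rotate a world-space direction by the ship's yaw -- a
    stand-in for Unity's transform.InverseTransformDirection. A real
    6dof ship needs the inverse of its full rotation, not just yaw."""
    a = math.radians(-ship_yaw_deg)              # inverse = rotate back
    x, y, z = direction
    return (x * math.cos(a) + z * math.sin(a),   # left-handed Y rotation,
            y,                                   # Unity convention
            -x * math.sin(a) + z * math.cos(a))

def yaw_pitch_in_ship_frame(local_dir):
    """Yaw/pitch in degrees of a direction already in ship-local
    coordinates (x = right, y = up, z = forward)."""
    x, y, z = local_dir
    yaw = math.degrees(math.atan2(x, z))
    pitch = math.degrees(math.atan2(y, math.hypot(x, z)))
    return yaw, pitch

# Camera at 0/0/0, ship at 0/0/30, spinning about its up axis.
ship_to_camera = (0.0, 0.0, -1.0)   # normalized camera - ship
for spin in (0, 90, 180):
    yaw, pitch = yaw_pitch_in_ship_frame(world_to_ship_local(ship_to_camera, spin))
    print(spin, round(yaw), round(pitch))
```

With this, the pitch stays at 0 while the ship spins about its up axis, and the yaw runs 180 → 90 → 0 as the ship turns from tail-on to facing the camera, matching the examples above. In Unity itself the two helpers collapse into one InverseTransformDirection call followed by the two Atan2s.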

And this is where I’m lost, any help/insight would be greatly appreciated!