Well, if you want to get technical, you can’t set it, because it’s not a variable. Euler angles are not explicitly stored. Behind the scenes, Unity uses the Quaternion for everything, and when you ask for Euler angles it converts on the fly. So when you write
transform.eulerAngles = <something>
what Unity actually does is effectively
transform.rotation = Quaternion.Euler(<something>)
and then when you ask what the eulerAngles currently are, Unity effectively does
return transform.rotation.eulerAngles
Since there’s more than one possible set of Euler angles for representing a given Quaternion, you are not guaranteed to get back the same ones that you started with. You might get a different set of angles that represents the same combined rotation.
And since Euler angles are not linearly independent, changing one component (like X) is sometimes equivalent to changing other components. For example, the Euler angles (180, 0, 0) are equivalent to (0, 180, 180). So if you set a Transform’s eulerAngles equal to one of those, and then asked Unity what the eulerAngles currently are, it could return the other one and it wouldn’t be “wrong”.
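You can convince yourself of this in Unity directly. Here's a quick sketch (my own, not from the question) comparing the two triples; `Quaternion.Angle` measures the angle between two orientations, so equivalent rotations come out at (approximately) zero:

```
using UnityEngine;

public class EulerEquivalenceDemo : MonoBehaviour
{
    void Start()
    {
        // Two different Euler triples...
        Quaternion a = Quaternion.Euler(180f, 0f, 0f);
        Quaternion b = Quaternion.Euler(0f, 180f, 180f);

        // ...but the same orientation: the angle between them is ~0.
        Debug.Log("Angle between a and b: " + Quaternion.Angle(a, b));
    }
}
```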
Setting just the X of the Euler angles is a really weird thing to do, because it doesn’t have any clear geometric meaning. If you set all the angles to something like (100, 0, 0), then that has a clear geometric meaning: It refers to a specific orientation in 3D space, and while Unity might not preserve those exact numbers, it will preserve that orientation. If you ask what the Euler angles are afterward, they might not be (100, 0, 0), but they’ll be equivalent to that.
But setting just the X doesn’t guarantee any particular result. Remember how (180, 0, 0) is equivalent to (0, 180, 180)? Well, once you override the X angle with 100, you could get either (100, 0, 0) or (100, 180, 180), which are not equivalent. Changing just one angle means you don’t know what you’re going to get.
So I replicated your test, but with more logging:
void Update()
{
    if (Input.GetKeyDown("space"))
    {
        Vector3 angles = transform.eulerAngles;
        Debug.Log("Before, angles = " + angles);

        angles.x = 100;
        Debug.Log("Modified, angles = " + angles);

        transform.eulerAngles = angles;
        Debug.Log("After, eulerAngles = " + transform.eulerAngles);
    }
}
Before, angles = (0.0, 0.0, 0.0)
Modified, angles = (100.0, 0.0, 0.0)
After, eulerAngles = (80.0, 180.0, 180.0)

Before, angles = (80.0, 180.0, 180.0)
Modified, angles = (100.0, 180.0, 180.0)
After, eulerAngles = (80.0, 0.0, 0.0)

Before, angles = (80.0, 0.0, 0.0)
Modified, angles = (100.0, 0.0, 0.0)
After, eulerAngles = (80.0, 180.0, 180.0)
Unity is giving you back equivalent orientations, but not the exact numbers that you set. (I suspect it’s trying to follow a rule that pitch shouldn’t be greater than 90 degrees, because that’s how first-person games traditionally work.)
The rotation values displayed in the inspector for this object are doing something weird that doesn’t necessarily match the exact numbers that you set or the exact numbers that Unity gives back, and I’m not totally sure why. Could be there’s a different set of conversion rules for the inspector for some reason.
When you modify Euler angles through the inspector, Unity remembers the exact Euler angles so that they don’t appear to change. But this doesn’t apply when you modify them through code at runtime.
If you want to build a user input system where the user controls something through its Euler angles, the usual workaround is to store the current Euler angles yourself instead of asking Unity for them. When you get an input to change them, you update your own variables and then set the Transform to match; you never copy the Transform's values back into your variables.
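For example, here's a minimal mouse-look sketch along those lines (the variable names and sensitivity value are my own, just for illustration):

```
using UnityEngine;

public class EulerLook : MonoBehaviour
{
    public float sensitivity = 2f;

    // Our own copy of the Euler angles -- the single source of truth.
    // We write these TO the Transform but never read them back FROM it.
    float pitch;
    float yaw;

    void Update()
    {
        yaw += Input.GetAxis("Mouse X") * sensitivity;
        pitch -= Input.GetAxis("Mouse Y") * sensitivity;

        // Clamping OUR numbers is reliable; clamping values read back
        // from transform.eulerAngles is not, for the reasons above.
        pitch = Mathf.Clamp(pitch, -89f, 89f);

        transform.rotation = Quaternion.Euler(pitch, yaw, 0f);
    }
}
```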
(Obviously, this only works if ALL rotations go through your code. If you also want the object to rotate in response to physics or something, you have a much harder problem.)