Unity Coordinate System

I don’t know which sub-forum this fits best, since it seems to touch so many different topics. My actual question is about Unity’s coordinate system and how coordinates change. Sometimes the coordinates seem to be influenced by the scale values of the same object, sometimes not. It has a weird behavior that I am struggling with.

I recently wanted to reposition a so-called Empty Object that I had created as a kind of folder to group up different other sub-objects. I guess you know what I mean. Anyway, when I repositioned it by dragging the Empty Object’s coordinate gizmo, the sub-objects moved along with it more or less accordingly. I say “more or less” because I am no longer sure whether it really was correct or whether I just assumed it was. However, when I moved it back to where it originally was by typing in the coordinate value, everything was messed up: the sub-objects ended up at different distances from the parent. Things like this have happened to me often, so I wonder what I am doing wrong. And no, I did not mess things up by typing coordinates into the sub-objects’ local transforms.

I can even reproduce it: I take the same object, drag it along an axis, then move it back by typing in the coordinate, and “et voilà”, the distance between where the sub-objects should be and where they actually are has increased even further. It’s really confusing, and I don’t see any reason why it should even be like that. I can somewhat understand how a scale factor could interact with the coordinates, but then it should do so in both directions; otherwise I don’t see any use for a scale factor at all if it makes objects behave that chaotically.
I wish the coordinate system were more intuitive.

X is right, Y is up, Z is forward.
It is a left-handed coordinate system.

Internally, a transform’s position and rotation are stored as a vector and a quaternion. The Euler angles you see are provided for convenience and are not stored internally.
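To illustrate (this is not Unity’s API, just a toy Python sketch of the math): the Inspector’s Euler angles are derived from the stored quaternion. For a rotation about a single axis, the quaternion is simply built from half the angle. Here, for a yaw (rotation about the up axis):

```python
import math

def yaw_to_quaternion(degrees):
    """Quaternion (w, x, y, z) for a rotation about the Y (up) axis.

    Unity stores rotations as quaternions; the Euler angles shown in
    the Inspector are a convenience view derived from this.
    """
    half = math.radians(degrees) / 2.0
    return (math.cos(half), 0.0, math.sin(half), 0.0)

w, x, y, z = yaw_to_quaternion(90)
print(round(w, 4), round(y, 4))  # both ~0.7071
```

So a 90° yaw in the Inspector corresponds internally to the quaternion (≈0.7071, 0, ≈0.7071, 0), not to the number 90 being stored anywhere.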

The coordinate values you see are local, i.e. stored in parent space.

Scaling affects children. Meaning if you have one child at (0, 0, 1) and another at (0, 0, 0), but set a scale factor of 2 on the parent object, their actual coordinates in world space will be two units apart.
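A minimal sketch of that arithmetic (plain Python, not Unity code; rotation is omitted for simplicity): the parent’s scale multiplies the child’s local offset before the parent’s position is added.

```python
def world_position(parent_pos, parent_scale, child_local_pos):
    """World position of a child under a non-rotated parent.

    The parent's scale is applied to the child's local offset
    before the parent's position is added (rotation omitted).
    """
    return tuple(p + s * c
                 for p, s, c in zip(parent_pos, parent_scale, child_local_pos))

parent = (0.0, 0.0, 0.0)
scale = (2.0, 2.0, 2.0)
a = world_position(parent, scale, (0.0, 0.0, 1.0))  # -> (0.0, 0.0, 2.0)
b = world_position(parent, scale, (0.0, 0.0, 0.0))  # -> (0.0, 0.0, 0.0)
# a and b end up two world units apart, even though their local
# z-offsets differ by only one unit.
```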

I assume you got confused by parent-child relationships and local/world coordinates.

Child object positions (in the Transform component) are always relative to their parent. For example, if the child’s position is (1,1,1), it is offset by +1 on every axis from its parent; it does not mean the child is at world position (1,1,1), except in the case where the parent’s position happens to be (0,0,0).
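The same point as a tiny Python sketch (again, just the arithmetic, assuming no rotation or scale on the parent):

```python
def child_world_position(parent_world, child_local):
    """Child world position under a parent with no rotation and scale 1."""
    return tuple(p + c for p, c in zip(parent_world, child_local))

# A child's local position (1,1,1) only means world (1,1,1) if the
# parent sits at the origin:
origin_case = child_world_position((0, 0, 0), (1, 1, 1))  # (1, 1, 1)
offset_case = child_world_position((5, 0, 2), (1, 1, 1))  # (6, 1, 3)
print(origin_case, offset_case)
```

The child’s Inspector value never changes in either case; only its world position does.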

OK, I got you some screenshots. What you see is the coordinate gizmo of my GameObject “VR”. The blue dot is a dummy model of my controller that is actually placed at the local coordinates (0,0,0) of my VR object. (Sorry, I had to zoom out a bit to make the effect more visible.) Since I had already repositioned my VR object several times, my controller model is already offset in the first picture, so don’t wonder about that.


This picture shows you the “original positioning” of the controller vs. the coordinate gizmo; top right corner, Z coordinate = -3.


This picture shows what happens when I drag my VR object along the Z axis/arrow to place it at a new location, with only the Z coordinate changed. You can already see that the controller has a new relative position to the coordinate gizmo in which it is placed.


Here you see the outcome when I just reset the Z coordinate of my VR object by manually typing in “-3”, so it should be back at its original position. As you can see, neither of the two is at its original position. In the case of the coordinate gizmo I can somewhat understand it: it should sit at the center point of the whole group, and since the objects of the group are diverging in some weird way I don’t understand, the new center point might end up at a new position. Even though I don’t agree that this should happen, I could still understand it. However, that the sub-objects start to diverge when their parent object is moved through space makes absolutely no sense to me. And what’s even worse: OK, movement causes divergence… but why on earth is that not undone when I reset the coordinates to their original state?

Don’t forget: the only thing I am interacting with is the parent object. I click on the parent, drag the parent, reset the parent by typing in a coordinate. Nothing else happened in between.

Well, dragging the VR object along the blue (and therefore Z-coordinate-indicating) arrow also changes the X coordinate of the VR object. Resetting would then also mean I’d have to reset the X coordinate. But I still don’t get why this should cause a divergence.


A post I wrote in the past, which some people have found useful: What is Transform.forward? The fundamentals.

OK, I like “auto-magically”, because that’s what it does. But if I understand your conclusion right, that is also what I expect. My screenshots, however, are not showing this behavior. Maybe I should add another one here:


My GameObject’s coordinate system (your ogre) and its sub-component (the hat) are placed at the exact same location when the coordinates of my GameObject are (0,0,0).


Now I only change the X, Y, Z coordinates of my GameObject (the ogre). What I expect is that its hat, which is a sub-component of that GameObject, will be at the exact same new position. I am not changing any relative positions between parent and child.

I’m glad this is only happening to the models of my controllers… at least I hope so… since the controllers will be auto-magically placed in my hands anyway as soon as I hit the Play button. But repositioning an object in the editor view like that is very unpleasant for me.

Just a few thoughts, not sure what may or may not apply:

  • First and foremost: check and double-check that you do not have any (editor) script running that changes the coordinates of any of the involved objects. Especially if this is hooked up to VR controllers where the controller might add some “noise” to the position even while you’re not actively using it. To me this seems the most likely issue.
  • Select an object and press ‘F’ - this resets the zoom and refocuses on the selected object. If you “zoom” with the mouse wheel you actually change the view frustum, and can eventually end up clipping meshes. Not sure whether this would affect the observed behaviour, though.
  • Check whether you have an orthographic or perspective camera. Maybe that messes with your perception? But it shouldn’t make a difference in the coordinates.

The reason I mentioned near the beginning that it might be caused by some scaling factor is that when I reset the scale of the controller model from (3,3,3) to (1,1,1), this effect does not happen - at least not within a larger area. If I cross a certain invisible boundary, the effect happens as well.


Inside that square it’s not happening when the scale of the model is set to (1,1,1).

I hope it’s not related to a VR script, since I am only using the standard XR Interaction asset; I have no script of my own. And none of my other scripts, which are not linked to or in any relation to the controller, manipulate a single coordinate. And while this effect is occurring, the controllers are turned off anyway, so there shouldn’t be any noise from them.