In a script I have to “parent” and “un-parent” GameObjects in a scene while changing their position and rotation. Currently my code looks something like this:
transform.parent = null;                  // detach from any parent
transform.position = whateverVector3;     // set world-space position
transform.rotation = whateverQuaternion;  // set world-space rotation
transform.parent = whateverTransform;     // attach to the new parent
So I’m clearing the parent so the object is no longer a child of any Transform, then setting the position and rotation, and then parenting the object to whatever new Transform I want. I do it in that order so that I don’t get unexpected scales when the object becomes a child of Transforms that have different scales.
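Wrapped up as a helper, the whole sequence looks roughly like this (a minimal sketch; the class name, method name, and parameter names are just placeholders for illustration):

using UnityEngine;

public class Reparenter : MonoBehaviour
{
    // Detach, set the world-space pose, then attach to the new parent.
    public void ReparentAt(Transform newParent, Vector3 worldPos, Quaternion worldRot)
    {
        transform.parent = null;       // no longer a child of any Transform
        transform.position = worldPos; // world-space position
        transform.rotation = worldRot; // world-space rotation
        transform.parent = newParent;  // child of the new Transform
    }
}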
That works fine, but here’s the problem. When my GameObjects get parented to certain other GameObjects, they will sometimes (not always) end up with a much larger scale than they should have. It’s hard to pin down because the scale shown in the inspector is what it should be.
In other words, let’s say the transform should have a scale of 1. While the game is running, the Transform component in the inspector says that the scale is in fact 1. But it’s not. The interesting thing is that when I scale the object manually in the scene, I can drag the scale handle the slightest amount and the actual in-scene size of the object goes back to what it should be - in this case a scale of 1. So I went into my script and added this line of code after the position and rotation are set, and before the parent is set:
transform.localScale = transform.localScale; // self-assignment - somehow fixes the rendered size
And now it works flawlessly. So what I would like to know is: what is going on? Has anyone seen this before? Why does the object’s in-scene size differ from the actual scale until you set the scale to something - even its own value? I’m guessing there’s some internal function that applies the scale values to the object’s in-scene size, and that this function somehow isn’t getting called? Anyway, I’d like to know what’s going on.
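For reference, the working sequence now looks like this; the last line is just a sanity check one could add (it’s not part of my actual code) to compare localScale against lossyScale, the transform’s actual world scale:

transform.parent = null;                      // detach
transform.position = whateverVector3;         // world position
transform.rotation = whateverQuaternion;      // world rotation
transform.localScale = transform.localScale;  // the mysterious fix
transform.parent = whateverTransform;         // reattach
Debug.Log("local: " + transform.localScale + ", world: " + transform.lossyScale); // sanity check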
Hope this all makes sense.