I’m trying to make a 2D Mario Galaxy-style platformer where there are a ton of small planets and the player falls toward the nearest planet.
What I have now simply checks the player’s distance to every planet object, selects the closest one, and sets the player’s gravity to pull toward that planet’s center.
This should work, but for planets with large radii the player can almost never get close enough to them, so I want to subtract the planet’s radius from the distance.
I’ve tried accessing CapsuleCollider.radius, but it always returns 0.5, and transform.localScale.x / 2.0 doesn’t return the right value either.
By default, a CapsuleCollider's radius is 0.5 and its height is 2, and those values stay fixed unless you change them yourself.
The collider also scales with the object: its radius and height are expressed in local space, so your object's scale determines the collider's real-world size. Whether your object's scale is 1x1x1 or 10x10x10, the radius property still reads 0.5, even though the collider covers very different amounts of space.
You've probably got the right idea using localScale instead, assuming you're using localScale to set the size of your planet(s): the real-world radius is the collider's local radius multiplied by the object's scale.
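As a minimal sketch of that multiplication, assuming uniform scaling and a SphereCollider on the planet (`PlanetRadius` is a hypothetical helper name, not a Unity API):

```csharp
using UnityEngine;

// Sketch (untested): world-space radius of a scaled planet.
// collider.radius is in local space (0.5 by default), so we
// multiply by the world scale to get the real size. Assumes
// the planet is scaled uniformly on all axes.
float PlanetRadius(Transform planet)
{
    var sphere = planet.GetComponent<SphereCollider>();
    return sphere.radius * planet.lossyScale.x;
}
```

Using lossyScale rather than localScale means this still works if the planet is nested under a scaled parent.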
As an alternative, consider adding a child GameObject with its own capsule (or sphere?) collider, and use that one for your collision check instead.
Or, since you're working with spherical areas for planets anyway, consider skipping colliders and using a simple distance check instead: compare the player's distance to each planet's center minus that planet's radius. Might be easier.
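That distance check could look something like this sketch. It assumes a hypothetical `Planet` component that stores its own world-space `radius` field; the names are illustrative, not from your project:

```csharp
using UnityEngine;

// Sketch: pick the planet whose *surface* (not center) is closest.
Planet NearestPlanet(Vector2 playerPos, Planet[] planets)
{
    Planet best = null;
    float bestDist = float.MaxValue;
    foreach (var p in planets)
    {
        // Subtract the radius so large planets don't lose out
        // just because their centers are far from the player.
        float surfaceDist =
            Vector2.Distance(playerPos, p.transform.position) - p.radius;
        if (surfaceDist < bestDist)
        {
            bestDist = surfaceDist;
            best = p;
        }
    }
    return best;
}
```

Then point the player's gravity at `NearestPlanet(...).transform.position`, exactly as you do now — only the selection changes.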