At the time of Unity 5.0 betas I did a stress test:
There were 441 objects (houses) in the scene, with 121 colliders each (box colliders and simple mesh colliders), so there were over 50,000 colliders in total. Adding a new rigidbody to that scene at runtime caused short (0.5-1 sec) but noticeable CPU spikes / FPS drops.
The solution I found was to bake the colliders of individual houses together, so that (441 * 121) box and simple mesh colliders turned into 441 complex mesh colliders.
The conclusion was: the number of individual colliders in the scene matters; the complexity of each collider, not so much.
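For reference, a minimal sketch of what baking one house's colliders together could look like. It assumes the render meshes approximate the collision geometry closely enough; the class and method names (ColliderBaker, BakeHouseCollider) are just for illustration:

```csharp
using UnityEngine;

// Rough sketch of the "bake per house" idea: merge all child meshes of one
// house into a single mesh and use it as one MeshCollider, replacing the
// ~121 individual colliders the house had before.
public static class ColliderBaker
{
    public static void BakeHouseCollider(GameObject house)
    {
        MeshFilter[] filters = house.GetComponentsInChildren<MeshFilter>();
        var combine = new CombineInstance[filters.Length];

        for (int i = 0; i < filters.Length; i++)
        {
            combine[i].mesh = filters[i].sharedMesh;
            // Bring each child mesh into the house's local space.
            combine[i].transform = house.transform.worldToLocalMatrix
                                   * filters[i].transform.localToWorldMatrix;
        }

        // Remove the many small colliders...
        foreach (Collider c in house.GetComponentsInChildren<Collider>())
            Object.Destroy(c);

        // ...and replace them with a single combined mesh collider.
        var baked = new Mesh();
        baked.CombineMeshes(combine);
        house.AddComponent<MeshCollider>().sharedMesh = baked;
    }
}
```

Keep in mind that a very detailed house can hit the default 65k-vertex-per-mesh limit, and a non-convex MeshCollider like this only works for static geometry, not for objects driven by a non-kinematic rigidbody.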
From experience with world scales where 1 unit = 1 meter, you should reset your origin before anything gets about 10,000 units away from it. That's roughly where the numbers begin to crap out at a noticeable scale.
Precision will start to show issues at that point in a typical third-person or first-person game.
edit: just noticed the other post mentioning it, sorry
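One common way to do that origin reset is a "floating origin": once the tracked player drifts past some threshold, shift every root object back so the player sits near (0, 0, 0) again. A minimal sketch, assuming everything that matters lives under scene-root objects (a real project would also need to handle particle systems, rigidbody interpolation, trails, NavMesh data, etc.):

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Minimal floating-origin sketch: when the player gets too far from the
// world origin, translate every root object by the opposite offset so the
// player ends up back near the origin and precision stays high around them.
public class FloatingOrigin : MonoBehaviour
{
    public Transform player;          // object whose distance from origin we watch
    public float threshold = 5000f;   // shift well before precision gets noticeable

    void LateUpdate()
    {
        Vector3 offset = player.position;
        if (offset.magnitude < threshold)
            return;

        foreach (GameObject root in SceneManager.GetActiveScene().GetRootGameObjects())
            root.transform.position -= offset;
    }
}
```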
Yeah, 1u = 1m, always. Unless it’s a project with “special needs”.
Math, in this context, not being my strong suit, I don't really understand how much relative divergence physics would show at 10,000 units.
So it's good to have some general guidelines; those, combined with a few nights of failure (i.e. testing), should be sufficient to avoid most pitfalls.
I wonder how they deal with this in games like “Just Cause 3”, with a map size of ~640,000 meters… not that my ambitions are anywhere near that number.
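The divergence can actually be put in numbers: a 32-bit float has roughly 7 significant digits, so the spacing between representable positions grows with the distance from the origin. A small stand-alone C# snippet (plain .NET, nothing Unity-specific) that prints that spacing:

```csharp
using System;

class FloatPrecisionDemo
{
    // Distance from x to the next representable float above it,
    // found by bumping the raw bit pattern by one.
    static float Ulp(float x)
    {
        int bits = BitConverter.ToInt32(BitConverter.GetBytes(x), 0);
        return BitConverter.ToSingle(BitConverter.GetBytes(bits + 1), 0) - x;
    }

    static void Main()
    {
        // Smallest representable position step at various distances from the origin.
        foreach (float d in new[] { 1f, 100f, 10000f, 100000f, 640000f })
            Console.WriteLine($"at {d,8:F0} units: smallest step = {Ulp(d):E3} units");
    }
}
```

At 1 unit the step is about 0.0000001 units, at 10,000 units it is already about 1 mm, and at 640,000 units about 6 cm, which is why positions and physics start to visibly jitter that far from the origin.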