Our game has memory problems when run for an extended time, and having profiled it with Xcode's Instruments we suspect a memory leak.
Overall, we have a small front-end scene that loads the game scene for the chosen level. When that level is completed (or the player quits), the front-end scene is reloaded.
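For reference, here is roughly how the hand-off between the two scenes works (the class names, method names, and scene names below are placeholders, not our exact code):

```csharp
using UnityEngine;

// Front-end scene: the player picks a level and we load its scene.
public class FrontEndMenu : MonoBehaviour
{
    public void OnLevelSelected(string levelSceneName)
    {
        Application.LoadLevel(levelSceneName);   // e.g. "Level01", "Level02", ...
    }
}

// Game scene: when the level is completed (or the player quits),
// we reload the front-end scene.
public class GameLevel : MonoBehaviour
{
    void ReturnToFrontEnd()
    {
        Application.LoadLevel("FrontEnd");
    }
}
```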
We assumed this reloading of the front-end scene would destroy all objects from the game scene and effectively leave the device with the same memory footprint as when the front-end was first run.
However, when we use the Xcode Object Allocation tool, we see the net allocation gradually increase each time the front-end scene is reloaded.
Oddly, if we repeatedly load the same scene from the front-end and then return to the front-end, the memory stays roughly consistent. But each time we load a different scene and then return to the front-end, the net allocation has increased, by several hundred kilobytes each time.
Obviously, over a period of playing different levels we eventually get memory errors and the game crashes.
So, our question is: does Unity not actually destroy everything from a prior scene when Application.LoadLevel is called? What can a scene or script do that leaves behind objects which must be manually destroyed? We currently call Application.GarbageCollectUnusedAssets and System.GC.Collect each time we load a game level.
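For completeness, this is a stripped-down version of what we do when loading a game level (the helper class and the exact call order are just for illustration):

```csharp
using UnityEngine;

public static class LevelLoader
{
    public static void LoadGameLevel(string levelSceneName)
    {
        // Ask Unity to unload assets that no longer have any references.
        Application.GarbageCollectUnusedAssets();

        // Also force a collection of the managed (Mono) heap.
        System.GC.Collect();

        Application.LoadLevel(levelSceneName);
    }
}
```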