Hi,
We are currently developing a game for iOS/Android and are experiencing memory-related crashes with our builds on 256 MB Apple devices (the iPod Touch 4 being particularly bad). We are also finding that some iOS versions crash much more readily than others (iOS 6.0+ being the worst offender).
The crash occurs after 15–20 minutes of play and is highly dependent on the number of scenes that we traverse/load in a session. Naturally, we have profiled extensively using the Unity Profiler (on the device) and have found no significant memory leaks. Using the Xcode profiler, however, we do see that the number of allocations made by the game increases markedly with each new scene that we load. When our game is killed by the OS we have approximately 160K 16-byte allocations and 110K 32-byte allocations, and each scene load increases each of these counts by approximately 10,000.
When looking at the Call Tree in the ‘Allocations’ instrument, we find that two-thirds of these small 16/32-byte blocks are coming from within Unity, under a function called ‘CreateMonoScriptCache’. This leads us to believe that Unity is caching script data/metadata/etc. but never releasing it. Scenes containing more scripts do seem to increase the number of allocations per scene load proportionally.
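In case we are simply missing a cleanup step, our scene transitions are roughly equivalent to the following (a simplified sketch; the coroutine and scene names are illustrative, not our actual code):

```csharp
using System.Collections;
using UnityEngine;

public class SceneSwitcher : MonoBehaviour
{
    // Illustrative transition: load the next scene, then try to release
    // anything the previous scene left behind.
    public IEnumerator SwitchTo(string sceneName)
    {
        Application.LoadLevel(sceneName);      // synchronous load (Unity 4.x API)
        yield return Resources.UnloadUnusedAssets(); // drop assets no longer referenced
        System.GC.Collect();                   // prompt Mono to collect managed garbage
    }
}
```

Even with `Resources.UnloadUnusedAssets()` and an explicit `GC.Collect()` after every load, the 16/32-byte allocation counts in Instruments keep climbing.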
Does this sound like expected behaviour from the Unity engine, or are we doing something wrong that may be preventing this cache from being cleared?
Many Thanks,
Adam
Unity version : 4.1.0f4
Xcode : 4.6.2