Out of memory and memory leaks when creating texture atlases

I’ve hit a wall because of massive memory leaks in Unity.

I’m creating texture atlases from Texture2Ds. My textures are imported as RGBA-32 uncompressed, I have 147 images which all told take around 550 MB in the library folder. While creating texture atlases from these source textures, Unity’s memory usage will grow from about 600 MB to 1.6GB on the first run, then bounce between about 1.8GB and 2.6GB on subsequent runs. After about 5-10 runs of the atlas generator, Unity will always crash with the error shown at the bottom of this post.

I am specifically taking these steps to AVOID memory leaks, but it's obviously not working:

  • The source images are stored in the ScriptableObject as GUID strings, not Texture2D Object references, so the textures are not loaded automatically when the data object is loaded.
  • During atlas creation, textures are loaded one at a time through AssetDatabase.GUIDToAssetPath and AssetDatabase.LoadAssetAtPath. I specifically did this so garbage collection would be allowed to dump the source images from memory immediately.
  • I never store any source images in variables that persist, only local variables.
  • I never store any images in static variables. (I’m aware Unity doesn’t dump these.)
  • When I create the atlas textures, I first create a new Texture2D() at the right size, load one source image at a time, and write its pixels into the atlas with texture.SetPixels(). I then encode the atlas with texture.EncodeToPNG(), destroy the temporary texture with Object.DestroyImmediate(atlasTexture), write the PNG bytes to disk with a BinaryWriter, and import the file with AssetDatabase.ImportAsset().
  • I am explicitly calling System.GC.Collect() after every run of the atlas generator, but it didn’t seem to change anything.
  • Resources.UnloadUnusedAssets() also does not reduce the memory footprint.
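The per-atlas flow described in the bullets above can be sketched like this (a minimal sketch; the class and method names are hypothetical, but the API calls are the ones named in the list):

```csharp
using System.IO;
using UnityEngine;
using UnityEditor;

static class AtlasWriter
{
    // Hypothetical sketch: build the PNG, destroy the temporary texture,
    // write the bytes to disk, then import the file as an asset.
    public static void WriteAtlas(Texture2D atlasTexture, string assetPath)
    {
        byte[] png = atlasTexture.EncodeToPNG();
        Object.DestroyImmediate(atlasTexture); // free the temporary texture
        using (BinaryWriter w = new BinaryWriter(File.Open(assetPath, FileMode.Create)))
            w.Write(png);
        AssetDatabase.ImportAsset(assetPath);
    }
}
```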

It seems to me that garbage collection is simply not doing its job; otherwise the memory for these source images would be freed up. Should I be explicitly destroying arrays like the pixel Color arrays? They only ever live in local variables, so they should be collected automatically.

Here are some before/after pairs from Profiler.usedHeapSize, measured before and after each atlasing run.

  • 358 MB, 335 MB
  • 361 MB, 334 MB
  • 359 MB, 334 MB
  • 358 MB, 334 MB
  • 359 MB, 334 MB
  • 358 MB, 334 MB
  • CRASH

As you can see, Unity reports the heap size as fairly stable. The process's memory footprint, however, is not stable at all: the heap peaks around 650 MB during processing, yet Unity itself climbs to 2.6 GB after starting at only 600 MB.

I did more testing using the Profiler, so now I can see it in concrete numbers. Column B, Mono Heap Size, grows with every run until it tops out around 2 GB, then Unity crashes. What could be causing this? Column C, Mono Used Size, also grows a bit. Is this just going to end up being a problem with Mono, meaning no solution? Oh boy. >_<

Run           A      B      C      D      E      F             
1    Before   358    4      3      358    428    70
     After    336    931    582    336    428    92
                            
2    Before   359    931    353    359    428    69
     After    335    1188   510    335    428    93
                            
3    Before   361    1188   459    361    444    82
     After    334    1444   458    334    444    109
                            
4    Before   358    1444   345    358    444    85
     After    334    1701   879    334    444    109
                            
5    Before   326    1701   511    362    444    81
     After    334    1957   586    334    444    109
                            
6    Before   359    1957   380    359    444    85
     After    334    1957   586    334    444    109
                            
7    Before   360    1957   398    360    444    83
     After    334    1957   586    334    444    109
                            
8    Before   362    1957   535    362    444    81
     After    334    1957   586    334    444    109
                            
9    CRASH                        

All numbers in MB

Columns:
A = Used Heap Size
B = Mono Heap Size
C = Mono Used Size
D = Total Allocated Memory
E = Total Reserved Memory
F = Total Unused Reserve Memory
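For reference, the six columns above can be sampled with the legacy UnityEngine.Profiler API (a sketch, assuming the static getters available in Unity of this era):

```csharp
using UnityEngine;

static class MemoryLog
{
    // Logs the same six columns as the table above, converted to MB.
    public static void Dump(string label)
    {
        const float MB = 1024f * 1024f;
        Debug.Log(string.Format(
            "{0}: A heap {1:F0} | B mono heap {2:F0} | C mono used {3:F0} | " +
            "D allocated {4:F0} | E reserved {5:F0} | F unused reserve {6:F0} (MB)",
            label,
            Profiler.usedHeapSize / MB,
            Profiler.GetMonoHeapSize() / MB,
            Profiler.GetMonoUsedSize() / MB,
            Profiler.GetTotalAllocatedMemory() / MB,
            Profiler.GetTotalReservedMemory() / MB,
            Profiler.GetTotalUnusedReservedMemory() / MB));
    }
}
```

Calling MemoryLog.Dump("Before") and MemoryLog.Dump("After") around each run reproduces the rows of the table.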

I’m talking to myself again. This happens a lot these days on this forum.

Well, I found the source of my issue. When the Mono heap grows, you can be pretty sure it's from allocations your own code is making, not internal Unity allocations. I traced my explosive growth to this one bit of code in the atlas maker.

// Create new texture for atlas
Texture2D newTex = new Texture2D(sizeX, sizeY, TextureFormat.ARGB32, atlasMaker.db.useMipMaps);
newTex.hideFlags = HideFlags.DontSave;

// Clear the new texture first before we write the frames to the atlas because new Texture2D() creates an opaque white texture
Color[] clearPixels = new Color[sizeX * sizeY]; // THIS ALLOCATION IS THE KILLER
for(int i = 0; i < clearPixels.Length; i++) {
    clearPixels[i] = Color.clear;
}

newTex.SetPixels(clearPixels);

My atlases can be up to 4096x4096, so this is an allocation of 16,777,216 Color structs. At 16 bytes per Color (4x 32-bit floats), that's 256 MB of memory plus overhead. Obviously it isn't that simple: it's clearly taking almost 4 times that much memory. Regardless, I can't imagine why this should cause 931 MB of allocations on the 1st run, then progressively bloat up to 1957 MB by the 5th run, especially since I explicitly called GC.Collect() after each run. In my mind, there has to be a problem under the hood with Mono or Unity.
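For scale, the back-of-envelope arithmetic (plain C#, no Unity needed):

```csharp
// Cost of one full-size clear buffer for a 4096x4096 atlas.
int pixels = 4096 * 4096;               // 16,777,216 Color structs
int bytesPerColor = 4 * sizeof(float);  // UnityEngine.Color = 4 floats = 16 bytes
long megabytes = (long)pixels * bytesPerColor / (1024 * 1024);
// megabytes == 256, allocated fresh on every run -- and the Mono heap in
// Unity of this era grows to hold its peak usage but rarely shrinks back.
```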

This problem isn't exclusive to arrays you allocate yourself. For example, texture.GetPixels() makes a huge allocation which causes exactly the same memory problem. That makes it pretty much impossible to edit textures over a certain size within Unity.

Okay, so I know there's a problem under the hood. In my case, because all I was doing was clearing the texture, the “solution” (workaround) was simple. I could have used newTex.SetPixel() on each pixel, but the manual says that is slower than SetPixels(). So I compromised: I created a small static 64x64 block of clear pixels and stamp it over the image one block at a time using SetPixels(x, y, width, height, pixels). After this change, the Mono heap never got larger than about 72 MB.
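The stamping workaround can be sketched like this (a sketch, not the original code; it assumes the texture dimensions are multiples of the block size, which holds for power-of-two atlases):

```csharp
using UnityEngine;

static class TextureClear
{
    const int BlockSize = 64;
    static Color[] clearBlock; // allocated once, reused across runs

    // Clears a texture by stamping a small reusable 64x64 block over it,
    // instead of allocating one Color[] the size of the whole atlas.
    public static void Clear(Texture2D tex)
    {
        if (clearBlock == null)
        {
            clearBlock = new Color[BlockSize * BlockSize];
            for (int i = 0; i < clearBlock.Length; i++)
                clearBlock[i] = Color.clear;
        }
        for (int y = 0; y < tex.height; y += BlockSize)
            for (int x = 0; x < tex.width; x += BlockSize)
                tex.SetPixels(x, y, BlockSize, BlockSize, clearBlock);
        tex.Apply();
    }
}
```

The peak managed allocation drops from one atlas-sized array (256 MB at 4096x4096) to a single 64 KB block that lives for the whole session.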

Everything’s good now right? If only life were that simple.

Now I’m running into a new out-of-memory crash scenario, but this time nothing shows up in the Profiler as suspicious or growing, and it has nothing to do with Mono allocations. Rather, it's internal Unity allocations crashing when loading Texture2Ds. After some runs of the atlas maker, doing things that cause Unity to load large Texture2Ds (AssetDatabase.LoadAssetAtPath()) makes Unity crash and burn. Seeing as this is internal to Unity and nothing to do with Mono, I am extremely doubtful there is any solution/workaround. The weirdest thing is that Unity’s memory footprint stays below 2GB through all of this – far below the roughly 3.5GB effective limit of a 32-bit program.

I should reiterate that I never keep references to more than a few frames at a time in memory. This is not simply a case of trying to load too much data at once.

The bolded text in the editor log above shows where the memory allocations are coming from:
C:/BuildAgent/work/d3d49558e4d408f4/Runtime/Graphics/Texture2D.cpp
UnityEditor.AssetDatabase:LoadAssetAtPath(String, Type)

Sounds like you need to submit a bug report.

I would, but it's complicated to reproduce and the project required to show it is quite huge. Maybe I’ll submit it.

Anyway, I found a workaround. I determined the crash was coming from loading a number of large 4096x4096 textures at once, which were stored as object references in the ScriptableObject. When the object was loaded, its child textures were loaded too, even though I didn’t actually need access to those textures at the time. There were 8 of these textures in the object, compressed at about 21MB each for a total of 170.4MB. I wouldn’t think 170MB of textures would be enough to cause a problem, but it seems it is, at least after all the atlas processing beforehand. (You have to switch loading between two objects several times to cause the crash, so maybe it has something to do with the way Unity does or doesn't immediately unload the texture data when the object isn’t needed anymore.) Anyway, my workaround was to store those textures as GUID strings in the object instead, so they’re never loaded when the main object is loaded, only on demand with AssetDatabase.LoadAssetAtPath. Now it seems stable. Just another thing to watch out for when dealing with large textures, I guess.
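The GUID-string pattern, sketched (class and field names are hypothetical; the AssetDatabase calls are the ones named above):

```csharp
using UnityEngine;
using UnityEditor;

// Instead of `public Texture2D[] atlases;` -- which forces every texture to
// load whenever the ScriptableObject loads -- store GUID strings and
// resolve a texture only when it is actually needed.
public class AtlasData : ScriptableObject
{
    public string[] atlasGuids; // hypothetical field replacing Texture2D refs

    public Texture2D LoadAtlas(int index)
    {
        string path = AssetDatabase.GUIDToAssetPath(atlasGuids[index]);
        return AssetDatabase.LoadAssetAtPath(path, typeof(Texture2D)) as Texture2D;
    }
}
```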

I have the same issue, but another Unity member says to reduce your model vertices or high-res textures so it works, or to wait for Unity 5, because it's 64-bit and gives your game more memory.

I just had a go at this: I packed 16 textures with the wrong settings at 8096 size and crashed a couple of times. In another thread Aras says the function attempts many runs with different sizes until it fits, so perhaps that is what's using the memory.

If makeNoLongerReadable is true then the texture will be marked as no longer readable and memory will be freed after uploading to the GPU. By default makeNoLongerReadable is set to false.
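That parameter belongs to Texture2D.Apply. Assuming you no longer need CPU-side access to the atlas after building it, the call would look like this:

```csharp
// Upload to the GPU and release the CPU-side pixel copy. The texture can
// no longer be read back afterwards (GetPixels/EncodeToPNG will fail),
// so only do this after the PNG has already been written out.
atlasTexture.Apply(false /* updateMipmaps */, true /* makeNoLongerReadable */);
```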