WWW Android Memory Leak?

Hello,

I am developing a Gear VR application that must download a number of large video files. I’m finding that after downloading roughly six or more ~100 MB files, the app behaves as if it is running out of memory: it fails to load assets or crashes on scene loads. If the app is restarted, it does not download the files again (it finds them in persistentDataPath) and behaves as expected. I have restructured the code several times, variously wrapping the WWW in using() blocks, manually calling www.Dispose(), and aggressively calling System.GC.Collect(), but none of that had any effect on the behaviour.

It’s worth noting that this problem only occurs on the Note 4; S6s proceed without issue, presumably because they have enough spare memory to operate correctly even with the leak. It’s also possible that WWW is not at fault and something else (maybe System.IO.FileStream) is the root of the problem.

Also of note: the same problem was observed in a previous version of the app that used WWW to transfer the files out of StreamingAssets via a jar:// URI, rather than fetching the files from an external server.
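
For reference, that earlier version copied out of StreamingAssets with something roughly like the sketch below (not my exact code; the jar:file:// + !/assets/ path format is the documented way to reach StreamingAssets inside an Android APK, and fileName is a placeholder):

    // Rough sketch of the old StreamingAssets path. On Android, StreamingAssets
    // lives inside the APK, so it has to be read through WWW with a jar:file://
    // URI rather than System.IO.
    private IEnumerator CopyFromStreamingAssets(string fileName)
    {
        string srcPath = "jar:file://" + Application.dataPath + "!/assets/" + fileName;
        WWW www = new WWW(srcPath);
        yield return www;

        if (string.IsNullOrEmpty(www.error))
        {
            System.IO.File.WriteAllBytes(
                Application.persistentDataPath + "/" + fileName, www.bytes);
        }
        www.Dispose();
    }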

I’m using Unity 5.2.2p4. I’ve found mentions of a WWW memory leak on iOS, but that was presumably patched several versions ago, and it’s an entirely different platform in any case.

Code follows, trimmed for clarity:

    private void Start()
    {
        StartCoroutine(AllDownloads());
    }

    private IEnumerator AllDownloads()
    {
        foreach (VideoInfo video in mVideos)
        {
            yield return StartCoroutine(Download(video));
        }
    }

    private IEnumerator Download(VideoInfo video)
    {
        string writePath = Application.persistentDataPath + "/" + video.Name;

        WWW www = new WWW(Host + video.Name);
        while (!www.isDone)
        {
            yield return new WaitForSeconds(0.1f);
        }

        if (!string.IsNullOrEmpty(www.error))
        {
            //Error Handling Code -- snipped - Loads a different scene, destroying this object.
            yield break;
        }

        System.IO.FileStream file = System.IO.File.Create(writePath);
        int written = 0;
        byte[] bytes = www.bytes;
        while (written < www.bytesDownloaded)
        {
            try
            {
                file.Write(bytes, written, Mathf.Min(CHUNK_SIZE, www.bytesDownloaded - written));
            }
            catch (System.Exception)
            {
                // Close the file so the handle isn't leaked on the early-out path.
                file.Close();
                //Error Handling Code -- snipped - Loads a different scene, destroying this object.
                yield break;
            }
            
            written += CHUNK_SIZE;
            yield return null;
        }
        www.Dispose();
        www = null;
        file.Close();
        file.Dispose();
        file = null;
        bytes = null;
    }

It seems I was too quick to place the blame on WWW. This problem was actually a quirk of the Garbage Collector.

Memory profiling revealed that, as I expected, each file download was accompanied by a 100 MB jump in Mono’s used and reserved memory that was never released, causing significant performance issues and scene/texture load failures once it exceeded about 500 MB.

The interesting part is that the 100 MB jumps didn’t happen during the download or during the file write operation, but on this line:

    byte[] bytes = www.bytes;

Whatever WWW uses for internal storage is invisible to the profiler, but the bytes accessor presumably copies it out into a byte array in managed (Mono) memory. That massive byte array is the real problem.
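
As a quick sanity check without the full profiler, sampling the managed heap around the accessor should show the jump. This is just an illustrative snippet (System.GC.GetTotalMemory is only approximate):

    // Rough illustration: measure managed heap growth caused by www.bytes.
    long before = System.GC.GetTotalMemory(false);
    byte[] bytes = www.bytes; // the copy into managed memory happens here
    long after = System.GC.GetTotalMemory(false);
    Debug.Log("www.bytes allocated ~" + ((after - before) / (1024 * 1024)) + " MB");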

It turns out I’m bumping into something called the Large Object Heap (or Large Object Space, in Mono terms). Allocations above a certain size threshold are placed in this special region, which the garbage collector doesn’t compact or treat like the normal heap, and since Mono never returns its reserved heap to the OS, fringe cases like mine (repeated ~100 MB arrays) can be extremely inefficient.
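
The pattern should be reproducible without WWW at all. This is an illustrative sketch, not something from my app (Profiler.GetMonoHeapSize reports the reserved heap, and as far as I know it only returns real numbers in development builds):

    // Hypothetical repro: allocate and drop several ~100 MB arrays in a row.
    // With a non-compacting GC, the reserved Mono heap tends to keep growing
    // even though each array is unreachable after its iteration.
    private IEnumerator LargeAllocRepro()
    {
        for (int i = 0; i < 6; i++)
        {
            byte[] big = new byte[100 * 1024 * 1024]; // lands in the large object space
            big = null;                               // drop the only reference
            yield return null;
            Debug.Log("Reserved Mono heap after iteration " + i + ": "
                + (Profiler.GetMonoHeapSize() / (1024 * 1024)) + " MB");
        }
    }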

There’s not really a “solution” that doesn’t involve dodging the WWW class entirely: the ideal would be to write the bytes to file in small chunks as they’re downloaded, but WWW only exposes a monolithic byte array once the whole download is complete. However, by adjusting the end of the file write loop like so:

    bytes = null;
    System.GC.Collect();

I can get the Garbage Collector to behave well enough that the wasted memory is kept below 350 MB or so, which lets the rest of the app function normally.
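
For anyone who can afford to dodge WWW entirely, a chunked download via System.Net avoids the giant managed array altogether. This is an untested sketch rather than what my app does; the calls are blocking, so it would have to run on a background thread rather than the main Unity thread:

    // Hypothetical alternative: stream straight to disk in small chunks with
    // HttpWebRequest, so no full-file byte[] ever exists in managed memory.
    // Blocking I/O -- run on a background thread, not the main Unity thread.
    private static void DownloadToFile(string url, string writePath)
    {
        System.Net.HttpWebRequest request =
            (System.Net.HttpWebRequest)System.Net.WebRequest.Create(url);

        using (System.Net.WebResponse response = request.GetResponse())
        using (System.IO.Stream source = response.GetResponseStream())
        using (System.IO.FileStream file = System.IO.File.Create(writePath))
        {
            byte[] buffer = new byte[64 * 1024]; // small, reusable chunk buffer
            int read;
            while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
            {
                file.Write(buffer, 0, read);
            }
        }
    }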