Loading 500 Textures with UnityWebRequest in WebGL Throws a Memory Allocation Error

Hello. I’m developing an exhibition app with Unity WebGL. I’ve decided to load the images at runtime with UWR (UnityWebRequest) to reduce my first loading time, because my base app is already 70MB. All images are smaller than 2048x2048 px, and their file sizes are 300-500 KB. The total size of the pictures is ~200 MB. Everything looks reasonable given today’s internet speeds. But somehow my browser (Opera) gives a memory allocation error while loading the images into the Unity WebGL app. It says:

WebGL.framework.js:3338 Could not allocate memory: System out of memory!
Trying to allocate: 5805543B with 16 alignment. MemoryLabel: Texture
Allocation happened at: Line:70 in ./Runtime/Utilities/dynamic_array.h
Memory overview
[ ALLOC_TEMP_THREAD ] used: 32768B | peak: 0B | reserved: 4194304B
[ ALLOC_TEMP_JOB_1_FRAME ] used: 0B | peak: 0B | reserved: 262144B
[ ALLOC_TEMP_JOB_2_FRAMES ] used: 0B | peak: 0B | reserved: 262144B
[ ALLOC_TEMP_JOB_4_FRAMES ] used: 0B | peak: 0B | reserved: 1048576B
[ ALLOC_TEMP_JOB_ASYNC ] used: 15440582B | peak: 0B | reserved: 17039360B
[ ALLOC_DEFAULT ] used: 50305651B | peak: 57639027B | reserved: 51652762B
[ ALLOC_GAMEOBJECT ] used: 393738B | peak: 926820B | reserved: 419199B
[ ALLOC_GFX ] used: 2014771894B | peak: 2014771894B | reserved: 2014809667B
[ ALLOC_PROFILER ] used: 866833B | peak: 4453539B | reserved: 970108B

I don’t get what the problem is. These numbers are huge compared to my ~200 MB (204,800 KB) of images. Here is my script:

public IEnumerator GetImage(string mainURL, string imageFileName, System.Action<Texture2D> callback = null)
{
    //print(url + imageFileName);
    using (UnityWebRequest uwr = UnityWebRequestTexture.GetTexture(mainURL + imageFileName))
    {
        yield return uwr.SendWebRequest();

        if (uwr.result == UnityWebRequest.Result.ConnectionError || uwr.result == UnityWebRequest.Result.ProtocolError || uwr.result == UnityWebRequest.Result.DataProcessingError)
        {
            Debug.Log(uwr.error + "\n" + imageFileName + " is missing!");
        }
        else
        {
            // Decode the downloaded image into a Texture2D and hand it to the caller.
            Texture2D texture = DownloadHandlerTexture.GetContent(uwr);
            callback?.Invoke(texture);
        }
    }
    // The using block disposes the request; no explicit Dispose() call is needed.
}

Can anyone explain to me what the problem is? Am I facing some kind of technical limit, or am I doing something wrong?

They require a multiple of their 300-500 KB file size in memory, because the textures are not kept in compressed form: the JPEG/PNG is decoded into raw pixel data when it becomes a Texture2D.
And you cannot clean up that memory without closing the tab (AFAIK).
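As a rough back-of-the-envelope example (my own estimate, not Unity internals): an uncompressed RGBA32 texture costs width × height × 4 bytes, plus about a third more if mipmaps are generated, no matter how small the JPEG/PNG file was:

public static class TextureMemoryEstimate
{
    // Rough estimate of what a decoded texture costs in memory (RGBA32 = 4 bytes per pixel).
    // Illustrative only; the real allocation also depends on format, mipmaps and padding.
    public static long Bytes(int width, int height, bool mipmaps)
    {
        long bytes = (long)width * height * 4;   // raw RGBA32 pixels
        if (mipmaps) bytes += bytes / 3;         // the mip chain adds roughly one third
        return bytes;
    }
}

A single 2048x2048 image is 2048 * 2048 * 4 = 16,777,216 B (~16 MB) uncompressed, so a few hundred textures of that size cannot fit in a 2 GB heap even though the files are only 300-500 KB each.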

You can try with 512x512 images, but I think it’s still too much.
Or try with fewer images and see how much memory is used.

Here is a MemoryStatsPlugin (.jslib); perhaps this helps: Unity Blog
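If you don’t want to add a .jslib, a rough built-in alternative (my sketch, not the plugin from the blog post) is to log Unity’s own allocator numbers with the Profiler API; note the values may only be meaningful in Development Builds:

using UnityEngine;
using UnityEngine.Profiling;

// Logs Unity's allocated/reserved memory every 5 seconds.
public class MemoryLogger : MonoBehaviour
{
    void Start()
    {
        InvokeRepeating(nameof(LogMemory), 0f, 5f);
    }

    void LogMemory()
    {
        Debug.Log($"Allocated: {Profiler.GetTotalAllocatedMemoryLong() / (1024 * 1024)} MB, " +
                  $"Reserved: {Profiler.GetTotalReservedMemoryLong() / (1024 * 1024)} MB");
    }
}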

I also handle image uploads from users, and the only way I found was to resize all textures larger than 512 px,
and hope that there is enough memory available until they close the browser.
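Roughly how I resize after download, as a sketch (the helper and names are mine): let the GPU do the resize with a temporary RenderTexture, read the small result back, and destroy the full-size original:

using UnityEngine;

public static class TextureResizer
{
    // Downscales a texture so neither side exceeds maxSize.
    // Sketch only; error handling and format choices are up to you.
    public static Texture2D Downscale(Texture2D source, int maxSize)
    {
        if (source.width <= maxSize && source.height <= maxSize)
            return source;

        float scale = Mathf.Min((float)maxSize / source.width, (float)maxSize / source.height);
        int width  = Mathf.RoundToInt(source.width * scale);
        int height = Mathf.RoundToInt(source.height * scale);

        RenderTexture rt = RenderTexture.GetTemporary(width, height);
        Graphics.Blit(source, rt);                      // GPU-side resize
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = rt;

        Texture2D result = new Texture2D(width, height, TextureFormat.RGBA32, false);
        result.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        result.Apply();

        RenderTexture.active = previous;
        RenderTexture.ReleaseTemporary(rt);
        Object.Destroy(source);                         // free the full-size texture
        return result;
    }
}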


I’ve tried again to load 500 images and watched Opera’s RAM usage in the Windows Task Manager. I saw that the error appears after RAM usage passes 2 GB. After that I tried to load 300 images, and it completed without error; RAM usage got close to the 2 GB limit but didn’t pass it. Everything was as expected up to that point. But I noticed something new: Opera’s RAM usage dropped from 1.8 GB to 400 MB after the texture downloading and assigning operation finished. From that observation, I can confirm that garbage collection is cleaning some things up.

I’m starting the GetImage routines in a for loop for all my different objects. I think nothing gets cleaned up until every texture has finished loading. My loop function:

public void DownloadLayouts(Hodja[] hodjas, string mainURL)
{
    for (int i = 0; i < hodjas.Length; i++)
    {
        for (int j = 0; j < hodjas[i].finalDocuments.Length; j++)
        {
            string imageFileName = "blablabla.jpg";

            // Copy the loop indices so the lambda captures the current values.
            int si = i;
            int sj = j;
            StartCoroutine(GetImage(mainURL, imageFileName, (sImage) =>
            {
                hodjas[si].finalDocuments[sj] = sImage;
            }));
        }
    }
}

I don’t know how to make this garbage-collection friendly. Any opinions?
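For illustration only, a sequential variant of the same loop (a sketch, reusing the same GetImage), so that only one request is in flight at a time instead of hundreds of coroutines at once:

public IEnumerator DownloadLayoutsSequential(Hodja[] hodjas, string mainURL)
{
    for (int i = 0; i < hodjas.Length; i++)
    {
        for (int j = 0; j < hodjas[i].finalDocuments.Length; j++)
        {
            string imageFileName = "blablabla.jpg";

            int si = i;
            int sj = j;
            // Yielding on the nested coroutine serializes the downloads.
            yield return StartCoroutine(GetImage(mainURL, imageFileName, (sImage) =>
            {
                hodjas[si].finalDocuments[sj] = sImage;
            }));
        }
    }
}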


When requesting an image, you can set it as non-readable.
If it’s readable, Unity keeps a copy of the pixel data in CPU memory in addition to the GPU copy.
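i.e. use the GetTexture overload that takes a nonReadable flag. A minimal sketch of the same GetImage with that change:

public IEnumerator GetImage(string mainURL, string imageFileName, System.Action<Texture2D> callback = null)
{
    // nonReadable: true tells Unity not to keep a CPU-side copy of the pixels.
    using (UnityWebRequest uwr = UnityWebRequestTexture.GetTexture(mainURL + imageFileName, nonReadable: true))
    {
        yield return uwr.SendWebRequest();

        if (uwr.result != UnityWebRequest.Result.Success)
        {
            Debug.Log(uwr.error + "\n" + imageFileName + " is missing!");
        }
        else
        {
            callback?.Invoke(DownloadHandlerTexture.GetContent(uwr));
        }
    }
}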


That has solved all my problems :smile:. Really, thank you! :slight_smile:


Good to note here: Unity’s WebAssembly builds currently have a 2 GB memory limit. For Unity 2021.2 it is planned to allow bumping this up to 4 GB, but currently only Chromium-based browsers will support that 4 GB limit.