This is a bit of an open-ended question, but there does seem to be a correlation between the WebGL heap size and crashes for a percentage of users. Set it too high and the user gets one error; set it too low and we get another.
Preamble: we have an application which loads assets on the fly (it’s a social virtual world platform). We do a decent job of cycling assets in and out of memory, but WebGL has forced us to be particularly stringent, and even then we’re still running into regular issues (compared to the Web Player, it’s a damned nightmare).
It’d be extremely helpful to have some guidelines on what heap allocations work for most people. We find 250-300 MB is about the maximum we can obtain reliably from browsers. If I could get 600-700 MB, I’d be ecstatic.
Is Unity tracking what heap sizes reliably work? Do any developers have personal insights into the largest heap size you can reliably get away with? (It’s the WebGL Memory Size parameter.)
I doubt this will help, but I’ll share my findings… we are in about the same place. I get flooded with out-of-memory bug reports every day, more often on some browsers than others (Firefox on PC and Chrome on Mac seem to be the worst offenders).
A couple of items to mention beyond what they recommend in their docs:
Make sure you are on the latest official release, and really read through any recursive functions you have. We found that some libraries like SimpleJSON on 5.3.1 would attempt to hog up to 1.7 GB just trying to parse a single JSON array with a single element no longer than 50 characters. I also found the same would happen occasionally with basic functions such as String.Replace.
After upgrading to Unity 5.3.2, without changing any code, these problems immediately went away.
Also, watch your texture and material creation - textures and materials you create (or instance) at runtime don’t get destroyed automatically, so you need to destroy them by hand.
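To illustrate (a rough sketch only, the names are made up for the example): accessing renderer.material gives you an instanced copy that nothing will clean up for you, and the same goes for any Texture2D you build at runtime.

```csharp
using UnityEngine;

// Sketch: textures/materials created or instanced at runtime stay in the
// (limited) WebGL heap until you destroy them explicitly.
public class RuntimeSkin : MonoBehaviour
{
    Texture2D downloadedTex;   // e.g. built from downloaded bytes
    Material instancedMat;     // renderer.material returns an instanced copy

    public void ApplyTexture(Texture2D tex)
    {
        downloadedTex = tex;
        instancedMat = GetComponent<Renderer>().material;
        instancedMat.mainTexture = downloadedTex;
    }

    void OnDestroy()
    {
        // Destroy by hand; destroying the GameObject alone is not enough.
        if (instancedMat != null) Destroy(instancedMat);
        if (downloadedTex != null) Destroy(downloadedTex);
    }
}
```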
Some tips for debugging: enable the dynamic memory option for WebGL. It makes Firefox and Safari run as normal, while Chrome will run absolutely miserably, but when your app requests more “RAM” it will show in the console log how much is being requested. Assuming detailed logs are being dumped to the console, you should have a good idea of where the problems are occurring and how much RAM you need. If you have an on-screen debug mode, you can use something like the following to output some of the memory use, although in my experience it is nowhere near what is actually being used according to the browser:
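(A minimal sketch of that kind of overlay, assuming the UnityEngine.Profiling API that newer versions use; older 5.x builds expose similar uint-returning calls directly on UnityEngine.Profiler.)

```csharp
using UnityEngine;
using UnityEngine.Profiling;

// Rough on-screen readout of what Unity thinks it is using. In my experience
// these numbers undershoot what the browser's task manager reports.
public class MemoryDebugOverlay : MonoBehaviour
{
    void OnGUI()
    {
        GUILayout.Label("Allocated: " + (Profiler.GetTotalAllocatedMemoryLong() / (1024 * 1024)) + " MB");
        GUILayout.Label("Reserved:  " + (Profiler.GetTotalReservedMemoryLong() / (1024 * 1024)) + " MB");
        GUILayout.Label("Mono heap: " + (Profiler.GetMonoHeapSizeLong() / (1024 * 1024)) + " MB");
        GUILayout.Label("Mono used: " + (Profiler.GetMonoUsedSizeLong() / (1024 * 1024)) + " MB");
    }
}
```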
In many versions, trying to debug through the Profiler halts, crashes, or just doesn’t launch. But if you CAN get it to work, you can see that the WebGL build does some very interesting and very different things compared to the Editor, and this goes beyond just regular GC.
The official advice I got from the Unity guys was “write less code” and “remove assets”. However, we’ve actually been able to resolve all the out-of-memory issues we encountered by changing build versions…
Regarding the WebGL memory size, I believe it impacts more than just the heap. To give you an idea, our package typically runs at around 200 MB. However, during decompression it spikes in excess of 600 MB for a period on some browsers, so with a limit of 512 MB it was crashing out. We’ve set the WebGL size to 756 MB… but there is a challenge with this: if the browser is already using a lot of memory for other tabs, if the memory simply isn’t available, or if the user is on a 32-bit browser, 756 MB will generate a lot of out-of-memory errors for the average user, before the content can even load. The only real fix I can see for these problems is dynamic memory, but again, it makes Chrome near impossible to use.
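For reference, that parameter can also be set from an editor script instead of the Player Settings panel; a rough sketch (PlayerSettings.WebGL.memorySize is the 5.x editor API, expressed in MB, and 756 is just the value we currently ship with):

```csharp
using UnityEditor;

// Editor-only sketch: set the WebGL heap ("WebGL Memory Size") from code
// before building, instead of clicking through Player Settings.
public static class WebGLBuildConfig
{
    [MenuItem("Build/Apply WebGL Memory Size")]
    public static void ApplyMemorySize()
    {
        PlayerSettings.WebGL.memorySize = 756; // in MB
    }
}
```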
Ironically, the device we get the fewest complaints about RAM use on is Safari on iPad, although that one runs at about 5 FPS.
Interesting notes! Thanks for providing this - I was just curious about what exactly you mean by “texture and material creation” - would Instantiating objects that use the source object’s material be applicable? Additionally, what method do you use for destroying a Material in that case?
Also interesting to see the “write less code” suggestion…sadly I do have a rather large project, made very quickly, so the code is not very streamlined…
I try to never use more than 512 MB.
Any more than that and you start to have a lot of trouble with Chrome.
Our game runs within 256 MB, and if I allocate exactly 512, Chrome starts to be inconsistent with errors.
Also, yeah, the code itself has its weight in the balance. Try to make functions as generic as possible.
For example, we have a static class named FunctionExt with a lot of generic coroutines for interpolations and other things.
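Roughly along these lines (just a sketch; the real class has many more of these):

```csharp
using System;
using System.Collections;
using UnityEngine;

// Sketch of the idea: one generic interpolation coroutine reused everywhere
// instead of many near-identical copies scattered around the project.
public static class FunctionExt
{
    // Runs from 0 to 1 over "duration" seconds and hands the value to the
    // caller via "apply" (move, fade, scale, whatever).
    public static IEnumerator Interpolate(float duration, Action<float> apply)
    {
        float t = 0f;
        while (t < duration)
        {
            t += Time.deltaTime;
            apply(Mathf.Clamp01(t / duration));
            yield return null;
        }
        apply(1f);
    }
}
```

Then from any MonoBehaviour: StartCoroutine(FunctionExt.Interpolate(0.5f, t => transform.localScale = Vector3.one * t));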
Also try to hunt for optimizations: use the Crunch compression import setting on every texture.
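If you want to force that across the whole project, an AssetPostprocessor can do it on import (sketch only; crunchedCompression is the newer TextureImporter flag, older versions instead pick a DXT1/DXT5 Crunched format per platform):

```csharp
using UnityEditor;

// Editor-only sketch: force Crunch compression on every imported texture.
public class CrunchAllTextures : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;
        importer.crunchedCompression = true;
        importer.compressionQuality = 50; // crunch quality, 0-100
    }
}
```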