We’ve recently had to convert a rather large project to “retina quality”, which meant bumping the max sprite atlas size from 2048 to 4096. Using 2D Toolkit, this wasn’t a huge deal; it just took a bit of time. For reference, think a Talking Tom-style game, with hundreds of large frames of animation built into sprite sheets, and each animation loaded and unloaded from memory on the fly. Tap his belly, load the tickle animation, play it, swap back to the always-loaded idle animation, and unload the tickle anim. Memory use is totally fine with this approach, and surprisingly, load times for each anim are more than acceptable.
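To make that load/play/unload cycle concrete, here’s a stripped-down sketch of the pattern. The real thing goes through 2D Toolkit’s animator and waits on its animation-complete callback, so treat the resource path, the fixed delay, and the class/field names below as placeholders:

```csharp
using System.Collections;
using UnityEngine;

// Illustrative only: loads a one-off animation prefab (and its atlas) on
// demand, plays it, then drops back to the always-resident idle animation
// and frees the one-off atlas. "Anims/TickleAnim" is a placeholder path.
public class OneShotAnimLoader : MonoBehaviour
{
    public GameObject idleAnim;   // always loaded

    public void PlayTickle()
    {
        StartCoroutine(PlayThenUnload("Anims/TickleAnim", 2.0f));
    }

    IEnumerator PlayThenUnload(string resourcePath, float duration)
    {
        // Load the prefab (which drags its sprite atlas into memory).
        GameObject prefab = Resources.Load<GameObject>(resourcePath);
        GameObject instance = Instantiate(prefab);
        idleAnim.SetActive(false);

        // The real project waits on the animation-complete callback;
        // a fixed delay keeps this sketch short.
        yield return new WaitForSeconds(duration);

        // Swap the idle back in and release the one-off atlas textures.
        idleAnim.SetActive(true);
        Destroy(instance);
        yield return Resources.UnloadUnusedAssets();
    }
}
```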
However, that was before retina came into the picture. There are approximately 75 of these 4096x4096 textures in the project. Using the Unity Cloud Build service, we’ve been able to compress the atlases over the course of 8-9 hours of build time, but at the end of the build process it crashes with the old “out of memory” error.
I can understand how that might be possible, but at the same time, why is Unity ever trying to load multiple atlases into memory during the build process? We had to take some careful steps just to ensure we could build these sprite atlases without the editor itself running out of memory, but we never envisioned that the build process would hit the same problem.
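For context, those editor-side steps mostly boil down to forcing Unity to release memory between atlas rebuilds. This is just an illustration of that kind of housekeeping, not our exact script, and the menu path is made up:

```csharp
using UnityEditor;

// Illustrative editor helper (must live under an Editor folder): forces
// Unity to drop unused assets and run a GC pass so the editor doesn't
// accumulate dozens of 4096x4096 textures while rebuilding atlases.
public static class AtlasMemoryHousekeeping
{
    [MenuItem("Tools/Force Unload Unused Assets")]  // arbitrary menu path
    public static void ForceUnload()
    {
        EditorUtility.UnloadUnusedAssetsImmediate();
        System.GC.Collect();
    }
}
```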
Has anyone else been crazy enough to do something similar, and perhaps found a workaround for this?