In our game we have a lot of big scenes with many objects, but when baking realtime GI, even at a resolution of 0.05 or so, Unity crashes after a while due to lack of memory, which makes GI unusable. My system should be fine since it has plenty of memory, but it seems Unity has not really tested big scenes yet (we have scenes 10x larger than the Viking Village). IMO it should never crash due to lack of memory; it should divide the scene into manageable sizes instead. Is there perhaps something I am doing wrong, or is it indeed a Unity issue?
Same problem here… waiting for an answer.
Hi! We have tried baking REALLY large scenes and have seen jobs allocate up to 88 GB of memory, so this should not crash on a 64-bit system with enough free memory and disk space for swapping. What kind of numbers are you seeing?
@KEngelstoft @Joachim_Ante_1
Hey, is there any way I could get one of those REALLY big scenes, or at least the mesh? It sounds like you're making decisions about how realtime GI will work with LoadLevelAdditive. I'm concerned because my scenes have been too big in the past, and the only way I could bake them without crashing, or get good results, was to bake selected objects or to bake in separate scenes. I don't have any reference at all as to what Unity would consider a really big scene or how the meshes are set up.
I have also had large scenes crash while baking in Unity 5. Simple scenes, just base geometry, no prefabs or props. And I have a pretty high-end PC. Thanks.
@Zylex and @RogerRen did you file bug reports already? We would love to take a look at your scenes and fix the crash.
Hi! I am afraid I can't share the scenes with you; they are built by our customers and therefore are not ours to share. A big scene is something the size of a large city (several kilometers) with lots of meshes.
The size of scenes that can be baked isn't related to how LoadLevelAdditive works.
If you still have scenes that crash Unity, please file a bug report so we can try your scene.
I understand about the scenes; I didn't know if you had some you used as benchmarks. Doesn't hurt to ask.
LoadLevelAdditive was my solution to scenes that were too big to bake in Unity4.
I tried with 20 GB of GI cache and ran into a similar issue. Our scenes have 200k-300k polys, so they are not that big. It only happens when Final Gather is active, though. Going to try giving it the max of 100 GB now and see if it'll work.
@KEngelstoft I have 16 GB of memory and 100 GB+ of disk space on a 64-bit system, so that should be plenty, right? Anyway, I submitted a scene that fails to bake, as requested: Case 682451. Note that baking this scene also gives me a lot of errors, which we discussed in this thread: Console Errors while baking - Unity Engine - Unity Discussions
Did the same as @Zylex now and went with the max of 100 GB of GI cache. I "only" have 8 GB of RAM though, and I'm getting similar errors:
"Failed executing external process for 'Bake Indirect' job. Exit code: '-1073741819'."
"'Bake Indirect' job failed with error code: 4294967295 ('Unknown error.')."
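For what it's worth, those two numbers decode cleanly: the exit code is a 32-bit Windows NTSTATUS value printed as a signed integer, and the job error code is just -1 printed as unsigned. A quick check (this only identifies the status code; it doesn't prove what caused it):

```python
# Decode the codes from the errors above. -1073741819 is the signed 32-bit
# view of NTSTATUS 0xC0000005 (STATUS_ACCESS_VIOLATION), i.e. the external
# bake process crashed rather than exiting cleanly. The job error code
# 4294967295 is simply -1 viewed as an unsigned 32-bit integer.

def to_unsigned32(signed):
    """Reinterpret a signed 32-bit integer as unsigned."""
    return signed & 0xFFFFFFFF

print(hex(to_unsigned32(-1073741819)))  # 0xc0000005 (access violation)
print(to_unsigned32(-1))                # 4294967295
```

An access violation is consistent with the process falling over under memory pressure rather than failing with a clean out-of-memory error.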
In my case it really seems to be a problem with insufficient RAM, task manager keeps telling me that 7.8 GB are permanently in use.
That's where I'm currently at. Unity seems to still be baking "12/15 Bake Indirect", but I have no idea if it'll ever finish.
Is there anything I can do to maintain quality while transferring the workload from RAM to the GI cache, to prevent the errors?
Or is there any chance that the whole baking process will be improved in terms of RAM usage? I don't think 8 GB is too little for lightmapping. We're doing 4K renderings, RealFlow fluid simulations and (of course) lightmapping in Unity 4, and none of that consumes nearly that much RAM.
Any help or official statement would be greatly appreciated.
Exact same issue as @Zylex; I submitted a bug report (http://fogbugz.unity3d.com/default.asp?680825_9uqtg4mhvd9bld2m), to which I got the "it works on our machine" response.
@Zylex are you on Windows 7 by chance?
I also have that issue. I'm trying to bake with no terrain; if I run out of memory, I'll try to send the scene.
Having the same issue with a relatively small scene. GI cache was set to 10 GB; I will try 100 GB. I tried to pack the scene into a bug report, but the issue usually slows my entire system to a crawl, and sometimes gives a black screen before Unity crashes. I have had to force-shut-down my system about twice now.
Maybe related: the newest Unity patch seems to fix some things (now I don't get any "failed to read … file" errors), but I'm getting these other errors, even with AO set to 0: Imgur: The magic of the Internet
I noticed a similar issue and filed a bug report a while ago about the AO errors despite AO being set to zero. Support answered that it was reproducible and will probably be fixed.
It would be good to have an answer from UT about this. I can't bake a large scene (the terrain is 1000x1000) with the default Baked GI resolution of 40… I'm forced to do it at 20, but then I get artifacts. I've tested on a small test scene at a resolution of 60 and all artifacts are gone.
This is quite limiting, forcing me to use realtime lights/shadows and killing performance…
I have 8 GB of RAM, on Windows 7 64-bit. I'm planning to get 16 GB of RAM, but buying it without being sure the bake will work is crazy…
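To get a feel for why resolution 40 on a 1000x1000 terrain is heavy, here is a rough texel-count sketch. It assumes the Baked GI resolution is in texels per world unit and that lightmaps pack into square 4096 atlases; actual UV packing and padding will differ, so treat these as order-of-magnitude numbers only:

```python
import math

# Rough lightmap texel count for a square terrain, assuming the Baked GI
# resolution is texels per world unit and a 4096x4096 max atlas size.
# Real packing/padding differs; this is only a back-of-envelope sketch.

def lightmap_estimate(size_units, texels_per_unit, atlas_size=4096):
    side = size_units * texels_per_unit          # texels along one edge
    texels = side ** 2                           # total lightmap texels
    atlases = math.ceil(side / atlas_size) ** 2  # 4096^2 atlases needed
    return texels, atlases

for res in (20, 40, 60):
    texels, atlases = lightmap_estimate(1000, res)
    print(f"resolution {res}: {texels:,} texels, ~{atlases} atlases of 4096^2")
```

At resolution 40 that is on the order of 1.6 billion texels spread over ~100 full-size atlases, which helps explain why the same settings are fine on a small test scene but not on a large terrain.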
Filed a bug report (Case 804129), but I can't upload my entire project (or spend days isolating a test scene). It looks like this issue is still present? I thought that by waiting until 5.4 I might be able to use the lightmapper and get some sort of usable result; after 3 days of playing with it and getting either completely unusable rubbish or crashes, it looks like I was naive to make that assumption.
So, as far as I can tell, Unity crashing while baking large scenes with Enlighten has nothing to do with a fundamental bug in Unity or Enlighten. I believe the enormous memory usage is due to the way Enlighten generates global illumination.
What Enlighten does is, for every texel in the scene, calculate the visibility to every other texel. This means that for every new pixel of data you need to add n new floats to the lighting array (n being the number of pixels in the lightmap). Essentially, if you're trying to generate a 512x512 lightmap, you need (512x512)^2 * sizeof(float) bytes to actually calculate the map. And the worst part of all of this: Enlighten needs to do this calculation whether or not you're doing realtime global illumination; it's just how the algorithm works. Since the algorithm is O(n^2), it's not difficult to see how the numbers blow out of proportion really quickly.
So a 1k x 1k terrain baked at a resolution of .3 will probably generate close to 512 x 512 pixels of lightmap, which comes to (512 * 512)^2 * 4 bytes = 256 GB. Enlighten probably does some memory optimization to bring this number down a bit, but fundamentally, computing a large lightmap with Enlighten is an extremely expensive thing to do. I'm realizing now that Enlighten's strong suit is small, detailed scenes with lots of different objects. In large scenes, you simply can't bake global illumination at any decent speed and any decent resolution.
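The quadratic blow-up is easy to see with a quick sketch. Note this models my *assumed* all-pairs-visibility behavior, not necessarily what Enlighten actually stores:

```python
# Back-of-envelope for the all-pairs visibility model described above:
# one float per (texel, texel) pair. This is an assumed model of
# Enlighten, not its documented data layout.

def visibility_matrix_bytes(width, height, bytes_per_entry=4):
    """Memory for one float per pair of lightmap texels."""
    texels = width * height
    return texels ** 2 * bytes_per_entry

for side in (128, 256, 512):
    gib = visibility_matrix_bytes(side, side) / 2**30
    print(f"{side}x{side} lightmap -> {gib:,.0f} GiB")
```

Doubling the lightmap side length multiplies the pair count by 16, which is why a 512x512 map lands at 256 GiB under this model while 128x128 is only 1 GiB.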
Direct lighting, though, shouldn't in theory have as large a memory impact: results for each texel are independent of all the others. And it seems to me Enlighten calculates the direct lighting separately from the indirect. Do you have a ton of area lights, though? Enlighten samples every area light for every texel in the scene, so that could be taking up a fair bit of memory.
TL;DR: the memory issues in large scenes, as far as I can tell, are due to how Enlighten works, not a specific bug in either Unity or Enlighten. I'm crossing my fingers that SEGI will let us generate better lightmaps faster than what we currently have, because there isn't any robust solution for generating large lightmaps in Unity right now…
You are right that if memory usage at some point goes beyond what the operating system can provide to Unity or any other program, a crash is hard to avoid, because at some point the page file will not fit on the hard drive.
Your assumptions about how Enlighten works are not correct, though. Enlighten calculates lighting using the clusters generated in the Clustering step, and those should ideally be larger than the lightmap texels. The step where geometry is converted to clusters can be quite memory intensive if the scale isn't set correctly.
If you are seeing high memory usage or long baking times, it could be because the static geometry in your scene is getting cut up into many more clusters than are actually needed. The UV Charts and Clustering scene view modes can help you identify the geometry whose UVs or scale need tweaking.
There isn't any global dependency from one cluster to all other clusters; the dependency is only between clusters in the same system. Use the Systems scene view mode to get an overview. A large terrain plane with objects on top could force all objects into the same system, which would increase bake times.
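That per-system scoping matters a lot for memory. A toy model (my own sketch, not Enlighten's actual data structures): if cluster-to-cluster dependencies only exist within a system, the cost scales with the sum of squares of system sizes rather than the square of the scene total, so a terrain that merges everything into one system is the worst case.

```python
# Toy model of the per-system dependency described above: cluster
# relationships are only stored between clusters in the same system,
# so cost scales with the sum of squared system sizes.
# (My own sketch, not Enlighten's actual data layout.)

def pair_count(system_sizes):
    """Cluster pairs when dependencies stay inside each system."""
    return sum(n * n for n in system_sizes)

one_big_system = pair_count([10_000])     # e.g. a terrain forcing one system
four_systems   = pair_count([2_500] * 4)  # same clusters, split into 4 systems
print(one_big_system, four_systems)       # 100,000,000 vs 25,000,000
```

Under this model, splitting the same clusters into four systems cuts the pair count by 4x, which is consistent with the advice to avoid letting one large plane glue every object into a single system.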
I get the same problems with assets downloaded from the Asset Store. To get around this, I had to lower the following settings.