When I attempt to run the OC bake, it runs for a few minutes, then Unity crashes to the desktop.
No error report, nothing.
There is no progress output and nothing mentioned in the console.
I am running Win7 64-bit with 4GB of RAM; how memory hungry is Umbra during the bake?
Does the Bake require a camera called ‘MainCamera’?
Umbra isn’t particularly memory hungry. I’m working on a 4000x4000x750 terrain, and have been experimenting with the settings to reach something which works both with open terrain and with city-like clusters of buildings.
The 250 cell size seems far too big. You won’t cull much that way, and you’ll experience problems around and especially inside buildings.
Standard practice seems to be to work with up to three nested densities. One general, encompassing the whole terrain, one for clusters of buildings or meshes, and one for the inside of buildings. I’ve found that with a cell size of 1, the “Low” resolution setting results in chunks of 125 units to each side. The terrain uses its own occlusion mechanism, which isn’t under your control, but to get occlusion chunks of the same size, you’ll have to set the resolution to “Medium”. I use “High” for city areas, “VeryHigh” for some exteriors, and “ExtremelyHigh” for some tricky, complex interiors. This seems to work fine. Nested occlusion areas are handled just fine by Umbra.
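For what it's worth, here's a rough editor-script sketch of that nesting (the menu name and the bounds are made up, and you can of course just place the areas by hand instead). As far as I can tell the per-area resolution isn't exposed to scripting, so that still has to be picked in the Occlusion window:

```csharp
// Editor-only sketch (hypothetical names/bounds): drops three nested OcclusionAreas
// into the open scene so the bake works with explicit volumes.
using UnityEngine;
using UnityEditor;

public static class OcclusionAreaSetup
{
    static void CreateArea(string name, Vector3 position, Vector3 size)
    {
        var go = new GameObject(name);
        go.transform.position = position;
        var area = go.AddComponent<OcclusionArea>();
        area.center = Vector3.zero;   // bounds are local to the new GameObject
        area.size = size;
    }

    [MenuItem("Tools/Create Nested Occlusion Areas")]
    static void Create()
    {
        // Whole terrain, coarse cells (the 4000x4000x750 world mentioned above)
        CreateArea("OcclusionArea_Terrain", new Vector3(2000f, 375f, 2000f), new Vector3(4000f, 750f, 4000f));
        // City cluster, finer cells (hypothetical bounds)
        CreateArea("OcclusionArea_City", new Vector3(1200f, 50f, 900f), new Vector3(400f, 100f, 400f));
        // Single interior, finest cells (hypothetical bounds)
        CreateArea("OcclusionArea_Interior", new Vector3(1210f, 10f, 905f), new Vector3(30f, 12f, 30f));
    }
}
```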
Baking the occlusion takes about three hours for preview quality and just over 24 hours for production quality on my 4GB MacBook Pro. I've never seen an occlusion-related crash.
Exactly what the Quality setting (Preview/Production) does isn’t stated clearly anywhere, unfortunately. It doesn’t seem to affect the resolution of the culling in any way I can see, which is a Good Thing: just use Preview whilst working on the scene.
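If you want to keep an eye on how heavy a given setup turns out, a tiny editor helper like this (menu names invented, assuming the UnityEditor.StaticOcclusionCulling API in your version) will run the bake and log the resulting data size:

```csharp
// Quick editor helper (a sketch, not part of Umbra itself): runs the bake from a
// menu item and logs how much occlusion data came out, which gives a rough hint
// of how heavy the chosen cell size / resolution is.
using UnityEngine;
using UnityEditor;

public static class OcclusionBakeHelper
{
    [MenuItem("Tools/Bake Occlusion And Report Size")]
    static void Bake()
    {
        StaticOcclusionCulling.Compute();   // synchronous bake with the current settings
        Debug.Log("Umbra data size: " + StaticOcclusionCulling.umbraDataSize + " bytes");
    }

    [MenuItem("Tools/Clear Occlusion Data")]
    static void ClearBake()
    {
        StaticOcclusionCulling.Clear();     // throw away the baked data
    }
}
```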
Thanks Peter. It must be something else in my scene that is causing it to crash, then?
As I mentioned before, I have a lot of cameras; if you watch my video you'll see what my game looks like.
But it may be something unrelated to that; unfortunately I get no error dump or anything.
I have a similar problem (one camera); this is probably not something in the scene, but a buggy implementation of Umbra. When I bake a simple scene, I have no problem; when I add more objects, Umbra crashes. Even if there is something that prevents correct baking, crashing is proof of a bug anyway.
Well I’m a firm believer that setting up a scene correctly goes a long way to not having it crash. One guy in another thread complaining about it crashing was using 1.5GB models or something (why, or how for that matter, is beyond me) so yeah, kinda deserved a crash.
I didn’t say you were stupid. I said the guy using 1.5GB models in a realtime game engine then wondering why he was having issues was, lol. Just kidding
But seriously though, a correctly set-up scene shouldn't crash. Umbra is pretty hefty technology, as is Beast, and I have a hard time believing they're always to blame. Nine times out of ten it's an error on the user's part. Now I'm not saying the software is perfect, it's not. But I don't think it's to blame as often as people blame it.
The problem is that even with a correct setup, if you need detailed cell sizes and the world scale is large enough (i.e. at the point where you would really start to need PVS, because the poly/vertex budget otherwise goes beyond the acceptable threshold for the target minimum requirements), it can nonetheless crash.
The only solution then is to raise and experiment with the cell size so you basically have fewer cells.
Unhappily, you can't influence the cell sizes of the view cells per zone; you can only set the target cell size, which affects only the occlusion of dynamic objects. Also make sure the areas you place don't waste space just because you were too lazy, in case you have memory-related problems, and above all use explicit areas instead of the automatic setup, which just treats the whole scene (maximum bounding volume) as one massive area. That is likely the most common reason for memory limitations.
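Something like this hypothetical editor helper (not part of Umbra, just a sketch) can fit an area tightly around whatever you have selected, so the baked volume isn't mostly empty space:

```csharp
// Sketch (hypothetical helper): fits an OcclusionArea tightly around all the
// renderers under the selected root, so the bake doesn't cover empty space.
using UnityEngine;
using UnityEditor;

public static class TightOcclusionArea
{
    [MenuItem("Tools/Fit Occlusion Area To Selection")]
    static void Fit()
    {
        var root = Selection.activeGameObject;
        if (root == null) return;

        var renderers = root.GetComponentsInChildren<Renderer>();
        if (renderers.Length == 0) return;

        // Grow a bounding box around every renderer under the selected root.
        Bounds bounds = renderers[0].bounds;
        foreach (var r in renderers)
            bounds.Encapsulate(r.bounds);

        var go = new GameObject("OcclusionArea_" + root.name);
        go.transform.position = bounds.center;
        var area = go.AddComponent<OcclusionArea>();
        area.center = Vector3.zero;     // local to the new GameObject
        area.size = bounds.size;
    }
}
```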
Even with Umbra, the above can still be useful and still IMO has a place in many scenes. For those really, really struggling to get around crashes, it’ll work a treat I would think.
Very simple to set up, no pre-calculation (so no risk of crashing). You do have to place the areas correctly yourself, and that's pretty much all that's required (the automatic setup version works just fine from what I could tell).
I wonder if it would actually work WITH Umbra, as in optimise the calculations Umbra does if you were to set up his system first, hmmm. Might be worth checking that out actually; depends how Umbra deals with inactive mesh renderers, I suppose.
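For anyone curious, the core of that kind of manual system is really just a volume that flips renderers on and off; here's a bare-bones sketch of the general idea (not the actual package, and it assumes your player object is tagged "Player" and has a Rigidbody or CharacterController so trigger events fire):

```csharp
// Bare-bones sketch of the manual-area idea discussed above (not the actual
// package): each area is a trigger volume that enables its renderers while the
// player is inside it and disables them again on exit.
using UnityEngine;
using System.Collections.Generic;

[RequireComponent(typeof(BoxCollider))]
public class ManualCullingArea : MonoBehaviour
{
    // Renderers belonging to this area; assigned in the Inspector.
    public List<Renderer> areaRenderers = new List<Renderer>();

    void Start()
    {
        GetComponent<BoxCollider>().isTrigger = true;
        SetVisible(false);                      // start hidden until the player arrives
    }

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player")) SetVisible(true);
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Player")) SetVisible(false);
    }

    void SetVisible(bool visible)
    {
        foreach (var r in areaRenderers)
            if (r != null) r.enabled = visible;
    }
}
```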
Having both side by side would cause some impact that, on anything half as capable as current systems, would often negate the PVS gain, as neither Umbra nor anything else comes for free CPU-wise. Going through all objects and enabling/disabling stuff costs time too.
Also, I don't see much of a gain if you ask me; I would go with one or the other.
Well, what I was thinking is: would Umbra, as it moves its virtual camera through the scene to decide what it can and can't see per cell, take into account objects whose mesh renderers are switched off? That would effectively let you set up the above areas first, and Umbra would then only be working on what each area decided was visible at any time, fine-tuning the occlusion further while taking into account a far smaller number of objects, basically assuming nothing else exists and not concerning itself with anything beyond, cutting out the extra calculations that wouldn't produce a visible result either way, which I'm sure is what really causes the slow speed.
Though I suppose setting up proper occlusion areas in Umbra does the exact same thing, doesn't it… nvm lol.
The problem is that M2H and all the others don't work with the remainder set; they process their whole "dataset of interest" from the ground up again too. So you effectively do the same work twice, checking what needs to be active and what not, and at worst your own system might fight with Umbra over whether something is now meant to be active or not, and so on.
Hehe… if you do it smart enough, it can benefit a bit from Umbra.
But the problem is that your solution is then no longer stable long-term, because if Umbra's behaviour changes, you are messed up at best or fucked at worst.
I really don't see how I can set up a scene wrong for Umbra. A terrain with a few buildings works OK; I add more structures and it crashes. I even tried very rough settings with a huge cell size of over 10 for a 200x100x50 area. It works OK until I add more buildings.
I don't need Umbra for 5 buildings, I need Umbra for my 120-building city. This is the main use of this technology. So something like "your buildings have 300MB of textures" or "your setup is wrong" is a lame excuse. Even if I had the worst scene setup under the sun, that NEVER excuses crashing. Crashing is always proof of a BUG. The program should give you a warning (with a reason, something like "too many textures, too many polygons, etc…"), and never, ever crash.
The problem is that with static geometry in, it will generate significantly more cells. You can see that when you move the camera through the scene: the occlusion tree rendering actually breaks down on static objects, and instead of just the object you get a white "box cloud", depending on your settings.
Whether you accept that it is or isn't a crash due to running out of memory is up to you, but it won't change the fact that there is no single "too many" in the situation, just too little RAM available or too little address space to handle your data. It's not one explicit thing but the combination of them; that's why it dies when you add stuff, not right from the start. An error handed back would therefore be totally useless, as all it could say is "your scene cell settings are too complex for the available memory, please change the settings, have a smaller scene, less space where the occlusion data is calculated, or remove objects".
Would it help ya? No.
It would only make you angry, it's that simple, yet it's exactly the only thing it could tell ya.
Why do you think there are so many supporters for a 64-bit editor / baking pipeline, at the very least?
With well-placed occlusion areas the likelihood of running out of memory can be reduced drastically, that's a fact; if you don't place any, it's almost guaranteed to crash even on rather simple scenes.
Even with low memory it is possible to do large calculations. Recently I was working on a very resource-demanding piece of software. With 4GB of memory there were no crashes; under 1GB it crashed quite a lot. Do you think we told the customer to go and buy 4GB of memory? No, because that is unprofessional and the customer would probably have thrown us out of the window.
We just corrected the bugs and it ran even on 500MB without any problem. Yes, it was damn slow, but it worked. A crash always means a BUG. Running out of memory must never crash an application; the application must be able to handle such a situation. Buying more memory is not a solution, it just masks the problem. In the future it can easily happen that it crashes on 10GB as well, as long as these bugs are there.
“your scene cell settings are too complex for the available memory, please change the settings, have a smaller scene, less space where the occlusion data is calculated or remove objects”
or
“You have low memory, approximate time of baking is 1 year, do you still want to continue?”
Yes, such a message would be welcome; that is the professional behaviour of an application. Then I wouldn't have to wait several minutes for a crash. Crashing is the worst kind of amateurism. No matter how stupid the settings I make, the only result should be slow baking, never a crash.
I of course support 64-bit, because it's 2011. The other thing is that Umbra itself works well; AFAIK it even supports multicore/multi-computer baking, so these crashes come down to a messed-up integration.