Which one is better, Show/Hide or Instantiate/Destroy?

I have a single type of Enemy GameObject that will show up on screen, but no more than 10 at a time.
At first I thought I could just Instantiate and Destroy, which is actually easier to implement too.

However, after reading the forum, it looks like Instantiate/Destroy can sometimes cause memory problems, because we don't know when the Garbage Collector will run, or how much garbage is left hanging around in memory before it causes an out-of-memory crash.

In theory I thought Instantiate/Destroy would be better, because I only need to make one prefab and instantiate it when I need it (so the memory can be used by other stuff), while with the Show/Hide method I have to prepare a total of 10 objects (in an array) that occupy memory the whole time. (It's also slightly harder to code :P)

So my question is: for this iPhone case, do you think it's better to use the Show/Hide method rather than Instantiate/Destroy?

Thanks a lot in advance.

For small numbers, Show/Hide is likely the better solution, as Destroy frees things and forces the garbage collector to react.

If you have many dynamic objects, RAM restrictions will likely force you to use Instantiate/Destroy.

Thanks Dreamora,

I guess for my case I'll just make an array of 10 GameObjects and show/hide them from there rather than Destroy/Instantiate.

It looks like Destroy/Instantiate increases the chance that I mess something up in the future if I don't know what I'm doing :?
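For anyone else reading along, here's a minimal sketch of what that array-of-10 pool might look like. The names (`EnemyPool`, `enemyPrefab`, `Spawn`, `Despawn`) are my own assumptions, not anything from the thread, and it uses `SetActive` / `activeSelf`; on older Unity versions you'd toggle `gameObject.active` instead.

```csharp
using UnityEngine;

// Hypothetical sketch: a fixed pool of 10 enemies toggled on/off
// instead of being created and destroyed.
public class EnemyPool : MonoBehaviour
{
    public GameObject enemyPrefab; // assumed prefab reference, assigned in the Inspector
    private GameObject[] pool = new GameObject[10];

    void Start()
    {
        // Instantiate everything once up front, then hide it all.
        for (int i = 0; i < pool.Length; i++)
        {
            pool[i] = (GameObject)Instantiate(enemyPrefab);
            pool[i].SetActive(false);
        }
    }

    // "Spawn" by showing the first hidden enemy, if any.
    public GameObject Spawn(Vector3 position)
    {
        for (int i = 0; i < pool.Length; i++)
        {
            if (!pool[i].activeSelf)
            {
                pool[i].transform.position = position;
                pool[i].SetActive(true);
                return pool[i];
            }
        }
        return null; // all 10 are already on screen
    }

    // "Destroy" by hiding instead, so no garbage is created.
    public void Despawn(GameObject enemy)
    {
        enemy.SetActive(false);
    }
}
```

The upside over Instantiate/Destroy is that after `Start()` runs, the memory footprint never changes and the GC has nothing to collect.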

It's pretty hard to mess something up with either solution, actually. But the Destroy approach puts your fate into the hands of the GC, which at worst can lead to stutter. There are differing reports on that behavior with the current version.

My experience has been that once you instantiate a prefab, you don’t get the memory back after you destroy it, so you might as well just deactivate.

Maybe not right away, but you do when the garbage collector runs.

–Eric

Unfortunately, you don’t. I’ve tried waiting and running the garbage collector manually, to no effect.

You do.
But the garbage collector will not free the system memory, which is likely what you were watching instead of the free managed memory.
That's normal behavior for GCs: they won't release system memory unless it's very cluttered, because handing system memory back just to reallocate it again would be a total performance killer.

I’m not sure how I can look at system memory…I’ve been watching the Real Memory in the Activity Monitor instrument.

My test app instantiates and destroys a series of prefabs, eating a chunk of memory for each. I was starting the garbage collector after each destruction.

The prefabs are pretty big, and after about six of them the app runs out of memory and crashes. I would think the garbage collector would kick in before that happens, but it doesn't.

Well, the garbage collector can only free what's no longer used.

If you have large WAVs or large textures, they will simply use the amount of RAM they require. A single 1024x1024 texture, for example, is 6MB of RAM.

The Activity Monitor instrument is worthless when it comes to free memory within the managed heap.
It only sees how much RAM the whole app uses, not how much is assigned by the GC to existing data.

I had been under the impression that deleting the last instance of (and all references to) a prefab would free the memory associated with its assets, but it doesn't seem to work that way.

Do you know of a better way to look at memory usage than the Activity Monitor?

Different solutions to this memory problem (I have the same issue you have with instantiated objects).

My solution was amusingly simple. All of my levels use the same objects, so ultimately I will eat up the same max memory at some point, and since I don't trust the GC at all, I do a little dance with the objects.

First, all the camera cares about is what is in its view range; what it can see is what it will render. I have an outer check on every object's update that looks to see whether the object is "active": if it is, it runs the code block; otherwise it moves on. The difference between "active" and "inactive" is position.

See, I move my objects to a template playing field where they sit idly by, waiting to be moved into play. Every object has a "home" position in a given level. When that home position is being approached, I move the object from the idle position to the home position, so I don't recreate the object, I just move it. This goes for explosions and everything else: I move an explosion to the collision point, play the sequence once through, and move the explosion back out to the idle position. Since it is impossible for me to have more than 12 explosions at any given time, I don't worry about instantiating and destroying the explosion object, I just dance it in and out.

I find that if you get 100% of your objects instantiated at the start of the scene, you are at max memory use for the scene no matter what. If you move an object out of view, it shouldn't be part of the render phase, but its code will still run, so you have to deal with that yourself: while the object is active, allow the code; when the object is idle, skip it.

Also, cut the camera's draw distance to the minimum you can get away with; less rendering area means less work. That's the only experience I have to bring to the table.
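A rough sketch of that "dance", in case it helps; the names (`idlePosition`, `PlayAt`, `Park`) and the pool size of 12 are just my reading of the description above, not actual code from the poster:

```csharp
using UnityEngine;

// Sketch of the "dance" approach: explosions are never created or
// destroyed, just moved between an off-screen idle spot and the
// point where they're needed.
public class ExplosionDance : MonoBehaviour
{
    public GameObject[] explosions = new GameObject[12];        // max 12 at once
    public Vector3 idlePosition = new Vector3(0f, -1000f, 0f);  // off-screen "template field"

    // Move the first idle explosion to the collision point.
    public GameObject PlayAt(Vector3 collisionPoint)
    {
        for (int i = 0; i < explosions.Length; i++)
        {
            if (explosions[i].transform.position == idlePosition)
            {
                explosions[i].transform.position = collisionPoint;
                return explosions[i];
            }
        }
        return null; // shouldn't happen if 12 really is the max
    }

    // After the sequence has played once through, park it again.
    public void Park(GameObject explosion)
    {
        explosion.transform.position = idlePosition;
    }
}
```

Each object's own update script would then check whether it is parked at `idlePosition` and skip its logic if so, matching the active/inactive-by-position idea above.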


Thanks a lot guys,
lots of useful info here.

I think I'm leveling up today :stuck_out_tongue:

The System.GC class might offer a bit of data related to memory usage.

As for the references: deleting = Object.Destroy / Object.DestroyImmediate, right?
If you just let them become nullified, you potentially create memory leaks depending on how you loaded them.
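For example, something like this (my own sketch) would log what the managed heap thinks it's using; keep in mind it only covers managed allocations, so textures and other native assets won't show up here:

```csharp
using UnityEngine;

// Sketch: periodically log how much managed memory the GC reports.
// This only covers the managed heap; native allocations such as
// textures and meshes are invisible to it.
public class MemoryLogger : MonoBehaviour
{
    void Update()
    {
        if (Time.frameCount % 60 == 0) // roughly once a second at 60 fps
        {
            // false = report current usage without forcing a collection
            long bytes = System.GC.GetTotalMemory(false);
            Debug.Log("Managed heap in use: " + (bytes / 1024) + " KB");
        }
    }
}
```

Comparing this number against what Activity Monitor shows should make it clearer how much of the growth is managed garbage versus native asset memory.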

Yes, this is what I am doing with another game… unfortunately the one I have the problem with uses each prefab only once and moves on to the next one, and only about 1/3 of them will fit in memory.

Correct. I also tried loading another level, but the prefabs in the previous level still stay in memory.

This might call for a reduction of detail in either your objects or your graphics. I am a horrid graphic designer myself and my models are extremely substandard, but I look at it this way: it's a very tiny screen.

Were it up to me, that’s what I’d do. Unfortunately they are complicated animations that I don’t have the skill to modify, so I’m kind of stuck.

I just want that memory back :cry:

How do you know that the prefabs remain in memory if you don't yet know how to find out how much RAM the GC is actually using and how much of it is cached/pooled?

The iPhone will never give you the correct data; the memory stats from that end are only useful for post-checking leaks and crashes.

All I see is the memory in the Activity Monitor climbing with each instantiation/destruction of the prefab, until the app crashes.

I infer from that that something is devouring the memory. Since the only thing in my test app is a series of scenes, each with one instance of a prefab in it, it must be the prefabs eating the memory.

I’ll play with System.GC later on and see what I can figure out from that.

I would report these findings as a bug. I personally have been pooling everything except for certain projectile-type objects; I find that Instantiate and Destroy are just too slow. But I have experienced similar issues with memory not being freed.

If you are using Instruments for testing, I don't think it's that accurate, as the probe eats memory and causes stability issues. I also wonder if there is something about the way Instruments runs that doesn't allow the garbage collector to run. That said, I'm not quite sure the garbage collector even works, as it doesn't "seem" like resources are being dropped between Application.LoadLevel calls. However, I can't be certain. It could be that garbage collection takes a long time and memory runs out before it can finish.

The current app I'm working on rides the memory limit. When I build the application to the iPhone with just the game scene, no issues are encountered. If I start from the menu and then load the main scene, the app crashes a few seconds into gameplay.

So that tells me something is not right. In theory, the memory should be freed between scenes. I had some non-power-of-2 textures in the menu that were gobbling up extra memory; scaling and compressing them alleviated the immediate issue, but it still leaves the question of whether those resources are being released at all.