How much does particle spawn rate influence performance in a system with a set particle count?

I’ve seen some conflicting information on this, so I wanted to ask to make sure:

If a particle system is capped at a set number of particles (say 10), how much does it impact performance if you spawn 100 particles per second with a lifetime of 0.1 s versus 10 particles per second with a lifetime of 1 s?
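(Both of those settings keep the same number of particles alive on average, since steady-state count ≈ spawn rate × lifetime: 100/s × 0.1 s = 10 and 10/s × 1 s = 10. So the only thing that changes between them is how often particles get created and destroyed.)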

For context, I’m currently creating thruster effects. The higher I set the exhaust velocity, the faster I need to respawn particles to keep the exhaust flame the same size. Basically, I want to know whether a high spawn rate is something I need to worry about, or whether I can just go nuts with it as long as I keep the overall particle count low.

There are various bits of code that run per particle:

  • birth
  • update
  • kill

Updating is almost always the most expensive part, because it happens every frame. Birth and kill are one-off events: each happens only once in a particle’s lifetime.
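To make that concrete, here’s a minimal sketch of a fixed-capacity particle pool in C++ (the structure and names are my own, not any particular engine’s). Update touches every live particle every frame, while birth and kill each run once per particle:

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

struct Particle {
    float pos[3];
    float vel[3];
    float age;      // seconds since birth
    float lifetime; // seconds until death
};

class ParticlePool {
public:
    explicit ParticlePool(std::size_t capacity) { particles_.reserve(capacity); }

    // Birth: one-off work, paid once when a particle is emitted.
    void spawn(const Particle& p) {
        if (particles_.size() < particles_.capacity())
            particles_.push_back(p);
    }

    // Update: the per-frame work -- every live particle, every frame.
    void update(float dt) {
        for (std::size_t i = 0; i < particles_.size(); ) {
            Particle& p = particles_[i];
            p.age += dt;
            if (p.age >= p.lifetime) {
                // Kill: one-off work, swap-and-pop removal.
                particles_[i] = particles_.back();
                particles_.pop_back();
                continue; // re-check the particle swapped into slot i
            }
            for (int k = 0; k < 3; ++k)
                p.pos[k] += p.vel[k] * dt;
            ++i;
        }
    }

    std::size_t alive() const { return particles_.size(); }

private:
    std::vector<Particle> particles_;
};

int main() {
    ParticlePool pool(10);
    pool.spawn(Particle{{0, 0, 0}, {0, 10, 0}, 0.0f, 0.1f});
    pool.update(1.0f / 60.0f);
    std::printf("alive: %zu\n", pool.alive());
}
```

The swap-and-pop keeps the kill cheap: removing a dead particle is one copy and a pop rather than shifting the rest of the array.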

So, while there will be some additional cost to spawning more particles with shorter lifetimes (more birth and kill events), I would expect the difference to be small. For the particle counts and lifetimes you quote, there should be no measurable difference. Maybe with 1,000 particles you’d start to see some difference, depending on your platform(s).
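Back-of-the-envelope, the per-second cost is roughly alive count × frame rate × update cost + spawn rate × (birth cost + kill cost). Your alive count is fixed, so only the second term grows with spawn rate, and a birth-plus-kill pair is typically much cheaper than a particle’s lifetime’s worth of updates.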

You can measure this quite easily too, and then you’ll know for sure! In an empty scene, create a particle system and emit a lot of particles, say 10,000 per second with a 1-second lifetime. Use the profiler to measure the time it takes (in the timeline view, look at the particle update jobs that run off the main thread).

Now change it to emit 1,000 per second with a 10-second lifetime, and compare. I’d be interested to hear your results.
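If you’d rather sanity-check the reasoning outside the engine first, here’s a rough standalone toy in C++ (all the names are mine, and it ignores rendering entirely) that runs both of those configurations and just counts the per-particle work. The update counts come out close (they differ only during the ramp-up to the steady-state count), while the births and kills differ by 10×:

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

// Toy model: counts per-particle work only; real costs vary by engine.
static void run(double spawnPerSec, double lifetime, double seconds) {
    const double dt = 1.0 / 60.0;        // simulate at 60 fps
    std::vector<double> ages;            // one entry per live particle
    double spawnDebt = 0.0;              // carries fractional spawns across frames
    long births = 0, updates = 0, kills = 0;

    for (double t = 0.0; t < seconds; t += dt) {
        // Birth: one-off work per particle.
        spawnDebt += spawnPerSec * dt;
        while (spawnDebt >= 1.0) {
            ages.push_back(0.0);
            ++births;
            spawnDebt -= 1.0;
        }

        // Update + kill: per-frame work over every live particle.
        for (std::size_t i = 0; i < ages.size(); ) {
            ages[i] += dt;
            ++updates;
            if (ages[i] >= lifetime) {
                ages[i] = ages.back();   // Kill: swap-and-pop, one-off work.
                ages.pop_back();
                ++kills;
            } else {
                ++i;
            }
        }
    }

    std::printf("%7.0f/s x %4.1fs: births=%ld updates=%ld kills=%ld alive=%zu\n",
                spawnPerSec, lifetime, births, updates, kills, ages.size());
}

int main() {
    const double seconds = 60.0;  // one simulated minute
    run(10000.0, 1.0, seconds);   // many short-lived particles
    run(1000.0, 10.0, seconds);   // few long-lived particles
    return 0;
}
```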

Good luck!