I can’t find the correct sub-forum for this, so I’ll just post it here.
I recently played an Android game called BombSquad, and I was amazed by the explosion effect:
I just can’t see how he could achieve that smoke. The shapes are so realistic, like something you’d see in FumeFX, not in a mobile game. I don’t think it’s simply a particle system, because even if you figured out a way to create shapes like that, it would still require a huge number of particles to get that smooth look.
Of course it could be pre-rendered, since the camera rotates very little, but I see no clipping when a character touches the smoke or where it meets the ground. I thought that was impossible to do on mobile with Unity:
That smoke, the distortion effect, and what appears to be realtime lighting, in an (up to) 8-player game, all run smoothly on my Samsung Note 2 (released about 4 years ago). So can I do this in Unity?
It’s a particle system: use realistic images that you fade out slowly, mix sharp and blurry streaks, and let the images just drift instead of moving fast like a typical particle system. The scale changes slowly and non-uniformly, and the size depends on the angle: big images shoot up, short ones go sideways. A regular particle system handles the sparks.
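Not the game’s actual setup, just a minimal sketch of those settings driven from a script using Unity’s built-in ParticleSystem (long lifetime, near-zero speed, slow fade, gentle non-uniform size growth). The modules and properties are standard Unity API, but every value here is a guess you’d tune by eye:

```csharp
using UnityEngine;

// Rough sketch: slow, drifting smoke puffs with Unity's built-in particle system.
// Attach to a GameObject that already has a ParticleSystem component.
public class SlowSmokeSettings : MonoBehaviour
{
    void Start()
    {
        var ps = GetComponent<ParticleSystem>();

        var main = ps.main;
        main.startLifetime = 4f;                                     // long-lived puffs
        main.startSpeed = 0.3f;                                      // barely moving, they mostly drift
        main.startSize = new ParticleSystem.MinMaxCurve(0.5f, 2f);   // mix of puff sizes

        // Fade the sprites out slowly over each particle's lifetime.
        var col = ps.colorOverLifetime;
        col.enabled = true;
        var grad = new Gradient();
        grad.SetKeys(
            new[] { new GradientColorKey(Color.white, 0f), new GradientColorKey(Color.gray, 1f) },
            new[] { new GradientAlphaKey(0.8f, 0f), new GradientAlphaKey(0f, 1f) });
        col.color = grad;

        // Grow slowly instead of popping to full size.
        var size = ps.sizeOverLifetime;
        size.enabled = true;
        size.size = new ParticleSystem.MinMaxCurve(1f,
            AnimationCurve.EaseInOut(0f, 0.6f, 1f, 1.4f));
    }
}
```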
If you look closely at 0:22 (turn the speed down), you can see that the sparks actually lead the long smoke streaks, but they also interact with the floor. Of course, in this case the floor is flat, so you could bake that into the clip, but the bomb can explode anywhere. And given the way the trail of smoke is shaped, I don’t think it can be a particle system.
You can do the same thing in Unity with experience and skill. You can combine custom shaders and scripting with the built-in particle system, or roll your own with quads, shaders, and scripting.
Directly from the developers of the game: “The trick with the smoke was to render it as little strips of polygons with a fancy smoke shader attached, and to let the vertices drift apart over time to get the expanding/distorting look. Hit F11 in the game to go into debug-drawing mode and see how it works.”
Thank you all for your help!
To sum this up, I would say the developer used a rendered image sequence from a fluid dynamics simulation, put the frames on meshes that fit their shape, then animated each vertex of those meshes away from the others randomly so they look slightly distorted. Those meshes also need a special alpha-blended shader so they don’t clip. Other than that, he just used a normal particle system.
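If anyone wants to experiment with the vertex-drift part, here is a minimal sketch of the idea as a Unity C# script. The class name, drift speed, and per-vertex random directions are my own assumptions for illustration, not the developer’s actual code:

```csharp
using UnityEngine;

// Sketch: push each vertex of a smoke-strip mesh outward over time,
// so the strip slowly expands and distorts. Attach to a MeshFilter object.
public class SmokeStripDrift : MonoBehaviour
{
    public float driftSpeed = 0.2f;   // how fast vertices move apart

    Mesh mesh;
    Vector3[] baseVerts;
    Vector3[] directions;
    float age;

    void Start()
    {
        // Use the instanced mesh so we don't modify the shared asset.
        mesh = GetComponent<MeshFilter>().mesh;
        baseVerts = mesh.vertices;
        directions = new Vector3[baseVerts.Length];
        for (int i = 0; i < directions.Length; i++)
            directions[i] = Random.onUnitSphere;   // random drift direction per vertex
    }

    void Update()
    {
        age += Time.deltaTime;
        var verts = new Vector3[baseVerts.Length];
        for (int i = 0; i < verts.Length; i++)
            verts[i] = baseVerts[i] + directions[i] * driftSpeed * age;
        mesh.vertices = verts;
        mesh.RecalculateBounds();
    }
}
```

The smoke texture and the alpha-blended shader would go on the mesh’s material as usual; this script only handles the vertex drift.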
Still, I’m amazed by the way he put realtime lighting and a distortion effect in a mobile game. I’ve never seen any mobile game using both of them; the last time I tried to port Survival Shooter to Android, the realtime lighting from the muzzle flash effect made the game unplayable.
It’s also very much a black box and difficult to work with in a large project. They are apparently working on it, but their Unity version suffered from a lack of attention for a while.