Hello. I'm building a mobile game with a lot of user interface and particle effects. I started with the Unity UI system (canvases, images, etc.), and at some point I realized that it's very difficult to mix UI with particle systems because of sorting issues.
Now I use the following approach:
- Place UI elements on a separate canvas and set the sorting layer and order on that canvas.
- Set the particle system's sorting layer and order in its Renderer module.
- It works perfectly, but as a result I have a lot of canvases, and I have a bad feeling about the performance…
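For reference, the sorting setup described above can be sketched like this (a minimal example, not your exact project; the layer name "UI" and the order values are assumptions):

```csharp
using UnityEngine;

// Sketch of the "sandwich" approach: one canvas behind the particles,
// one in front, all on the same sorting layer with increasing orders.
public class SortingSetup : MonoBehaviour
{
    public Canvas backgroundCanvas;   // UI drawn behind the particles
    public Canvas foregroundCanvas;   // UI drawn in front of the particles
    public ParticleSystem particles;

    void Start()
    {
        backgroundCanvas.sortingLayerName = "UI";
        backgroundCanvas.sortingOrder = 0;

        // ParticleSystemRenderer exposes the same sorting fields as other renderers.
        var psRenderer = particles.GetComponent<ParticleSystemRenderer>();
        psRenderer.sortingLayerName = "UI";
        psRenderer.sortingOrder = 1;  // between the two canvases

        foregroundCanvas.sortingLayerName = "UI";
        foregroundCanvas.sortingOrder = 2;
    }
}
```

Each UI "slice" needs its own canvas, which is why the canvas count grows so quickly with this approach.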
So my questions are:
- Is this the optimal approach to mixing UI and particles? If not, what's better?
- How many active canvases is OK? Is 20-30 too many?
Thanks!
Whenever I want to put particles in UI, I usually render them with another camera to a Render Texture and include that in my UI. But that workflow doesn't work for every use case. Alternatively, if your canvas is in camera space rendering to an orthographic camera, you could just move the particle system closer to or further from the camera, which changes whether it appears in front of or behind the UI.
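For anyone trying the Render Texture route, the wiring is roughly this (a sketch under assumptions: the particle camera's culling mask is already set to a particles-only layer, and the texture size is arbitrary):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch: a second camera renders only the particle layer into a
// RenderTexture, which a RawImage then displays inside the canvas.
public class ParticlesToUI : MonoBehaviour
{
    public Camera particleCamera;  // culling mask: particle layer only (assumed set in the editor)
    public RawImage target;        // RawImage placed anywhere in the UI hierarchy

    void Start()
    {
        var rt = new RenderTexture(512, 512, 16);
        particleCamera.targetTexture = rt;
        particleCamera.clearFlags = CameraClearFlags.SolidColor;
        particleCamera.backgroundColor = Color.clear;  // keep the background transparent
        target.texture = rt;
    }
}
```

Because the RawImage is an ordinary UI element, it sorts with the rest of the canvas for free; the downside is an extra camera and texture per effect.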
There is this, on the asset store. I can’t speak for how well it works, but it seems like it’s a custom particle system specifically for Unity UI.
I think rendering particles to a render texture is actually a bad idea. It works, but it's not a general solution. I have a big game with a hundred prefabs and a lot of sorting cases where particles must be rendered between different UI objects, and I want to manage all that stuff easily. It should work like a constructor, where everything sorts correctly based on some rules…
I know there are 3rd-party solutions, but I think native Unity particles are better.
I can’t believe there is no good solution for sorting.
Looks like building the entire game on the UI system was a mistake.
I'm thinking about rebuilding it without UI, using pure sprite renderers and a custom UI system.