We need to render a particle system between “Background” and “New Level”. Right now the canvas render mode is “Screen Space - Camera”, so we can only put the particles over the whole UI or under the whole UI.
How can we make it work the way we expect?
For this you would need two canvases: one to draw the content behind the particle system and one to draw the content in front. This splits up the draw calls, allowing the particle system to render in between. In the future we are hoping to support this without multiple canvases.
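A rough sketch of what that setup could look like in code (the component and field names here are just illustrative, not an official API):

```csharp
using UnityEngine;

// Illustrative sketch of the two-canvas split. Hierarchy:
//   BackgroundCanvas (Screen Space - Camera) -> “Background” image
//   Particles        (ParticleSystem)
//   ForegroundCanvas (Screen Space - Camera) -> “New Level” elements
public class TwoCanvasParticleSandwich : MonoBehaviour
{
    public Canvas backgroundCanvas;
    public Canvas foregroundCanvas;
    public ParticleSystem particles;

    void Start()
    {
        // All three share the default sorting layer, so sortingOrder alone
        // decides draw order: background (0) < particles (1) < foreground (2).
        backgroundCanvas.sortingOrder = 0;
        foregroundCanvas.sortingOrder = 2;
        particles.GetComponent<ParticleSystemRenderer>().sortingOrder = 1;
    }
}
```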
Hi Phill
Do we have support for this in Unity 5.5.x?
Hi,
Any progress?
UGUI with particles and other elements in between?
Not supported yet :(
Three years later… no news about this??? It’s currently one of the more annoying things in UI, I think!
Yeah, it sucks. Breaking up the UI into different canvases is not trivial, especially when you have dynamic objects being added and removed. It would be nice to have a component you could add to a particle system, something like: UIBatchBreaker
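For what it’s worth, something close to that is already possible with a nested Canvas using overrideSorting; a rough sketch (the UIBatchBreaker name is hypothetical, not a real component):

```csharp
using UnityEngine;

// Rough sketch: a nested Canvas with overrideSorting starts a new batch,
// so a particle system can be drawn between this canvas and its parent.
[RequireComponent(typeof(Canvas))]
public class UIBatchBreaker : MonoBehaviour
{
    public int sortingOrder = 1;

    void OnEnable()
    {
        var canvas = GetComponent<Canvas>();
        canvas.overrideSorting = true;  // stop inheriting the parent’s sorting
        canvas.sortingOrder = sortingOrder;
    }
}
```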
This plugin looks nice: Unity Asset Store - The Best Assets for Game Making
Seems like it’s different from the others in that it doesn’t reconstruct every particle into a UIVertex. Although I could be wrong.
With NGUI I worked on a system that had different cameras (2D + 3D + 2D + 3D) with different depths and layers, and on the same screen I had different nested panels assigned to them; it worked perfectly, rendering things in every camera that needed them.
But in UGUI I can’t replicate the same behaviour: I have a parent Canvas set to Screen Space - Camera with my first camera assigned, but when I create a child canvas there is no option to change the camera it uses… Since the Canvas isn’t exposed in the Unity UI repo, does anyone know if I can hack this to get nested canvases rendering through different cameras?
Is there any other easy way to solve this nasty problem?
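The only workaround I can think of is separate root canvases, one per camera, stacked by camera depth, roughly like this (hypothetical sketch, names are mine):

```csharp
using UnityEngine;

// Sketch of the workaround: separate ROOT canvases, one per camera,
// layered by Camera.depth (replicating the NGUI 2D+3D+2D+3D sandwich).
public class LayeredCanvasSetup : MonoBehaviour
{
    public Canvas backCanvas;   // root canvas, Screen Space - Camera
    public Canvas frontCanvas;  // root canvas, Screen Space - Camera
    public Camera backCamera;
    public Camera frontCamera;

    void Start()
    {
        backCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        backCanvas.worldCamera = backCamera;
        backCamera.depth = 0;

        frontCanvas.renderMode = RenderMode.ScreenSpaceCamera;
        frontCanvas.worldCamera = frontCamera;
        frontCamera.depth = 1;  // renders after backCamera
        frontCamera.clearFlags = CameraClearFlags.Depth;  // don’t erase the back layer
    }
}
```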
Hi @CDF, you are not wrong: it doesn’t generate UIVertices for every particle. It lets you use the normal Unity Shuriken particle system together with a quickly generated, special UI depth/mask buffer. That’s far more efficient than generating UIVertices and much simpler than adding multiple canvases.
In my experience, and that of the GUI programmers at our company, adding particles or any other 3D objects to the GUI was… very hard. We tried scripts from the Asset Store and from GitHub that generated particle vertices, but they were not very efficient (especially on mobile) or user friendly. So I decided to make the UI Particle System / UI 3D-System plugins (the second one is for all types of 3D objects).
Cool, thought so. Does your system use the hierarchy order to determine draw order? Or do you need to separate things forward and back on the z axis?
It uses the Z axis for sorting order. I thought about hierarchy order, but in the case of 3D objects it would be hard to tell where a game object for models/particles should go (and in the end you would still have to put GUI elements at different Z positions, so keeping an additional hierarchy order would be an extra, unneeded step for the user). My plugin simply makes the GUI behave as 3D, even in the Scene view, so setup is very intuitive. Additionally, without using the Z axis you couldn’t have particles flying through GUI elements with soft-particle blending.
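To illustrate (a hypothetical sketch; it only assumes sorting follows world Z as described above, with +Z pointing away from the UI camera as in a default Screen Space - Camera setup):

```csharp
using UnityEngine;

// Hypothetical sketch: with Z-based sorting, draw order is just Z placement.
public class ZOrderExample : MonoBehaviour
{
    public RectTransform backgroundImage;  // drawn furthest back
    public Transform particles;            // drawn in between
    public RectTransform newLevelLabel;    // drawn in front

    void Start()
    {
        backgroundImage.anchoredPosition3D = new Vector3(0f, 0f, 10f);
        particles.localPosition = new Vector3(0f, 0f, 5f);
        newLevelLabel.anchoredPosition3D = Vector3.zero;
    }
}
```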