Hi, I published the SpriteToParticles package some time ago.
I’m making this thread to take questions and suggestions, and to share news about updates and that kind of stuff.
Update! Now it works with the Unity UI Image component. No support for sliced images at the moment though.
Stay tuned for more info
PS: if you already bought the plugin, share a screenshot! I really want to see what you are making with it.
This looks really really cool, and could be absolutely perfect for what I need with the new UI support,
however I’m not confident purchasing it currently, as both your documentation and WebGL preview links are broken; that entire site seems to be inaccessible?
Good idea, but I am wondering about the performance side of things, since the system seems to be reading the texture every frame to get the info. So if you have frame-animated sprites and they are large, isn’t it going to hit the CPU hard every frame?
I think this would work much better if it could be linked with a clever compute shader that somehow updates the particle system from there, instead of the CPU reading the texture every frame…
@SwiftIllusion That’s really weird, I checked the site and it’s up and running. Please try again; if you still can’t get in, then I guess I should call the hosting service…
Also, I uploaded the main UI Demo scene, it’s here
It doesn’t show up in the package description because I’m fixing some strange bugs reported by users and I’ll upload everything together along with a sample usage video.
@castor76 Well, actually, you get 2 main components: one static and one dynamic.
The static one will cache all the texture data only once and save the info needed to emit the particles. This is the component used in the Forest on Fire scene. It’s static in the sense that you wouldn’t change the source sprite image, but you can still move, scale, or rotate the game object.
The dynamic one, yes, will read the texture every frame you are emitting particles. This one is meant to be used with animating sprites, that is, when the sprite in the SpriteRenderer component is constantly changing, like in the first two scenes in the demo.
But you can cache those too. This component has a CacheSprites option that caches each sprite as it comes along. The catch is that memory usage grows with every new different sprite.
So it’s up to you whether you prefer more memory consumption or more CPU consumption.
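The CacheSprites trade-off described above can be sketched roughly like this (a minimal illustration only; the actual field and method names inside the asset may differ):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the dynamic emitter's CacheSprites trade-off: each new sprite's
// pixels are read once (CPU cost), then kept in memory from then on (RAM cost).
public class SpriteColorCache
{
    readonly Dictionary<Sprite, Color[]> cache = new Dictionary<Sprite, Color[]>();

    public Color[] GetPixels(Sprite sprite, bool cacheSprites)
    {
        Color[] cached;
        if (cacheSprites && cache.TryGetValue(sprite, out cached))
            return cached; // cache hit: no texture read this frame

        // The expensive part: reading the pixels back from the texture.
        // (Simplified: this reads the whole texture, ignoring the sprite rect.)
        Color[] pixels = sprite.texture.GetPixels();

        if (cacheSprites)
            cache[sprite] = pixels; // memory grows with every new sprite

        return pixels;
    }
}
```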
I spent a lot of time making the plugin as performant as it could be. I want the tool to be as robust as possible.
All the things I’m stating here are explained in the documentation file.
I don’t see how one could make this with a shader, as it uses the Shuriken particle system. The tool only works on emission; after that, the particle system processes the particles.
I hope that answers your question. If you have any more questions I’ll be glad to answer.
Ok, so it seems like it tries to cache the data at least, which is a good thing. In that case, the game will probably need to “warm up” all the animations before play starts, because we don’t want the player to experience any lag during this caching period. For this purpose, it would be nice to have an API where we can “supply” the sprites that are going to be changing, so that we can “bake” or prewarm the cache data before we let the game start. Without this API we need to force-“play” all the animations at runtime in order to cache things up, which gets a bit ugly.
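Such a prewarm API could look something like this (purely hypothetical; `DynamicEmitter` and `CacheSprite` are made-up names and not part of the asset):

```csharp
using UnityEngine;

// Hypothetical prewarm helper: feed every frame of an animation to the
// emitter during a loading screen, so no caching hitch happens in gameplay.
public static class EmitterPrewarm
{
    // 'DynamicEmitter' and 'CacheSprite' stand in for whatever the asset exposes.
    public static void PrewarmSprites(DynamicEmitter emitter, Sprite[] animationFrames)
    {
        foreach (Sprite frame in animationFrames)
            emitter.CacheSprite(frame); // do the texture read up front
    }
}
```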
Another optimization idea: instead of storing a Vector3 position, you could store a Vector2 and use the sprite’s base transform z value instead. This saves some memory at the cost of a possible Vector2-to-Vector3 conversion, but it may be a worthwhile tradeoff. If the game is flat 2D the conversion is automatic, because you can just assign a Vector2 to a Vector3.
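In Unity this is straightforward, since a Vector2 assigns implicitly to a Vector3 (z becomes 0). A small sketch of the idea (illustrative only, not the asset’s actual code):

```csharp
using UnityEngine;

// Store cached emission points as Vector2 (8 bytes each) instead of
// Vector3 (12 bytes): roughly a third less memory for the position cache.
public class EmissionPoints2D
{
    Vector2[] cachedPoints; // filled once when the texture is scanned

    public Vector3 EmitPosition(Transform spriteTransform, int i)
    {
        Vector3 pos = cachedPoints[i];      // implicit Vector2 -> Vector3, z = 0
        pos.z = spriteTransform.position.z; // take z from the sprite's transform
        return pos;
    }
}
```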
Regrettably that link still doesn’t work; even going to " http://numbloq.com/ " gives me “connection has timed out”. Have you tried visiting your site from another location/mobile etc.? Maybe you’re logged into its control panel or something that allows you to visit it without issue?
Additionally just out of interest regarding the performance discussion, would the GUI be factored as static or dynamic?
If an interface image is being scaled with DOTween, for example, would that still be static or dynamic, or would it not function for elements like that?
@castor76 I’ll add something like the API you’re suggesting, maybe in the next update.
@SwiftIllusion I called the hosting, they say everything is ok. I’ve asked some people around the world to check if they can access and they can. Let’s continue this in private.
At the moment (as specified in the documentation), UI support only works with the dynamic component. The static one will be added in the next update.
I haven’t checked DOTween yet, but it should work as expected. I’ll confirm this tonight when I get my hands on it.
I randomly get 2 errors in the console (not during play mode):
Shader error in ‘UI/Particles/Additive’: ‘v’ : undeclared identifier; ‘vertex’ : illegal vector field selection; ‘UnityObjectToViewPos’ : cannot resolve function call unambiguously (check parameter types) ‘z’ : vector field selection out of range at line 76 (on gles)
And the error is immediately followed by:
Shader error in ‘UI/Particles/Additive’: undeclared identifier ‘v’ at line 76 (on gles3)
So line 76 in UI_Particle_Add.shader
I can still run the game fine. Just thought I’d let you know. I’m running Unity 5.5.0f2
Is it possible to “prewarm” the particle system? I’d like the particles to begin as if they’d already been firing off beforehand on startup.
Is it possible to disperse the particles radially? Normally I’d do this using a sphere Shape. Seems I can only add forces in a particular direction.
Yeah, I’m fixing those things right now along with other issues that arose with Unity 5.5+
I’ll upload it tomorrow to the store.
No; I was researching how to make that work, but since the nature of the asset is to replace the particle system’s emission, that feature turned out to be quite tricky. I’ll keep trying to come up with something.
Again, the Emission and Shape modules are overridden by the asset. I’ll be adding some kind of effectors to deal with those situations, but it will take a while before they become production-ready because of performance.
These kinds of effectors are high on my todo list.
I am just playing around with your SpriteToParticles asset to create UI menu buttons and it’s really great so far.
However I have a small issue at the moment… the “Static UI Image Emitter” script doesn’t seem to work. Since it also doesn’t have the “Emission” properties in its settings, I wonder if it’s just not enabled? Sure, I can use the “Static Emitter Continuous UI”, but since these are only menu buttons that don’t move, wouldn’t it be more performant to just use the static emitter?
BTW, do you think it would be somehow possible to create some kind of “effects reuse”? What I mean is, if we copy several objects/images we always have to use a new emitter with a new renderer and a new particle effect instance, and this gets quite compute-intensive and slow on mobile devices…?
That’s right, the component “Static UI Image Emitter” won’t emit any particles. The components inside the “Core” folder are the base ones and won’t emit. This image from the documentation might clarify it a bit:
That inheritance diagram is for the Sprite based emission.
In the Image UI case that OneShot component doesn’t exist but I maintained the class structure.
The OneShot on UI doesn’t exist because it was meant to deal with relatively big images that emit a great amount of particles (as shown in the “Static Oneshot Emission Demo”, the one with the HTML5 logo).
This wouldn’t be wise to do with UI emission because the particle system’s renderer module is overridden by the asset and has a particle count limit.
So I maintained the class structure just in case someone really needs that particular component in the future. Unlikely, but possible.
About the “effect reuse”,
If you are talking about reusing the same texture, like 20 same-texture buttons running at the same time, yeah, that would be an awesome add-on: a way to cache the same image for different emitters.
I have to think about a nice way to handle it to be easy and straightforward to use.
But bear in mind that this would only mean less memory consumption.
You’ll still need an emitter, a renderer and a particle system component just like any normal particle system you would add in your scene.
Sorry for the late answer; I was busy studying the Unity courseware and forgot to check back on the forum (I thought I would get an email…)
Well, that answers my questions so far.
Regarding the idea behind image effect reuse… from what I can see, the performance hit comes from “scanning the image” itself for the colors (I guess for static images this is done only once, right?). But I saw that it gets very slow as soon as the image grows, so I guess scanning all the pixels of an image is the bottleneck. Couldn’t this process be done only once and then be reused? If you had multiple objects, you could apply the same effect to the other objects too, without the need to scan the image every time. Or am I thinking totally wrong here? I’m pretty new to Unity… that’s just what came to mind.
The idea is that I want to create a trading card game (that’s also why I may use the UI canvas) with different border lighting effects on the cards, etc., but I need this to be very performant and copied over to several objects. As we know, mobile devices are limited, so I wonder how I could achieve this.
Ok, let’s separate concepts:
1 - There’s the reading of the texture from Unity: the call to the GetPixels method on the texture (Unity - Scripting API: Texture2D.GetPixels).
2 - There’s the reading from the Color[] array returned by the above method.
Static emission will do 1 and 2 only once. This is done in the CacheSprite() method.
One would do that in the loading state of the game for big textures, either using the “CacheOnAwake” setting or calling it manually at a preferred moment, as it will take some milliseconds to load (always depending on the texture/sprite size).
After the cache is available (there’s an event to attach to for when caching completes), the performance only depends on the amount of particles you are emitting. It doesn’t matter if you have a 16x16 texture or a 4096x4096 texture.
Dynamic emission will do 1 only once if the “CacheSprites” setting is enabled. This is per different texture referenced in the Sprite component or Image component (this depends whether you are using the Sprite version or the UI version).
If “CacheSprites” setting is disabled it will do 1 every frame.
It will always do 2 every frame; the cost of having the ability to change texture and/or the color source emission settings on the fly.
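In code terms, the two steps break down roughly like this (a simplified sketch, not the asset’s actual implementation; the alpha threshold is just an example of a color-source setting):

```csharp
using UnityEngine;

public class EmissionCostSketch
{
    Color[] cachedPixels; // result of step 1

    // Step 1: the expensive Unity call. Static emission, and dynamic emission
    // with CacheSprites enabled, do this once per texture; otherwise it runs
    // every frame.
    void ReadTexture(Texture2D texture)
    {
        cachedPixels = texture.GetPixels();
    }

    // Step 2: scanning the returned Color[] array. Dynamic emission does this
    // every frame, which is what allows changing the sprite or the color
    // source settings on the fly.
    int CountEmittablePixels(float alphaThreshold)
    {
        int count = 0;
        for (int i = 0; i < cachedPixels.Length; i++)
            if (cachedPixels[i].a > alphaThreshold)
                count++;
        return count;
    }
}
```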
The other concept: the ability to do 1 and 2 (using static emission) on one texture for several objects at the same time, aka sharing the cached texture between different objects.
That’s not available yet, but it’s on my todo list.
Again, this feature would only have an effect on the memory footprint.
In conclusion, for your specific case of the card game I would go with static emission. Remember that with static emission you can still change position, scale and rotation on the fly.
Well, then I guess you are right and my suggestion of sharing the “effect” would not really help performance. I thought, or hoped, there was a way to e.g. completely reuse an emitter to share its effect, like applying a shader to a material and then reusing that material on several objects.
But at least it would help on the memory side of things
But I ran into a problem: the number of materials increases infinitely.
I changed A to B and it was okay.
Please make sure there is no problem with B.
Lines 79~83 in UIParticleRenderer.cs:
A. Original:
// A new Material is allocated on every call, even when 'material' is already
// assigned, so unused materials pile up.
Shader foundShader = Shader.Find("UI/Particles/Additive");
Material pMaterial = new Material(foundShader);
if (material == null)
    material = pMaterial;
B. Fixed:
// The shader lookup and allocation now only happen when 'material' is missing.
if (material == null)
{
    Shader foundShader = Shader.Find("UI/Particles/Additive");
    Material pMaterial = new Material(foundShader);
    material = pMaterial;
}
Thanks for this asset, it’s really very useful.
However I can’t get UI particles to work on Android (including the demos). Could there be any project wide settings which I need to change to get this working?
That’s weird, it should work out of the box, can you send me an email stating:
-Unity build version,
-screenshot or info from PlayerSettings (Edit->Project Settings->Player)
-screenshot or info from GraphicsSettings (Edit->Project Settings->Graphics)
-are you using the provided UI shaders in the package?
If you could also send me an .apk to test that would be sweet.