Particle Playground (Framework for controlling particles)

What’s this?
Particle Playground is a toolset that opens up new creative possibilities for altering and rendering particles in Unity. The framework provides a couple of different methods for emitting particles, as well as ways to control them with different forces during their lifetime.

Now released!

Important
Please use this thread instead: http://forum.unity3d.com/threads/215154-Particle-Playground

A new site is up showcasing the framework, try it out here: http://playground.polyfied.com/

Feedback in the thread is more than welcome!






Previous:
The last couple of days I’ve been working on a framework for controlling the Shuriken particle system, where you can pass in an image or mesh and rebuild it as particles with a pool (for performance).

The particles follow something the framework calls states; a state can be looked at as a snapshot where position, color and size are stored. You can pass in an image or mesh with a couple of different parameters (for instance an image together with a depth map to give the image z-positions). You can also additively add onto a state, for instance storing several meshes in one state. On top of this there are some built-in tools for the more common tasks, for instance linear interpolation (lerp) of a complete pixel2particle object. This makes it easy to blend between states, regardless of whether they are images or meshes.
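To give a rough idea of the state concept, here's an illustrative sketch in Python pseudocode (the real framework is C#/Unity, and these class and function names are my own, not the actual Playground API): a state is just per-particle position, color and size, and lerping two equal-sized states blends every particle attribute.

```python
class ParticleState:
    """A snapshot of per-particle attributes (hypothetical structure)."""
    def __init__(self, positions, colors, sizes):
        # positions: list of (x, y, z); colors: list of (r, g, b, a);
        # sizes: list of floats; one entry per particle.
        self.positions = positions
        self.colors = colors
        self.sizes = sizes

def lerp(a, b, t):
    return a + (b - a) * t

def lerp_states(state_a, state_b, t):
    """Blend two states with equal particle counts; t=0 gives state_a,
    t=1 gives state_b. Works the same whether the states came from an
    image or a mesh, since both are just particle snapshots."""
    positions = [tuple(lerp(pa, pb, t) for pa, pb in zip(p0, p1))
                 for p0, p1 in zip(state_a.positions, state_b.positions)]
    colors = [tuple(lerp(ca, cb, t) for ca, cb in zip(c0, c1))
              for c0, c1 in zip(state_a.colors, state_b.colors)]
    sizes = [lerp(s0, s1, t)
             for s0, s1 in zip(state_a.sizes, state_b.sizes)]
    return ParticleState(positions, colors, sizes)
```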

These states are not compatible with animated skinned meshes in the scene, due to their more complicated vertex access level and the fact that they are live in the scene. Therefore there's an extension for something the framework calls world objects and skinned world objects. A world object is a state that is live in the scene and computed in realtime; a good example is the skinned mesh emitter demo, where the particles position themselves on every mesh vertex and then fade out before being reused in the particle pool.
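The world-object idea boils down to this per-frame loop, sketched here in Python pseudocode (the field names and function are hypothetical, not the framework's code): each pooled particle is pinned to a vertex index, re-reads that vertex's current world position every update, and fades its alpha towards zero as it approaches reuse.

```python
def update_world_object_particles(particles, vertex_world_positions):
    """particles: list of dicts with 'vertex', 'age', 'lifetime', 'alpha'.
    vertex_world_positions: the mesh's current vertex positions in
    world space, recomputed each frame for a skinned mesh."""
    for p in particles:
        # Follow the (possibly animating) vertex in realtime.
        p["position"] = vertex_world_positions[p["vertex"]]
        p["age"] += 1
        # Fade out so the particle is invisible by the time the pool
        # recycles it.
        p["alpha"] = max(0.0, 1.0 - p["age"] / p["lifetime"])
        if p["age"] >= p["lifetime"]:
            p["age"] = 0  # recycle in place: same pooled particle, reset
    return particles
```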

You can try it out here:
Pixel2Particle images and meshes demo
Pixel2Particle skinned mesh demo
Pixel2Particle skinned mesh emitter demo
Pixel2Particle depth mapping demo
Pixel2Particle manipulators demo
Keys: W zoom in, S zoom out, Mouse rotate

The system now calculates particle behavior in its own Update cycle, which makes it very handy to work with manipulator objects. You can try it out in the latest webplayer.

Are there any features you would like to see in it? I intend to release it in the near future when it’s a bit more polished.



That’s actually really cool. I can’t say I have a use for it at all, but still pretty cool.

Might be pretty cool to do something like this for a breaking glass effect.

Thanks, yeah I’m not sure what I would use the actual implementation for as is either. :slight_smile:
I’m thinking perhaps evolving this towards being able to lerp through images and make the particles shift over time, maybe become meshes as well. Futuristic holographic workstation-ish. The actual use would perhaps be a better fit for medical or “scientific” purposes in the end.

Meh… cool is cool, it doesn’t need to have a practical application right away. :wink:

One thing that might be cool is if the pixels could be generated from a texture on a mesh and positioned in space relative to the origin spot. You could use it to create effects like characters turning to dust, or swirling, or morphing between two things in a visually interesting way. This may be pretty expensive to do in real time on the fly.

Very cool though. Look forward to seeing where this goes.

Very cool!! Love it. Yes could definitely be used as a transition state.

Thanks guys, really nice feedback, I appreciate it. I like the idea of transition states; the layout isn’t too far off from just an overload using several states. I’ll go down that road and let you know what I find. :wink:

Placing them onto a mesh with regard to UV mapping should be fully possible; it’s a really good idea. I have a hunch it will require some work though, but it’s worth digging into.

It will be expensive for sure; perhaps I can implement a resolution setting, a power-of-two downsize. I’ll check that out as well.
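The power-of-two downsize would look roughly like this (a sketch in Python pseudocode with made-up names, not the eventual implementation): keep every 2**n-th pixel of the source grid, which cuts the particle count by a factor of 4**n.

```python
def downsample(pixels, width, height, n):
    """pixels: row-major list of length width*height. Returns the
    pixels kept at 1/2**n resolution in each axis, plus the new
    width and height (rounded up for non-divisible sizes)."""
    step = 2 ** n
    out = [pixels[y * width + x]
           for y in range(0, height, step)
           for x in range(0, width, step)]
    new_w = (width + step - 1) // step
    new_h = (height + step - 1) // step
    return out, new_w, new_h
```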

Wow that’s really clever! :smile:

Aside from using it for kickass transition effects (logos blowing away, images blasting into particles then reforming as another one etc) the first game-related idea I could think of was doing sci fi 3D billboards / displays. You could keep the pixel density deliberately coarse so it would look blocky when you approach it (like outdoor LED displays do, even though they look like TVs from far away) or even better, have the particles move out of the way when a character moves through them, kind of like a digital beaded curtain.

Oh! Every time an explosion goes off, the pixels fall out of alignment then recover!

That looks great!

I’d like to be able to use Normal Maps with this to specify the z-depth of objects in the image. This could be used for creating true geometry from images instead of the faked geometry of shaders. It would probably require some pixel anti-aliasing to look solid, though.

Taking that approach may also be the solution to creating a mesh. The mesh would be constructed along the color changes in the Normal Map, rather than making each pixel a vertex.

I had some time to work on an update tonight to implement functionality for texture arrays with transitions. What you can basically do now is let the particles play a (short) movie. All it does is create classes with information about each image; then there’s a couple of helper functions and classes that let you work through the different states.
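The playback logic for such a sequence is essentially this (an illustrative Python sketch; the function name and parameters are mine, not the framework's): pick the pair of image states surrounding the current time and a blend factor to lerp between them.

```python
def frame_blend(num_frames, t, frame_duration):
    """For a looping image sequence, return (frame_a, frame_b, blend):
    the indices of the two states to lerp between at time t, and how
    far along the transition is (0..1)."""
    pos = (t / frame_duration) % num_frames
    a = int(pos)
    b = (a + 1) % num_frames  # wrap around so the movie loops
    return a, b, pos - a
```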

You can try it out here: http://www.polyfied.com/development/unity/pixeltoparticles/webplayer.html

I’m thinking I would like to have manipulators for this as well, so you can interact with the particles a little more intuitively.

I like the ideas. It’s a bit expensive, but when I get in an implementation for resolution it should be a better fit for a game scene. Letting the player interact with the particles is something I’d really want as well in the end; I’ll look into creating a couple of tools there. The good thing about having a pool of particles is… having a pool of particles :slight_smile: but it’s also very efficient when iterating through them and letting algorithms take hold of them.
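For anyone unfamiliar with the pooling argument: here's a minimal sketch (Python pseudocode with hypothetical names; the actual framework pools Shuriken particles in C#) of why a fixed pool helps. Particles are preallocated once and recycled, so emitting never allocates, and iterating the fixed array is cheap for force/manipulator algorithms.

```python
class ParticlePool:
    """Fixed-size pool: emit() recycles preallocated slots instead of
    allocating new particles."""
    def __init__(self, size):
        self.particles = [{"alive": False, "index": i} for i in range(size)]
        self.free = list(range(size))  # slot indices ready for reuse

    def emit(self, position):
        if not self.free:
            return None  # pool exhausted; a reuse policy could kick in here
        i = self.free.pop()
        p = self.particles[i]
        p["alive"] = True
        p["position"] = position
        return p

    def recycle(self, p):
        p["alive"] = False
        self.free.append(p["index"])
```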

This sounds very interesting; I’m afraid I’ll have to read up a bit on normal maps in general, but it sounds intuitive and like something you would want to be able to do. Heightmapping is just a z away. :slight_smile:

Things are going quite well at the moment, but there’s a lot to be done before this sees daylight. The most obvious addition is that states now handle meshes and/or images.

Same link as before to see it in action (web player).

Here’s what I’ve been working on the last days:

  • Mesh support
  • States seamless to images and meshes
  • Heightmap support (nixter’s request)
  • Resolution support
  • Size support
  • Possibility to add new images and meshes into existing Pixel2Particle system

Future support:

  • Additive add (support for adding onto a state, for instance several meshes into one state)
  • Vertex-independent mesh structure (mapping more particle points than there are vertices)
  • Particle count scaling (rebuild particle pool according to current state positions - choice by user as a function)
  • Manipulators (tools for interacting with the particles, FlaxSycle’s request)
  • Functions for moving and animated meshes (non-state realtime mesh update)
  • More example algorithms (for controlling the particles)

What I struggle with now is the concept of adding particle positions over meshes, where some should be independent of the vertices (nixter’s request). In a perfect world I would like to structure the particles according to each vertex with a grid in between (I’m thinking some barycentric-ish solution), where each particle inside is colored by sampling the texture at its interpolated UV coordinate. Preferably with approximately the same spacing independent of vertex positions. I have a lot to learn here, and I’m looking for input from someone experienced in this if you have the time! I guess it’s what shaders commonly do in pixel shading (wild guess at the moment). I would also like to sort the mesh particles to match the image’s pixel positions, to get better fluidity in transitions between meshes and images, but that’s a later problem.
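The barycentric-ish idea can be sketched like this (Python pseudocode, illustrative only; the function and the `rng` parameter are mine): sample points uniformly over each triangle using barycentric weights, and interpolate the vertex UVs with the same weights, so particle placement no longer depends on vertex density.

```python
import random

def sample_triangle(p0, p1, p2, uv0, uv1, uv2, rng=random.random):
    """Pick a uniformly distributed point on triangle (p0, p1, p2) and
    the texture coordinate at that point, by interpolating the vertex
    UVs with the same barycentric weights."""
    u, v = rng(), rng()
    if u + v > 1.0:
        # Fold the point back into the triangle: (u, v) was in the
        # mirrored half of the unit square.
        u, v = 1.0 - u, 1.0 - v
    w = 1.0 - u - v
    point = tuple(w * a + u * b + v * c for a, b, c in zip(p0, p1, p2))
    uv = tuple(w * a + u * b + v * c for a, b, c in zip(uv0, uv1, uv2))
    return point, uv
```

With the UV in hand, the particle's color is just a texture lookup at that coordinate; repeating this per triangle (weighted by triangle area) gives roughly even spacing regardless of where the vertices sit.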

That’s pretty sweet!

Tell me…

When and HOW MUCH :smile:
I am extremely interested. The random effect reminds me of the Halo: Reach loading screen.

Just finished up realtime skinned mesh support. This lets a particle follow an animated mesh in world coordinates.

Test it in action here: Skinned mesh demo

Glad you like it!

Haha man, thanks. Well let’s just see how this turns out. :wink: I have high expectations of the final product so there might be a couple of iterations down the road.

This looks great!

When is it going to be out???:face_with_spiral_eyes:

Nice! My hopes are quite soon (can’t really say an exact date atm), it depends a bit on how everything welds together. :slight_smile:

This last webplayer looks amazing…great job!!!

Superb work! looks amazing, really great to see stuff like this.

Thanks guys!

I got a request from nixter to support depth maps, much like this example. I got the functionality in, along with the possibility to additively add on top of another state, but I’m unsure about the actual quality. Anyhow, here’s an image describing the method:

And a webplayer demo showing the depth map implementation in action (not super exciting interactivity in this one…). The two images, along with the depth map, are mashed down into one state (which makes it able to lerp into other states as well).
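The core of the depth-map method is simple enough to sketch (Python pseudocode; the function and `depth_scale` parameter are illustrative, not the framework's API): the color image supplies each particle's color, and the grayscale depth map pushes each particle along z in proportion to its brightness.

```python
def image_to_depth_particles(colors, depths, width, depth_scale=1.0):
    """colors: row-major (r, g, b) tuples; depths: grayscale values in
    0..1 of the same length. Returns (position, color) pairs, where
    x, y come from the pixel grid and z from the depth map."""
    particles = []
    for i, (color, d) in enumerate(zip(colors, depths)):
        x, y = i % width, i // width
        particles.append(((float(x), float(y), d * depth_scale), color))
    return particles
```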

I’m thinking next up is manipulators to interact with particles. :slight_smile:

Fantastic! I can’t wait to get my hands on this and try it out!

Do you have plans to work with GPU-based particle systems? For example:

TC Particles - Unity Asset Store
bajeo88’s free DX11 particle system - http://forum.unity3d.com/threads/172553-DX11-Particle-System-Free