Particle Systems, sorting, draw order and performance

Dear all,

I’m sweating a little over how to display some sensor data in a sensible manner these days. The sensor data are basically 2D images that make up cross-sections of a body of matter inside a container, and my task is to recreate, with 3D geometry, the objects that the sensor detected. The assignment is entirely analogous to the tomographic cross-sections generated by a CAT scanner in a hospital, and the algorithms that subsequently reconstruct the organ that was scanned.

I’m aware that 3D geometry is commonly reconstructed from cross-sections using marching cubes and edge detection, but the time I’ve been given doesn’t allow me to implement that algorithm properly just yet, so I’ve opted for something simpler to begin with: particles. To fake a volumetric reconstruction from the tomograms, I instantiate a grid of particles in a slice of space aligned with the position where the 2D image was detected. I then run through all the particles in the slice and give each one the color of the pixel it maps to in the tomogram. I made a little sketch (all hail PowerPoint) to try and illustrate the idea. Imagine you run this sensor past a pyramid lying on its side; the resulting tomograms might look like those in the image. Then I color grids of particles like the ones below. When placed close together and given a bit of jitter, they fake the reconstruction of the pyramid well enough to be satisfactory:

[Image: example tomogram cross-sections of the pyramid and the corresponding colored particle grids]
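In code, the slice-building step is roughly this (a minimal sketch only; the names, the jitter handling and the assumption that the tomograms arrive as Texture2Ds are all made up for illustration):

```csharp
using UnityEngine;
using System.Collections.Generic;

// Hypothetical particle record for one tomogram pixel.
public struct SliceParticle
{
    public Vector3 position;
    public Color color;
}

public static class SliceBuilder
{
    // Builds one grid of particle positions/colors from a tomogram texture.
    // 'sliceIndex' places the grid along the scan axis, 'spacing' is the
    // distance between pixels/slices, 'jitter' is the small random offset.
    public static List<SliceParticle> BuildSlice(Texture2D tomogram, int sliceIndex,
                                                 float spacing, float jitter)
    {
        var result = new List<SliceParticle>(tomogram.width * tomogram.height);
        for (int y = 0; y < tomogram.height; y++)
        {
            for (int x = 0; x < tomogram.width; x++)
            {
                var p = new SliceParticle();
                p.position = new Vector3(x * spacing, y * spacing, sliceIndex * spacing)
                             + Random.insideUnitSphere * jitter;
                p.color = tomogram.GetPixel(x, y);
                result.Add(p);
            }
        }
        return result;
    }
}
```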

Obviously, in the application there are far more particles per slice; 4,096, to be exact. To render them, I have a single particle system with a MeshParticleEmitter attached, whose mesh property is null and whose Emit property is disabled. Its animator has the “Does Animate” property set to false as well. Then, every update, a script just calls MeshParticleEmitter.Emit a number of times with each particle’s position and the color of the pixel it maps to, so I have full control over the colors and positions the particles end up at. This actually works pretty well, especially since I was able to multithread the instantiation and destruction of new slices and particle positions, so it doesn’t impact the framerate.
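The emit loop itself looks more or less like this (again just a sketch; SliceParticle is the hypothetical struct from above, and the five-argument Emit overload takes position, velocity, size, energy and color):

```csharp
using UnityEngine;
using System.Collections.Generic;

// Rough sketch of the per-frame emit loop, using the legacy
// MeshParticleEmitter / ParticleEmitter API described above.
public class SliceRenderer : MonoBehaviour
{
    public MeshParticleEmitter emitter;   // mesh left unassigned, Emit unchecked
    public float particleSize = 0.05f;

    // Called every frame with the current set of visible slices.
    public void EmitSlices(IEnumerable<List<SliceParticle>> slices)
    {
        foreach (var slice in slices)
        {
            foreach (var p in slice)
            {
                // Emit(position, velocity, size, energy, color):
                // zero velocity and a short energy, so the particle only
                // lives until the next update, with exactly the color we computed.
                emitter.Emit(p.position, Vector3.zero, particleSize,
                             Time.deltaTime * 2f, p.color);
            }
        }
    }
}
```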

Now, for the problem:

Particles aren’t depth-sorted. So when you view the slices from one of the sides, you frequently see particles belonging to slices further away drawn on top of particles that belong to slices closer to the camera. This means that if you look at the stack from the far right, down “through” the slices, you don’t see a large green square with a red border like you’re supposed to, but a constant green particle in the middle (because every slice has a green particle in the middle) and then a mishmash of flickering red/green particles, because there’s no telling which slice happened to Emit a particle first at any given time. So how do I fix that?

Are particles belonging to different emitters sorted with respect to their emitter? I.e., what if I use one particle system per slice instead of one overall particle system for all slices? Would the slices then at least be depth sorted with respect to one another (but still not individual particles in the same slice, of course)?

To be visually optimal, I need to display a total of about 16,000 particles at a time. Can I depth-sort those with bubble sort with any kind of reasonable performance, like Daniel Brauer suggested in this question? And if I do try that, what would I sort? The emitter’s particles array, i.e. this one? Are particles rendered in the order they appear in that array, so that I would want closer particles to have higher indices and overdraw the ones further away?

…Is there an easier way to accomplish this?

If you made it this far, thanks a lot for taking the time to read this incredibly verbose question. (Brevity doesn’t seem to be my strong suit.) :wink:

I’m not sure I understand exactly what you’re doing, but if the particle slices are generated by different objects, you could use renderer.material.renderQueue to sort the slices. Higher values are rendered over lower ones, so you should decrement the renderQueue value with increasing distance from the camera (typical values: 2800 - 3000).
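Something along these lines, for example (just a sketch; sliceObjects and the distance-to-queue mapping are arbitrary, and it assumes each slice is its own GameObject with its own emitter and material):

```csharp
using UnityEngine;

// Assigns a renderQueue per slice: nearer slices get a higher queue value,
// so they are drawn later and appear on top of the farther ones.
public class SliceQueueSorter : MonoBehaviour
{
    public Transform[] sliceObjects;   // one GameObject (with its own emitter) per slice

    void LateUpdate()
    {
        Vector3 cam = Camera.main.transform.position;
        foreach (var slice in sliceObjects)
        {
            float dist = Vector3.Distance(cam, slice.position);
            // Farther slices get a lower queue value (drawn earlier),
            // nearer slices a higher one (drawn later, over the others).
            int queue = 3000 - Mathf.Clamp(Mathf.RoundToInt(dist * 10f), 0, 200);
            slice.GetComponent<Renderer>().material.renderQueue = queue;
        }
    }
}
```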

This would only partially solve the problem: the slices would be drawn in the correct order, but within each slice the particles would remain unordered. When observed at nearly perpendicular angles, particles in the same slice could still overlap each other randomly.

A complete solution would require sorting all the particles, as you’ve said, but this is very time-consuming! The bubble sort algorithm could become a nightmare; it’s acceptable only for small arrays, since its average running time is proportional to N², so big arrays take an indecent amount of time. There’s a better algorithm called quicksort: it’s based on recursively partitioning the array, and takes time proportional to N*log(N). Besides these two, you can find several other algorithms in this article.
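For example, if particles really are drawn in the order they appear in the emitter’s particles array (as you asked above; I’m assuming so here), you could let the built-in Array.Sort, which is O(N*log(N)), do the work instead of hand-rolling the sort. Something like this sketch:

```csharp
using UnityEngine;
using System;

// Sketch of a full per-particle depth sort on the legacy ParticleEmitter.
// Assumes particles are drawn in array order, back to front.
public class ParticleDepthSorter : MonoBehaviour
{
    public ParticleEmitter emitter;

    void LateUpdate()
    {
        Vector3 cam = Camera.main.transform.position;
        Particle[] particles = emitter.particles;   // getter returns a copy

        // Sort farthest-first, so nearer particles are drawn later and
        // overdraw the ones behind them.
        Array.Sort(particles, (a, b) =>
            (cam - b.position).sqrMagnitude.CompareTo((cam - a.position).sqrMagnitude));

        emitter.particles = particles;              // write the sorted array back
    }
}
```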