GPU particles--stuck on the shader

Howdy folks! So here’s a big one. Perhaps some lovely guru can help, but this might be a direct call for help from Aras. Also, this is a repost: I noticed that I accidentally posted the original in Unity Support rather than Shaders.

I’m so close! I have a lot of material that I can post, but I want to keep the first message light. First of all, here is my goal: I’m trying to implement this CUDA SDK demo in Unity using a DX11 compute shader. It’s an old demo, so maybe there’s already some sweet-awesome new particle technique I should be trying instead, but I knew I had access to all the source, so this seemed like a good first foray into compute shaders. Thankfully I’m pretty familiar with the CUDA SDK, so parsing the example code wasn’t bad. I have to admit I’m really not sure which code is responsible for spawning more particles around the cursor, but anyway, here’s what I’ve gotten so far.

By the way, there’s another issue in that scene: the terrain doesn’t really want to take nice textures. I’m really not sure what’s going on; even with the Standard Terrain Assets textures on a terrain made with “Create Terrain”, everything comes out blurry. Anyway, ignore that and focus on the particles. There are 600K of them buzzing around, based on velocities sampled from a noise texture.

Here’s the first place where I got side-tracked and, unfortunately, let down. I happen to have a copy of Substance Designer that I never use, so I created a quick noise texture (just clouds → diffuse). At first I wanted Substance to spit out a 3D texture, but it looks like there’s no path for that kind of thing in either Substance or Unity. I know I can create a 3D texture by stacking those cloud textures, but it seems there’s no way to convert a ProceduralTexture to a Texture2D in order to sample its colors. Feature request! Think it would be easy enough to add a GetPixel function to ProceduralTexture?

So… next step: I just created a noise Texture3D from Random.value calls and used that. The issue I have now is that I need to sample this texture with interpolation in the compute shader. Oh noes! tex3D() and object.SampleLevel() don’t work! I ran into an issue getting access to the built-in samplers and, one way or another, gave up there. With a clever enough hack I got the random noise working OK, as you can see in the video.
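
In case it helps anyone following along, this is roughly what that step looks like on the C# side; just a minimal sketch, assuming the compute shader declares a Texture3D named VelocityField (all the other names here are mine):

using UnityEngine;

public class RandomNoiseVolume
{
    // Hypothetical helper: fill a Texture3D with random values and bind it to a compute kernel.
    public static Texture3D CreateAndBind(ComputeShader cs, int kernel, int size)
    {
        var noise = new Texture3D(size, size, size, TextureFormat.ARGB32, false);
        var colors = new Color[size * size * size];
        for (int i = 0; i < colors.Length; i++)
            colors[i] = new Color(Random.value, Random.value, Random.value, 1f);
        noise.SetPixels(colors);
        noise.filterMode = FilterMode.Trilinear;   // needed if you want interpolated samples
        noise.wrapMode = TextureWrapMode.Repeat;
        noise.Apply();
        cs.SetTexture(kernel, "VelocityField", noise);
        return noise;
    }
}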

Finally, though, the reason my example doesn’t look like the CUDA video is that I can’t get the darn shader to work. It’s pretty complex and, I’ll admit, it operates on some data that I’m not providing, but the positions at least are there. The issue, I think, is that the shader takes vertex positions and runs them through a geometry program that spits out quads; at least that’s the way it works in the sample. They labeled the geometry shader a “motion blur” shader, which I guess stretches a quad between the current position and the previous one.
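
For anyone curious, the idea (as I understand it) is something like the following; this is only a rough HLSL sketch of the point-to-quad expansion, not the demo’s actual code, and _ParticleRadius is a made-up uniform:

struct v2g { float4 posNow : TEXCOORD0; float4 posPrev : TEXCOORD1; };   // clip-space positions from the vertex stage
struct g2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

float _ParticleRadius;   // made-up half-width of the billboard

// Point in, quad out: a 4-vertex strip stretched between the particle's
// previous and current clip-space positions.
[maxvertexcount(4)]
void MotionBlurGS(point v2g input[1], inout TriangleStream<g2f> stream)
{
    // direction of motion in screen space (tiny epsilon avoids normalizing a zero vector)
    float2 dir  = normalize(input[0].posNow.xy / input[0].posNow.w
                          - input[0].posPrev.xy / input[0].posPrev.w + 1e-5);
    float2 side = float2(-dir.y, dir.x) * _ParticleRadius;

    g2f o;
    o.pos = input[0].posPrev + float4( side, 0, 0); o.uv = float2(0, 0); stream.Append(o);
    o.pos = input[0].posPrev + float4(-side, 0, 0); o.uv = float2(0, 1); stream.Append(o);
    o.pos = input[0].posNow  + float4( side, 0, 0); o.uv = float2(1, 0); stream.Append(o);
    o.pos = input[0].posNow  + float4(-side, 0, 0); o.uv = float2(1, 1); stream.Append(o);
    stream.RestartStrip();
}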

Anyway, I think my issue stems mainly from one of a few factors. The first is that I’m specifying a point cloud of vertices, not triangles, while trying to render quads. The second is that the smoke is rendered using Graphics.DrawProcedural as a camera shader, a technique borrowed from an example I’ve been following. I adapted the example to use a MeshRenderer so that I could see the particles in the scene and generally work with the shader more easily, but I found that there’s a hard cap of 65,000 vertices per mesh. Since I’m looking to use something like millions of verts, I was hoping to avoid that cap. I also went back and tried it as a mesh collider, with different but still unsatisfactory results.

Anyway this is what the “advanced” version of the shader looks like at the moment in Point and Triangle/Quad mode.

So, I’m happy to post some source code if this all sounds totally possible and maybe I’m just screwing it up. I have to admit that I was a little fuzzy on exactly which transformation matrices to use in certain places while porting the code; I’m not exactly familiar with straight-up GLSL shaders and their parameters.

Any ideas about how I can render more than 65K particles with something other than Graphics.DrawProcedural?

Use DrawProcedural. I am lazy, but if you search there’s a link (I think even in my post below) to Aras’s DX11 samples, which include a DrawProcedural call that draws some line loops. Changing that to draw billboard quads is not hard.
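
Something along these lines; a minimal sketch of the DrawProcedural route (buffer and property names are mine), where the material’s vertex shader reads a StructuredBuffer by SV_VertexID, so no Mesh and no 65K cap is involved:

using UnityEngine;

public class ProceduralParticleDraw : MonoBehaviour
{
    public Material particleMaterial;   // shader that indexes StructuredBuffer<float3> _Positions with SV_VertexID
    public int particleCount = 600000;
    ComputeBuffer positions;

    void Start()
    {
        positions = new ComputeBuffer(particleCount, sizeof(float) * 3);
        particleMaterial.SetBuffer("_Positions", positions);
    }

    void OnRenderObject()
    {
        particleMaterial.SetPass(0);
        Graphics.DrawProcedural(MeshTopology.Points, particleCount, 1);
    }

    void OnDestroy()
    {
        positions.Release();
    }
}

One other thing: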

There is some… well… polite conversation doesn’t allow me to use the words I’d like to… “not great” design regarding compute shader samplers.

Texture3D<float4> VelocityField;
SamplerState samplerVelocityField;

If you name them exactly like that, they will work. There’s another rule for custom samplers, but the naming scheme above will do the job. Again, I don’t have the link on hand, but some searching will turn it up.
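
Putting that together, here’s a sketch of how the sampling then looks from a kernel (plain Sample isn’t available in compute shaders, so use SampleLevel with an explicit mip; the buffer and uniform names are mine):

#pragma kernel Advect

Texture3D<float4> VelocityField;
SamplerState samplerVelocityField;     // "sampler" + texture name, as above
RWStructuredBuffer<float3> Positions;  // hypothetical particle positions
float3 VolumeSize;
float DeltaTime;

[numthreads(64, 1, 1)]
void Advect(uint3 id : SV_DispatchThreadID)
{
    // map the particle position into [0,1] texture coordinates and advect it
    float3 uvw = Positions[id.x] / VolumeSize;
    float4 vel = VelocityField.SampleLevel(samplerVelocityField, uvw, 0);
    Positions[id.x] += vel.xyz * DeltaTime;
}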

Another thing to keep in mind is that ARGB32 textures don’t have nearly enough precision for fluid/smoke simulation, and end up looking noisy and losing volume. And it seems ARGBHalf and ARGBFloat are not available for 3D textures :(. In the case of 2D textures, they’re only available on the GPU anyway (as RenderTextures).

The cap on vertices per mesh is 255 * 255 by the way. The difference of 25 probably wasn’t worth mentioning in the documentation ^^.

I think that the easiest way to create a 3D fluid or smoke simulation right now is to use a slice-based approach with compute shaders, essentially sampling from three 2D Pressure and three 2D Flow RenderTextures for each slice output. That way you can use RenderTextureFormat.ARGBFloat too.
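
Very roughly, I mean something like this on the C# side (all names are hypothetical and the kernel itself is left out); one Dispatch per slice, reading the slice and its two neighbours:

using UnityEngine;

public static class FluidSlices
{
    // Hypothetical sketch: step the simulation slice by slice, using 2D ARGBFloat RenderTextures.
    public static void Step(ComputeShader cs, int kernel,
                            RenderTexture[] pressure, RenderTexture[] flow,
                            RenderTexture[] output, int width, int height)
    {
        int depth = pressure.Length;
        for (int z = 0; z < depth; z++)
        {
            cs.SetTexture(kernel, "PressureBelow", pressure[Mathf.Max(z - 1, 0)]);
            cs.SetTexture(kernel, "PressureHere",  pressure[z]);
            cs.SetTexture(kernel, "PressureAbove", pressure[Mathf.Min(z + 1, depth - 1)]);
            cs.SetTexture(kernel, "FlowBelow",     flow[Mathf.Max(z - 1, 0)]);
            cs.SetTexture(kernel, "FlowHere",      flow[z]);
            cs.SetTexture(kernel, "FlowAbove",     flow[Mathf.Min(z + 1, depth - 1)]);
            cs.SetTexture(kernel, "Result",        output[z]);
            cs.Dispatch(kernel, width / 8, height / 8, 1);
        }
    }
}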

Or you could use one giant 2D RenderTexture representing the entire 3D texture, dispatch groups and threads in three dimensions, and calculate the flat index in each thread. Essentially you’d be sampling from a 3D texture without being limited by the lack of support for 3D RenderTextures.
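
Here’s a sketch of that flat indexing, assuming the slices are laid out as tiles inside one big 2D RenderTexture (the names and tile layout are mine):

#pragma kernel Step

RWTexture2D<float4> FlatVolume;   // one big 2D render texture holding all the slices
uint Width, Height, TilesX;       // slice size and number of tiles per row

uint2 FlatCoord(uint3 p)
{
    // slice p.z lands in tile column (p.z % TilesX), tile row (p.z / TilesX)
    return uint2((p.z % TilesX) * Width  + p.x,
                 (p.z / TilesX) * Height + p.y);
}

[numthreads(8, 8, 8)]
void Step(uint3 id : SV_DispatchThreadID)
{
    // read/write this thread's voxel through its flattened 2D coordinate
    float4 value = FlatVolume[FlatCoord(id)];
    FlatVolume[FlatCoord(id)] = value;   // placeholder for the actual simulation step
}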

[Edit] I think I’m missing the part where flow needs to be known in six dimensions… Oh well, just use the extra space in the Pressure texture. (Also, I haven’t actually looked at smoke simulation implementations yet and made some incorrect assumptions about similarities with 2D shallow water simulation. Perhaps they only need to sample once?)

As for rendering the thing, I’d almost assume they apply some form of ray marching shader to a box around it, or to a full-screen quad. They’re using particles? Really!? Wow. Okay :)

You could try to render it like a huge isosurface. I think you’d still need to apply ray marching to make it look like smoke.

[Edit #3527] Actually, googling for “ray marching isosurface” points you to this nice blog. Might be useful.

You can do a 3D floating point RenderTexture, but you can only operate on it with compute shaders. You can’t (yet) bind it as a pixel shader output… although you should be able to, and can in DX11, 10, and 9. You can also do this from a plugin, as I ended up doing.
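
For reference, this is roughly how a 3D ARGBFloat RenderTexture with random write access gets set up from script; a hedged sketch, and note that the exact property names have changed between Unity versions (older versions used isVolume instead of dimension):

using UnityEngine;
using UnityEngine.Rendering;

public static class VolumeTexture
{
    // Sketch: a 3D floating-point RenderTexture that a compute shader can write into.
    public static RenderTexture Create(int size)
    {
        var rt = new RenderTexture(size, size, 0, RenderTextureFormat.ARGBFloat);
        rt.dimension = TextureDimension.Tex3D;   // older Unity versions: rt.isVolume = true
        rt.volumeDepth = size;
        rt.enableRandomWrite = true;             // so it can be bound as a RWTexture3D in a compute shader
        rt.Create();
        return rt;
    }
}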

Thanks for the responses.

I’ve posted my broken shader code in another post. I haven’t gotten to the shadow buffer yet; right now I’m just trying to implement the motion blur and pixel shaders from the demo, and all I get is a bunch of noise. I guess I need to break it down and start from scratch, but could anyone take a look-see and tell me if I’m doing something blatantly wrong?

I ended up getting through this. It was just some errors that weren’t being reported. Blast!

How is it coming?

I am creating something very similar just now using my fluid system. Initial results are promising!

I realize you managed to solve your 65000 vertex limit problem with DrawProcedural, but for anyone else interested, another way to solve this is to create multiple meshes, apply the same shader to all of them, and add an offset parameter so that ‘VertexID + Offset’ will linearly index the whole buffer.
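 
A sketch of that workaround, assuming a shader that reads StructuredBuffer<float3> _Positions at index (_Offset + SV_VertexID); all the names here are illustrative:

using UnityEngine;

public class ChunkedParticleMesh : MonoBehaviour
{
    public Material particleMaterial;   // shader indexes _Positions with (_Offset + SV_VertexID)
    public int particleCount = 600000;
    const int ChunkSize = 65000;

    void Start()
    {
        for (int offset = 0; offset < particleCount; offset += ChunkSize)
        {
            int count = Mathf.Min(ChunkSize, particleCount - offset);

            var mesh = new Mesh();
            mesh.vertices = new Vector3[count];              // dummy positions; the shader ignores them
            var indices = new int[count];
            for (int i = 0; i < count; i++) indices[i] = i;
            mesh.SetIndices(indices, MeshTopology.Points, 0);
            mesh.bounds = new Bounds(Vector3.zero, Vector3.one * 10000f);  // large bounds so chunks aren't culled

            var go = new GameObject("ParticleChunk");
            go.transform.parent = transform;
            go.AddComponent<MeshFilter>().mesh = mesh;

            var mat = new Material(particleMaterial);        // per-chunk instance so each carries its own offset
            mat.SetInt("_Offset", offset);
            go.AddComponent<MeshRenderer>().material = mat;
        }
    }
}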