GPGPU calculations (general-purpose computing) using shaders

Hello all,

Has anyone tried to write a computationally expensive algorithm with shaders in a Unity game?

We are looking into high-performance simulation algorithms and might need to use the GPU. It would be good to know whether we can get information back out of the graphics card with Unity shaders. I assume so, given that we can use Cg to write shaders.

Thanks,

Yes, it’s possible, but you’d need Unity Pro in order to use Render To Texture.

It should make writing GPGPU programs in Unity much more straightforward than setting up an OpenGL/DirectX framework yourself.

Thanks,
We are already using RenderTexture.
Now I just need CUDA support for Unity :~)

You don’t necessarily have to use CUDA; you can do everything with shaders “the old way” (like the first GPGPU programs, before CUDA came out).

Just create a shader that applies the computation to each texel. Once it has run, there are several ways to get the result back:

  • download the texture from the GPU to the CPU and read (and sum, multiply, or whatever) every texel… easy but slow
  • create a “reduction” shader that reads a texel and applies the same function (summation, multiplication, …) to the three adjacent texels, thus reducing your texture from, say, 512×512 to 256×256 to 128×128 and so on down to 1×1 (a sketch follows this list).
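For the second approach, here is a minimal sketch of what a single sum-reduction pass might look like in ShaderLab/Cg (the shader name, the property layout, and the choice of summation are illustrative assumptions, not anyone's actual code). Each blit with this shader halves the texture by summing 2×2 blocks of source texels:

```
// Hypothetical sum-reduction pass: blit a 2N×2N source into an N×N
// RenderTexture with this shader; each output texel sums the 2×2
// block of source texels it covers. Repeat until the texture is 1×1.
Shader "Hidden/SumReduce" {
    Properties {
        _MainTex ("Source", 2D) = "white" {}
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_TexelSize; // x = 1/sourceWidth, y = 1/sourceHeight

            float4 frag (v2f_img i) : COLOR {
                // The output texel centre falls on the corner shared by
                // four source texels; offset by half a source texel to
                // sample each of their centres.
                float2 h = _MainTex_TexelSize.xy * 0.5;
                return tex2D(_MainTex, i.uv + float2(-h.x, -h.y))
                     + tex2D(_MainTex, i.uv + float2( h.x, -h.y))
                     + tex2D(_MainTex, i.uv + float2(-h.x,  h.y))
                     + tex2D(_MainTex, i.uv + float2( h.x,  h.y));
            }
            ENDCG
        }
    }
}
```

Swap the + for * to get a product reduction instead; after the final pass, reading back the single 1×1 texel is cheap.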

The second method is not so easy, but it works; I’ve used it several times.
GPGPU computation with this approach (applying a computation to every texel) also works great. Two years ago I did pathfinding on the GPU this way and it worked OK (it was a hobby project, so I never fully implemented it). You just have to rethink the algorithm enough that each texel has no dependencies on its neighbours within a pass.
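As a concrete illustration of the “computation per texel” idea, here is a minimal sketch of an explicit Euler integration step where each texel stores one particle’s state (the shader name, the two-texture layout, and the time-step property are assumptions made up for the example):

```
// Hypothetical per-texel physics step: position and velocity textures
// are the same size, one particle per texel. No texel reads any other
// texel, so the whole update is trivially parallel on the GPU.
Shader "Hidden/EulerStep" {
    Properties {
        _MainTex ("Positions", 2D) = "black" {}
        _Velocities ("Velocities", 2D) = "black" {}
        _DeltaTime ("Delta Time", Float) = 0.02
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _Velocities;
            float _DeltaTime;

            float4 frag (v2f_img i) : COLOR {
                float4 p = tex2D(_MainTex, i.uv);     // current position (xyz in rgb)
                float4 v = tex2D(_Velocities, i.uv);  // current velocity
                return p + v * _DeltaTime;            // x' = x + v * dt
            }
            ENDCG
        }
    }
}
```

Render the result into a second RenderTexture and ping-pong the two between frames, since a texture can’t be read and written in the same pass.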

But maybe you already knew all that stuff :stuck_out_tongue:

Thanks,
I am trying to offload some physics functions onto the GPU.