How can I draw an array of data (640x480) very fast?

Hello all,

I have an array of values from a Cellular Automata simulation (640x480 grid) that I would like to draw straight to screen. My understanding is that these are the only options available:

  • Make a Texture2D every frame and draw it as a GUI texture, taking a heavy framerate hit from Apply().
  • Draw it as particles, but that many particles (307,200) would kick the framerate in the face.
  • Draw it with OpenGL calls, but there's no GL_POINTS and it would be slow anyway.
  • Draw it as an Image Effect/Graphics.Blit, but passing the data to the shader would still require a Texture2D, so the same Apply() problem applies.

So... how can I draw/blit all that 2D pixel data the fastest way possible?

ANY trick, option or alternative would be most welcome. Thanks in advance.

A little more advanced, but you could also rewrite the cellular automata simulation to run inside a shader, applied to a fullscreen quad. If you need information about the previous state of the cells when evaluating, you can do this by making two RenderTextures and ping-ponging between them. Something like this:

  • Set rendertextureOne active
  • Set rendertextureTwo as a texture in the material with the cellular automata shader
  • Draw fullscreen quad with the cellular automata material
    • Use information from rendertextureTwo to calculate next step

---- Next frame ----

  • Set rendertextureTwo active
  • Set rendertextureOne as a texture in the material with the cellular automata shader
  • Draw fullscreen quad with the cellular automata material
    • Use information from rendertextureOne to calculate next step

If you can't fit all the information into one texture, you can always use more of these ping-pong pairs. This technique would speed up your calculations considerably, provided they are amenable to this kind of parallel processing.
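The ping-pong pattern above can be sketched on the CPU. This is only an illustration of the buffer swapping, not Unity code: two plain Python buffers stand in for the two RenderTextures, and Conway's Game of Life stands in for your automaton's rule.

```python
def life_step(src, dst, w, h):
    """Read the previous state from src, write the next state into dst."""
    for y in range(h):
        for x in range(w):
            n = sum(src[(y + dy) % h][(x + dx) % w]
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dx, dy) != (0, 0))
            dst[y][x] = 1 if n == 3 or (src[y][x] and n == 2) else 0

W, H = 8, 8
buf_a = [[0] * W for _ in range(H)]
buf_b = [[0] * W for _ in range(H)]
for x in (2, 3, 4):          # seed a "blinker": three live cells in a row
    buf_a[3][x] = 1

src, dst = buf_a, buf_b
for frame in range(2):
    life_step(src, dst, W, H)  # "render" into the inactive buffer
    src, dst = dst, src        # ping-pong: swap roles for the next frame

print(src[3][2:5])  # the blinker is back to [1, 1, 1] after two steps
```

The key point is that each step only ever reads from one buffer and writes to the other, which is exactly what the two-RenderTexture setup gives you on the GPU.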

It is actually possible, on OS X, to blit textures very quickly by having a plugin access the OpenGL texture handle and write the data directly through it.

This has been used successfully for putting high-resolution QuickTime movies into Unity.

The big issue is that it doesn't work on Windows, where Unity uses Direct3D, and updating D3D textures from plugins appears to be quite complex. Any ideas or pointers welcome.

You could divide that into a grid of smaller textures, and then update only the textures that actually change.
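Here's a rough sketch of that tiling idea, in Python rather than Unity code. The tile size, the `upload_tile` stand-in, and the hash-based change check are all illustrative assumptions; in Unity each tile would be its own small Texture2D, and the upload would be SetPixels32 plus Apply on just that texture.

```python
TILE = 80                      # assumed tile size; 640x480 splits into 8x6 tiles
W, H = 640, 480
COLS, ROWS = W // TILE, H // TILE

uploads = []

def upload_tile(tx, ty):
    """Stand-in for SetPixels32 + Apply on one tile's own small texture."""
    uploads.append((tx, ty))

def tile_hash(grid, tx, ty):
    """Cheap change detection: hash the tile's cell values."""
    x0, y0 = tx * TILE, ty * TILE
    return hash(tuple(tuple(row[x0:x0 + TILE]) for row in grid[y0:y0 + TILE]))

def update_changed_tiles(grid, prev_hashes):
    """Re-upload only the tiles whose contents differ from last frame."""
    for ty in range(ROWS):
        for tx in range(COLS):
            h = tile_hash(grid, tx, ty)
            if prev_hashes.get((tx, ty)) != h:
                upload_tile(tx, ty)
                prev_hashes[(tx, ty)] = h

grid = [[0] * W for _ in range(H)]
hashes = {}
update_changed_tiles(grid, hashes)   # first frame: all 48 tiles upload
first = len(uploads)
grid[10][10] = 1                     # a single cell changes...
update_changed_tiles(grid, hashes)
second = len(uploads) - first        # ...so only its tile re-uploads
print(first, second)  # 48 1
```

If your automaton only has activity in small regions, this cuts the per-frame Apply() cost roughly in proportion to how many tiles actually changed.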

I'd just use your first idea and update the texture every 0.1 s or so in a coroutine; it should still look plenty good without killing your frame rate.
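A rough illustration of how much that throttling saves, using a simulated clock in Python. The interval and frame rate are assumptions; in Unity this would be a coroutine yielding `WaitForSeconds(0.1)`.

```python
INTERVAL_MS = 100   # assumed throttle: rebuild the texture at most every 0.1 s
uploads = 0
last_ms = -INTERVAL_MS  # force an upload on the very first frame

def maybe_upload(now_ms):
    """Only pay for the expensive Apply() when the interval has elapsed."""
    global uploads, last_ms
    if now_ms - last_ms >= INTERVAL_MS:
        uploads += 1
        last_ms = now_ms

for frame in range(60):               # one second of frames at 60 fps
    maybe_upload(frame * 1000 // 60)  # frame time in integer milliseconds

print(uploads)  # 10 texture uploads instead of 60
```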

Write a shader!