Best way to frequently update a texture in a shader in Unity3D

Hi,
I am generating my own Perlin noise functions and methods, different from the ones shipped with Unity. I use that noise texture as a heightmap for the ground mesh of an environment. So far, so good.
The problem arises when I want to update/animate the texture. If the texture is a 32*32 array or smaller I get a decent 40 FPS, but if I raise the value a bit the framerate starts to drop dramatically.
In an attempt to make it more scalable and performant, I have considered two options:

  • (1) parallelize the C# code so that, for example, each “row” computation runs on a different Thread/Task.
  • (2) generate the noise directly in the shader, taking advantage of the inherently parallel architecture of the GPU render pipeline.

I am writing because I want to try option (2) (even though I have little to no experience writing shaders in Cg).
So here comes my question: I am having trouble finding a way to write a value from one of my functions to a texture inside the shader.
I presume I wouldn’t need a two-pass shader, since the computed value can be used on the fly in the same pass.
My problem is that I don’t know how to write/store directly to _MainTex within the shader, without doing the math in the C#/MonoBehaviour script and sending it back to the material’s _MainTex. That round trip seems to be the problem when I try to generate random grids larger than 64 * 64 cells.
After writing to the texture, I would use a sampler2D to read the value and apply the height.
This problem can be summed up in the following question: what is the most performant way to keep a material’s texture updated with new values?

Thanks in advance,
JC

First of all, definitely generate the noise in the shader; the bottleneck is almost certainly the transfer of the texture from the CPU to the GPU.

When it comes to implementation: to start, you could probably just generate the noise for each pixel on demand in the fragment shader, without writing it to a texture at all. But for performance reasons (to avoid regenerating the data across every on-screen fragment every frame) you would probably want to move to a two-pass setup, where a shader first writes the noise to a texture, and later that texture is bound to whichever shader needs it and sampled.
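To make the first, simplest variant concrete, here is a rough Cg/HLSL sketch of computing animated noise per fragment. `hash()` and `valueNoise()` are illustrative placeholders, not Unity built-ins; only `_Time` is a real built-in:

```hlsl
// Sketch of option one: generate animated noise per fragment, no texture pass.
// hash()/valueNoise() are placeholder implementations; any 2D noise works.
struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;
};

float hash(float2 p)
{
    return frac(sin(dot(p, float2(127.1, 311.7))) * 43758.5453);
}

float valueNoise(float2 uv)
{
    float2 i = floor(uv);
    float2 f = frac(uv);
    f = f * f * (3.0 - 2.0 * f);              // smooth interpolation
    float a = hash(i);
    float b = hash(i + float2(1, 0));
    float c = hash(i + float2(0, 1));
    float d = hash(i + float2(1, 1));
    return lerp(lerp(a, b, f.x), lerp(c, d, f.x), f.y);
}

fixed4 frag(v2f IN) : SV_Target
{
    // _Time.y (built-in, seconds) animates the noise every frame on the GPU.
    float n = valueNoise(IN.uv * 8.0 + _Time.y);
    return fixed4(n, n, n, 1);
}
```

The downside, as noted above, is that every on-screen fragment recomputes the noise each frame, which is what the two-pass setup avoids.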

Since you mentioned Cg I’m going to assume you’re on the built-in render pipeline. I’d probably start by learning how to blit to a texture with a custom shader and go from there. Keep in mind that shader programming is totally different from CPU-side programming, and a lot of things are done differently due to the parallel nature of everything.

Best of luck with your project.

Hi Kabinet,
thanks for your reply!
Yeah, as far as I understood it, I have to “blit” the texture to a render target; after a Google search I could see that. But apparently it has to be done on the CPU side, right? From a C# MonoBehaviour? My doubt still stands: is it possible, from within a shader, to write to a float array and pass that data structure to a Texture2D property of the shader?
I am still looking for the right command in Cg, and after looking into the NVIDIA docs I can’t seem to find anything related.
Wondering if this is possible at all…

It’s very much possible (blitting is an incredibly common thing to do), but you’re right that you need both C# and HLSL to do it. The command to blit something does indeed come from the CPU, but the code it executes (inside a shader) runs fully GPU-side. Also, you won’t be writing to an array; it’s more like using a shader to write the noise values into a really low-resolution texture, and then sampling that texture later when you need it. The standard shader pipeline can only really write to textures in the first place (though for all intents and purposes a texture is basically a 2D array of some defined size). You can write directly to buffers on the GPU with compute shaders, but that’s probably unnecessary (and unnecessarily complex) here.

You can Graphics.Blit() to blit your noise to a texture.

You can then use Material.SetTexture() to bind your texture to the material of whatever needs to sample it.
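Put together, the CPU side of those two calls can be as small as this sketch. `noiseMaterial`, `groundRenderer`, the 128x128 size, and the use of `_MainTex` are all assumptions about your setup, not anything prescribed by Unity:

```csharp
using UnityEngine;

// Sketch of the two CPU-side calls: Graphics.Blit runs the noise shader into
// a RenderTexture, and Material.SetTexture binds that RenderTexture to the
// material that will sample it. No pixel data ever comes back to the CPU.
public class NoiseUpdater : MonoBehaviour
{
    public Material noiseMaterial;    // material whose shader generates the noise
    public Renderer groundRenderer;   // mesh that samples the noise as a heightmap
    private RenderTexture noiseRT;

    void Start()
    {
        noiseRT = new RenderTexture(128, 128, 0, RenderTextureFormat.ARGBFloat);
        // Bind once; the RenderTexture's contents change, the binding doesn't.
        groundRenderer.material.SetTexture("_MainTex", noiseRT);
    }

    void Update()
    {
        // Runs the noise shader over every pixel of noiseRT, entirely GPU-side.
        // No ReadPixels, no Texture2D, no CPU round trip.
        Graphics.Blit(null, noiseRT, noiseMaterial);
    }

    void OnDestroy()
    {
        if (noiseRT != null) noiseRT.Release();
    }
}
```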

Yeah, I understand the process:

  • (1) first blit the noise into a texture.
  • (2) call SetTexture on the material object.

My sole concern is this: blitting is invoked from the CPU side, so if the noise is generated in the shader code (GPU side), how on earth can I access that noise from the CPU side in step (1) so as to blit it?

I have seen code like this in an example, but it is not working:

RenderTexture rt = RenderTexture.GetTemporary(
    TSDNoise.GetNumPointsAsInt(this.__numPoints),
    TSDNoise.GetNumPointsAsInt(this.__numPoints));
Texture2D tmpTex = new Texture2D(
    TSDNoise.GetNumPointsAsInt(this.__numPoints),
    TSDNoise.GetNumPointsAsInt(this.__numPoints));

RenderTexture.active = rt;
Graphics.Blit(null, rt, __materials[0]);
tmpTex.ReadPixels(new Rect(0, 0, TSDNoise.GetNumPointsAsInt(this.__numPoints), TSDNoise.GetNumPointsAsInt(this.__numPoints)), 0, 0);
__materials[0].SetTexture("_MainTex", tmpTex);
tmpTex.Apply();
RenderTexture.active = null;
RenderTexture.ReleaseTemporary(rt);
DestroyImmediate(tmpTex);

Since Blit is called from the CPU side, how do I access the noise generated in the shader?

I just want to do something like:

//in the shader
float noisevalue = CalculateNoise()
_MainTex.SetColor(new Color(noisevalue, noisevalue, noisevalue, 1.0))

//in the CPU side
Texture tex = material.MainTexture;
Graphics.Blit(tex, Button.RenderTargetTexture, ...)

Please bear with me, I am a total newbie at shader coding in Unity.
Thanks

To whoever reads this and needs a solution:
apparently there are two ways of doing it, as far as I understand:

  • (1) use a “ComputeShader” to generate the texture first and then inject it into the shader of your object’s material.
  • (2) use a “CustomRenderTexture”.

I read a little bit about both, and I think I am going the compute-shader way. This doesn’t mean that all the work is done on the GPU: in order to control the ComputeShader and assign the texture, you still have to do it on the CPU side (C#).
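For anyone landing here later, a minimal sketch of the CPU-side driver for option (1). The `.compute` asset, the “CSNoise” kernel name, the “Result”/“_Time” property names, and the 256x256 size are assumptions about your project; the essential parts are `enableRandomWrite` and `Dispatch`:

```csharp
using UnityEngine;

// CPU-side driver for a noise compute shader. It creates a writable
// RenderTexture, binds it to both the compute kernel and the sampling
// material once, then just dispatches the kernel every frame.
public class ComputeNoiseDriver : MonoBehaviour
{
    public ComputeShader noiseShader;   // your .compute asset
    public Material targetMaterial;     // material that will sample the noise
    private RenderTexture noiseRT;
    private int kernel;

    void Start()
    {
        noiseRT = new RenderTexture(256, 256, 0, RenderTextureFormat.RFloat);
        noiseRT.enableRandomWrite = true;   // required so the kernel can write to it
        noiseRT.Create();

        kernel = noiseShader.FindKernel("CSNoise");
        noiseShader.SetTexture(kernel, "Result", noiseRT);
        targetMaterial.SetTexture("_MainTex", noiseRT);   // bind once
    }

    void Update()
    {
        noiseShader.SetFloat("_Time", Time.time);
        // 256 / 8 = 32 thread groups per axis, matching [numthreads(8,8,1)]
        // in the kernel.
        noiseShader.Dispatch(kernel, 256 / 8, 256 / 8, 1);
    }

    void OnDestroy()
    {
        if (noiseRT != null) noiseRT.Release();
    }
}
```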

Thanks


So simple with the compute shader!
I followed the shader setup from the book “The Bible of Shaders in Unity” (pages 314 onwards), and now I am drawing the texture both as a label in the UI and as the _MainTex in the wireframe shader I am using!
This is so cool!

To answer my original question: there is no way to write to _MainTex from a vertex/geometry/fragment shader.
So the way I have found most comfortable is the ComputeShader class, which is meant for GPGPU (general-purpose GPU programming). It is ideal for generating the noise, injecting it into a material’s texture, and then sampling that texture from the material’s own shader.
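For completeness, a sketch of what the `.compute` kernel side of this can look like. The “CSNoise”, “Result”, and “_Time” names are placeholders that must simply match whatever the C# side passes to `FindKernel`/`SetTexture`/`SetFloat`, and the hash is just an illustrative noise stand-in:

```hlsl
// Illustrative .compute kernel: writes one animated noise value per texel.
#pragma kernel CSNoise

RWTexture2D<float> Result;   // writable target bound from C# (enableRandomWrite)
float _Time;

float hash(float2 p)
{
    return frac(sin(dot(p, float2(127.1, 311.7))) * 43758.5453);
}

[numthreads(8, 8, 1)]
void CSNoise(uint3 id : SV_DispatchThreadID)
{
    // One thread per texel; id.xy is the pixel coordinate.
    Result[id.xy] = hash(id.xy * 0.05 + _Time);
}
```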
