Is there a way to blend textures into a material based on a noise map?

I have a plane with an associated material and a randomly generated noise map as well as an array of 3 textures (64x64).

I want to associate 1 pixel in the noise map with a 64x64 stretch on the plane.
Depending on the value of that pixel (0, .5, or 1), I want to assign one of the 3 textures [A, B, and C] to that grid cell on the plane. For values in between, I want a blend of the two adjacent textures (for instance, a value of .25 on the noise map would be a 50/50 blend of textures A and B, and .75 would be a 50/50 blend of textures B and C. You get the idea…)

Would it be possible (and not too resource intensive) to just create an array of colors based on the above rules and then SetPixels it onto the plane’s material’s texture, or is there a quicker way to do it?
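
To make that concrete, here’s a rough sketch of the SetPixels approach I have in mind (the class and field names are just placeholders I’ve made up, and it assumes the noise map and the three 64x64 textures are imported as readable):

```csharp
using UnityEngine;

// Rough sketch of the SetPixels idea above. The class and field names are
// placeholders, and it assumes the noise map and the three 64x64 textures
// are import-flagged as readable.
public class NoiseBlendBuilder : MonoBehaviour
{
    public Texture2D noiseMap;             // one noise pixel per 64x64 tile
    public Texture2D texA, texB, texC;     // the three 64x64 source textures
    const int TileSize = 64;

    void Start()
    {
        GetComponent<Renderer>().material.mainTexture = BuildBlendedTexture();
    }

    Texture2D BuildBlendedTexture()
    {
        var result = new Texture2D(noiseMap.width * TileSize, noiseMap.height * TileSize);

        Color[] pixelsA = texA.GetPixels();
        Color[] pixelsB = texB.GetPixels();
        Color[] pixelsC = texC.GetPixels();
        Color[] tile = new Color[TileSize * TileSize];

        for (int ny = 0; ny < noiseMap.height; ny++)
        {
            for (int nx = 0; nx < noiseMap.width; nx++)
            {
                float n = noiseMap.GetPixel(nx, ny).grayscale; // noise value, 0..1

                for (int i = 0; i < tile.Length; i++)
                {
                    // Piecewise blend: 0..0.5 lerps A->B, 0.5..1 lerps B->C,
                    // so 0.25 is a 50/50 mix of A and B, and 0.75 of B and C.
                    tile[i] = n <= 0.5f
                        ? Color.Lerp(pixelsA[i], pixelsB[i], n * 2f)
                        : Color.Lerp(pixelsB[i], pixelsC[i], (n - 0.5f) * 2f);
                }

                result.SetPixels(nx * TileSize, ny * TileSize, TileSize, TileSize, tile);
            }
        }

        result.Apply(); // upload to the GPU once, after every tile is written
        return result;
    }
}
```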

I’ve seen people use shaders to accomplish things similar to this, but I’ve only seen shaders that blend based on the y coordinate of the point in space…

There are many ways to blend two textures. What blend mode do you want?


You might find that a simple opacity blend looks odd at 50% of each, but if that’s all you want to do then it’s just a matter of sampling your noise’s normalised pixel value… say, for argument’s sake, it’s 0.75.


Then you use that 0.75 as the opacity value for one layer, and 1 - 0.75, i.e. 0.25, as the value for the other.
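
In code, that weighted sum is just a per-channel lerp. A minimal sketch in Unity C# (the helper name is made up), assuming you’ve already sampled the two layers as Colors:

```csharp
using UnityEngine;

public static class OpacityBlend
{
    // t is the normalised noise value: the second layer gets weight t and the
    // first gets 1 - t. Color.Lerp computes layerA * (1 - t) + layerB * t per channel.
    public static Color Blend(Color layerA, Color layerB, float t)
    {
        return Color.Lerp(layerA, layerB, Mathf.Clamp01(t));
    }
}
```

Calling it with t = 0.75 gives the 25% / 75% split described above.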


Regarding:

"Would it be possible (and not too resource intensive) to just create an array of colors based on the above rules and then SetPixels it onto the plane’s material’s texture, or is there a quicker way to do it?"


You wouldn’t need to store anything; all the data is already in the textures, you just need to evaluate it based on the texture coordinates / pixel data.
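
If you did want to do that evaluation on the CPU rather than in a shader, it could look something like this (just a sketch; the class, method and field names are made up, and it assumes the textures are imported as readable):

```csharp
using UnityEngine;

// Sketch of evaluating the blend on demand from a UV coordinate instead of
// baking anything out. Names are placeholders, not a standard API.
public class NoiseBlendSampler : MonoBehaviour
{
    public Texture2D noiseMap;          // coarse noise, one value per tile
    public Texture2D texA, texB, texC;  // the 64x64 detail textures

    // uv is the plane's texture coordinate in 0..1.
    public Color EvaluateAt(Vector2 uv)
    {
        // Sample the noise at its coarse resolution, and tile the detail
        // textures once per noise pixel (fractional part of uv * noise size).
        float n = noiseMap.GetPixelBilinear(uv.x, uv.y).grayscale;
        float du = Mathf.Repeat(uv.x * noiseMap.width, 1f);
        float dv = Mathf.Repeat(uv.y * noiseMap.height, 1f);

        Color a = texA.GetPixelBilinear(du, dv);
        Color b = texB.GetPixelBilinear(du, dv);
        Color c = texC.GetPixelBilinear(du, dv);

        // Same piecewise rule as the question: 0 -> A, 0.5 -> B, 1 -> C.
        return n <= 0.5f ? Color.Lerp(a, b, n * 2f)
                         : Color.Lerp(b, c, (n - 0.5f) * 2f);
    }
}
```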