Hi,
I’ve been trying to use the blit functionality to do some texture generation on the GPU (I have been using the CPU, but it is far too slow!).
When attempting to blit with a material specified, I get corruption in the output. This happens with the built-in shaders as well as shaders I have written myself.
I have written an example script that shows the issue:
using UnityEngine;
using System.Collections;

public class BrokenTest : MonoBehaviour
{
    void Start()
    {
        // Small source texture, filled with noise on the CPU.
        Texture2D smallTexture = new Texture2D( 32, 32, TextureFormat.ARGB32, false );

        // Larger destination render texture for the GPU upscale.
        RenderTexture largeTexture = new RenderTexture( 256, 256, 0 );
        largeTexture.isPowerOfTwo = true;
        largeTexture.Create();

        FillTextureWithNoise( smallTexture );

        // This is the problematic call: blitting with an explicit material
        // corrupts the output.
        Graphics.Blit( smallTexture, largeTexture, new Material( Shader.Find( "Diffuse" ) ) );

        // Show the result on this object's renderer so the corruption is visible.
        Material materialTest = new Material( Shader.Find( "Diffuse" ) );
        materialTest.SetTexture( "_MainTex", largeTexture );
        renderer.material = materialTest;
    }

    private void FillTextureWithNoise( Texture2D noiseTexture )
    {
        for( int x = 0; x < noiseTexture.width; x++ )
        {
            for( int y = 0; y < noiseTexture.height; y++ )
            {
                float value = Random.value;
                noiseTexture.SetPixel( x, y, new Color( value, value, value, value ) );
            }
        }
        noiseTexture.Apply();
    }
}
What the script does is generate a small texture, fill each pixel with random noise, and then upscale that texture on the GPU into a larger RenderTexture.
The specific line with the issue is:
Graphics.Blit( smallTexture, largeTexture, new Material(Shader.Find("Diffuse") ) );
If I remove the ‘new Material’ argument and use whatever default material the blitter uses internally, it works properly.
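For reference, this is the variant that works for me (a minimal sketch using the two-argument Blit overload, which skips the explicit material entirely):

// Works correctly: no material argument, so Unity uses its internal blit material
Graphics.Blit( smallTexture, largeTexture );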
Here is a comparison of setting a material vs. not setting one (not setting a material results in the correct output).
Bad (mmmm random framebuffer values): [screenshot]
Good: [screenshot]
Is there something I am not doing to the material to ensure that it is valid? Is there a special material I should be using? If you want to test the script out, just attach it to a default cube or sphere; it will set up all the materials etc.
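In case it helps, I see the same corruption if I swap in another built-in shader; e.g. something like this (just a guess at a more blit-friendly material, using the built-in Unlit/Texture shader) gives the same result:

// Same corruption with other built-in shaders, e.g. the unlit one
Graphics.Blit( smallTexture, largeTexture, new Material( Shader.Find( "Unlit/Texture" ) ) );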