I’m generating some grayscale textures in code and need to convert them to normal maps (in a script) for use with bumped shaders. Has anyone tried that? I’d like to do it in the Update function, so it needs to be pretty fast. The grayscale step doesn’t need to be stored as a texture if that’s slower than plain arrays, I guess.
A good normal map filter is the “Sobel filter”: it takes derivatives in both directions with a 3x3 kernel, and you get the normal from those.
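As a rough JavaScript sketch of that idea (function names and the clamp-at-edges behavior are my own assumptions, not from any particular engine):

```javascript
// heights: row-major array of size w*h with grayscale values in [0, 1].
function clampIdx(v, max) { return Math.min(Math.max(v, 0), max - 1); }

function sobelNormal(heights, w, h, x, y, strength) {
  // Sample the heightfield, clamping coordinates at the edges.
  const s = (sx, sy) => heights[clampIdx(sy, h) * w + clampIdx(sx, w)];
  // Sobel X kernel: [-1 0 1; -2 0 2; -1 0 1]
  const dx = (s(x + 1, y - 1) + 2 * s(x + 1, y) + s(x + 1, y + 1))
           - (s(x - 1, y - 1) + 2 * s(x - 1, y) + s(x - 1, y + 1));
  // Sobel Y kernel: [-1 -2 -1; 0 0 0; 1 2 1]
  const dy = (s(x - 1, y + 1) + 2 * s(x, y + 1) + s(x + 1, y + 1))
           - (s(x - 1, y - 1) + 2 * s(x, y - 1) + s(x + 1, y - 1));
  // Normal = normalize(-dx * strength, -dy * strength, 1)
  const nx = -dx * strength, ny = -dy * strength, nz = 1;
  const len = Math.hypot(nx, ny, nz);
  return [nx / len, ny / len, nz / len];
}
```

On a flat heightfield both derivatives are zero and the normal comes out as straight-up (0, 0, 1); the `strength` factor is the usual knob for exaggerating the slopes.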
A simple way is: for each pixel, take the difference in heights horizontally and vertically. Imagine them as vectors, i.e. the horizontal step to the next pixel is (1, 0, deltaHeightHorizontal) and the vertical step is (0, 1, deltaHeightVertical). These vectors lie in the plane of the surface at that pixel, so their cross product gives the normal.
Finally, encode the normal so that each component maps -1.0 to 0, 0.0 to 128, and +1.0 to 255.
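The three steps above (difference vectors, cross product, encoding) can be sketched like this in JavaScript (the function names are illustrative):

```javascript
// dhdx / dhdy: height deltas to the next pixel horizontally and vertically.
function normalFromDeltas(dhdx, dhdy) {
  // Surface tangents at this pixel: (1, 0, dhdx) and (0, 1, dhdy).
  // Their cross product is (1,0,dhdx) x (0,1,dhdy) = (-dhdx, -dhdy, 1).
  const n = [-dhdx, -dhdy, 1];
  const len = Math.hypot(n[0], n[1], n[2]);
  return n.map(c => c / len);
}

function encodeNormal(n) {
  // Map each component from [-1, 1] to [0, 255].
  return n.map(c => Math.round((c * 0.5 + 0.5) * 255));
}
```

A flat pixel (both deltas zero) encodes to (128, 128, 255), which is exactly the uniform “normal map blue” mentioned later in the thread.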
All of this can be done on the GPU, of course, by sampling the heightmap and writing the normal map into a render texture.
My first attempt was kind of successful: just checking the grayscale difference of neighboring pixels and generating red and green values for the color from that. The problem is the ‘bumpiness’ isn’t very high, so the effect is not very visible. (Left: generated, Right: converted by Unity)
At the moment I just want to output the normal color to see how it works, but all I get is the “normal map blue” (0.5, 0.5, 1). I have an alpha heightmap in _MainTex, 128x128 px, and the fragment program looks like this:
Metervara, did you figure out why the normal map generated in your first example was so subtle? I’ve experimented with a JavaScript version here and it’s working well, but as with the example you posted above, it’s a very subtle effect.
Did you find a way to amplify the effect at all? I hope you don’t mind me sampling a bit of your code.
Cheers
AaronC
var bumpTexture : Texture2D;
var bumpSource : Texture2D;

function Start () {
    bumpSource = gameObject.renderer.material.GetTexture("_BumpMap");
    bumpTexture = new Texture2D(bumpSource.width, bumpSource.height, TextureFormat.ARGB32, false);
    for (var y : int = 0; y < bumpTexture.height; y++) {
        for (var x : int = 0; x < bumpTexture.width; x++) {
            // Sample the grayscale height of the four neighbors
            var xLeft : float = bumpSource.GetPixel(x - 1, y).grayscale;
            var xRight : float = bumpSource.GetPixel(x + 1, y).grayscale;
            var yUp : float = bumpSource.GetPixel(x, y - 1).grayscale;
            var yDown : float = bumpSource.GetPixel(x, y + 1).grayscale;
            // Central differences, remapped from [-1, 1] to [0, 1]
            var xDelta : float = ((xLeft - xRight) + 1) * 0.5;
            var yDelta : float = ((yUp - yDown) + 1) * 0.5;
            bumpTexture.SetPixel(x, y, new Color(xDelta, yDelta, 1.0, 1.0));
        }
    }
    bumpTexture.Apply();
    gameObject.renderer.material.SetTexture("_BumpMap", bumpTexture);
}
Sorry to bump an old post, but I found a way to enhance this effect and get the same result as Unity’s “create normal from grayscale” importer option.
First I added an S-curve on the deltas to give more control over the sharpness of the normal map created in the function. I find it looks good when the distortion varies between 10 and 20!
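The post doesn’t show the exact curve used, but a logistic-style S-curve like the following sketch (a guess at the shape, with `distortion` standing in for the 10-20 sharpness factor mentioned above) pushes values away from the 0.5 midpoint, steepening small height deltas:

```javascript
// Hypothetical S-curve: value in [0, 1], output in (0, 1), steeper
// around the 0.5 midpoint as distortion grows.
function sCurve(value, distortion) {
  return 1 / (1 + Math.exp(-distortion * (value - 0.5)));
}
```

Applied to the remapped delta (which is 0.5 for a flat area), this leaves flat regions untouched while amplifying any deviation from flat.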
GetPixel is very slow if you call it for every pixel, and even worse multiple times per pixel!
Here is a similar function with far fewer calls (one GetPixels call per column):
public static Texture2D CalculateNormal(Texture2D sourceImage, float strength = 1f)
{
    strength = Mathf.Clamp(strength, 0.0f, 10.0f);
    int w = sourceImage.width;
    int h = sourceImage.height;
    Texture2D res = new Texture2D(w, h);
    Color[] current_col = null;
    Color[] l_col = null;
    Color[] r_col = null;
    float sample_l, sample_r, sample_u, sample_d;
    float x_vector, y_vector;
    for (int x = 0; x < w; x++)
    {
        // Left column = previous column
        if (current_col != null) l_col = current_col;
        // Current column = right column
        if (r_col != null) current_col = r_col;
        else current_col = sourceImage.GetPixels(x, 0, 1, h);
        // If the left column is null (x == 0), take the current column
        if (l_col == null) l_col = current_col;
        // Right column = next column (every time except the last run)
        if (x < w - 1) r_col = sourceImage.GetPixels(x + 1, 0, 1, h);
        else r_col = current_col;
        for (int y = 0; y < current_col.Length; y++)
        {
            // Read the neighboring samples (left, right, up, down)
            sample_l = l_col[y].grayscale * strength;
            sample_r = r_col[y].grayscale * strength;
            if (y < h - 1) sample_u = current_col[y + 1].grayscale * strength;
            else sample_u = current_col[y].grayscale * strength;
            if (y > 0) sample_d = current_col[y - 1].grayscale * strength;
            else sample_d = current_col[y].grayscale * strength;
            // WebGL integer overflow on new Color() without Mathf.Clamp01()
            x_vector = Mathf.Clamp01(((sample_l - sample_r) + 1) * 0.5f);
            y_vector = Mathf.Clamp01(((sample_d - sample_u) + 1) * 0.5f);
            Color col = new Color(x_vector, y_vector, 1f, 1f);
            res.SetPixel(x, y, col);
        }
    }
    res.Apply();
    return res;
}
Because Unity’s grayscale-to-normal-map importer does not work well with tilemaps (its algorithm produces visible edges between tiles), I had to do it manually. sumpfkraut’s algorithm allows for seamless tiling, but its edges are not as sharp against transparent areas, so I modified it slightly.
public Texture2D CalculateNormal(Texture2D sourceImage, float strength)
{
    strength = Mathf.Clamp(strength, 0.0f, 10.0f);
    int w = sourceImage.width;
    int h = sourceImage.height;
    Texture2D res = new Texture2D(w, h);
    Color[] current_col = null;
    Color[] l_col = null;
    Color[] r_col = null;
    float sample_l, sample_r, sample_u, sample_d;
    float x_vector, y_vector;
    for (int x = 0; x < w; x++)
    {
        // Left column = previous column
        if (current_col != null) l_col = current_col;
        // Current column = right column
        if (r_col != null) current_col = r_col;
        else current_col = sourceImage.GetPixels(x, 0, 1, h);
        // If the left column is null (x == 0), take the current column
        if (l_col == null) l_col = current_col;
        // Right column = next column (every time except the last run)
        if (x < w - 1) r_col = sourceImage.GetPixels(x + 1, 0, 1, h);
        else r_col = current_col;
        for (int y = 0; y < current_col.Length; y++)
        {
            // Transparent pixels get a flat "normal map blue"
            if (current_col[y].a < 0.05f)
            {
                res.SetPixel(x, y, new Color(0.5f, 0.5f, 1f, 1f));
                continue;
            }
            // Read the neighboring samples (left, right, up, down);
            // transparent neighbors count as height 0 for sharper edges
            sample_l = l_col[y].grayscale * strength;
            if (l_col[y].a < 0.05f)
            {
                sample_l = 0f;
            }
            sample_r = r_col[y].grayscale * strength;
            if (r_col[y].a < 0.05f)
            {
                sample_r = 0f;
            }
            if (y < h - 1)
            {
                sample_u = current_col[y + 1].grayscale * strength;
            }
            else
            {
                sample_u = current_col[y].grayscale * strength;
            }
            if (y > 0)
            {
                sample_d = current_col[y - 1].grayscale * strength;
            }
            else
            {
                sample_d = current_col[y].grayscale * strength;
            }
            // WebGL integer overflow on new Color() without Mathf.Clamp01()
            x_vector = Mathf.Clamp01(((sample_l - sample_r) + 1) * 0.5f);
            y_vector = Mathf.Clamp01(((sample_d - sample_u) + 1) * 0.5f);
            Color col = new Color(x_vector, y_vector, 1f, 1f);
            res.SetPixel(x, y, col);
        }
    }
    res.Apply();
    return res;
}
Bumping this because someone here might be able to help me; I’m also asking on Unity Answers.
I’m calculating a curvature map from a height map in a shader. For small gradients it works nicely, but for larger gradients it creates jittering artifacts.
The first step in this shader uses a 3x3 Sobel filter to create the normal from the height map. Already at this point you can see that large gradients start showing the jittering artifacts, which then leads to artifacts in the curvature map as well.
Is there a way of getting a smoother normal map without passing it through another blur filter? Am I missing something in my implementation?
I’ve also tried different sample distances for the Sobel filter; some help a little, but no specific distance solves the issue.
The filtered texture is a 1024x1024 render texture, ARGB64 (16 bits per channel).