Normal Mapping not working correctly

Hey guys,
I’m in the process of writing a cel shading uber shader, and I keep coming back to this problem.
I think I’m doing something wrong in my normal calculations, because no matter how high a quality I export my normal map at (16 bit, no dithering), I still get pixelated shadows; they look worse than per-vertex normals.
Here are the values I use:

//In the vertex shader

// Normals transform by the inverse transpose, which is what multiplying the
// vector on the left by unity_WorldToObject gives you
o.worldNormal = normalize( mul( float4(v.normal.xyz, 0.0), unity_WorldToObject ).xyz );
o.tangent = normalize( mul( unity_ObjectToWorld, float4(v.tangent.xyz, 0.0) ).xyz );
// v.tangent.w holds the handedness sign, flipping the binormal on mirrored UVs
o.binormal = normalize( cross( o.worldNormal, o.tangent ) * v.tangent.w );

//In the fragment shader

// UnpackNormal handles Unity's platform-dependent normal map encodings
// (DXT5nm on desktop, plain RGB on mobile)
float4 tangentNormal = tex2D(_NormalMap, i.uv.xy);
float3 normalMap = normalize( UnpackNormal(tangentNormal) );

float3 localCoords = normalMap;

// The matrix rows are the world-space tangent basis, so mul(vector, matrix)
// takes the sampled normal from tangent space into world space
float3x3 TBN = float3x3(i.tangent, i.binormal, i.worldNormal);
float3 normal = normalize( mul( localCoords, TBN ) );

Also, here’s an image of the result after adding a normal map exported from Substance Painter.
It seems to be generally correct, it’s just pixelated at the edges. Am I overlooking something in the normal map calculations, or is the issue with something else?

Thanks for the help in advance!

  • Lukas

By default Unity will convert any Normal map to a compressed DXT5 texture. If you disable compression, it’ll use RGBA32 (8 bit per channel). This is true even if you use a 16 bit .tif or .png as the source image. What you’re seeing is likely an artifact of that. If you have a 16 bit per channel .tif, you can force it to use a 16 bit per channel format in Unity by changing the texture type to Default, disabling sRGB, and going into the platform overrides for the texture and setting it to use RGBAHalf.

But that still might not fix the issue you’re seeing entirely. A texture is a grid of values. Bilinear interpolation will make something like a color texture look a bit smoother, but the values are still being interpolated linearly. The result is that all normal maps will have artifacts like this if you sharpen them to this extreme.
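To make that concrete, this is roughly the math the hardware does for a single bilinear fetch. A sketch, not something you’d actually ship; it assumes Unity’s usual _TexelSize convention of float4(1/width, 1/height, width, height):

// Sketch of what one bilinear fetch computes, assuming
// texelSize = float4(1/width, 1/height, width, height)
float4 SampleBilinearManually(sampler2D tex, float2 uv, float4 texelSize)
{
    // Texel-center space: the center of texel n sits at t = n
    float2 t = uv * texelSize.zw - 0.5;
    float2 f = frac(t);
    float2 base = (floor(t) + 0.5) * texelSize.xy; // UV of the lower-left texel center

    // Fetch the four surrounding texels (sampling exactly at a center
    // returns that texel's raw value)
    float4 c00 = tex2D(tex, base);
    float4 c10 = tex2D(tex, base + float2(texelSize.x, 0));
    float4 c01 = tex2D(tex, base + float2(0, texelSize.y));
    float4 c11 = tex2D(tex, base + float2(texelSize.x, texelSize.y));

    // Plain linear blends in x then y - there's no extra smoothing beyond this
    return lerp(lerp(c00, c10, f.x), lerp(c01, c11, f.x), f.y);
}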

Ok, thanks for the info. I actually tried just that beforehand, since I was speculating this might be the issue; I’d already had problems with Unity reading my 16 bit textures as 8 bit. Sadly, it didn’t improve the normals at all.
But if I’m limited by the texture resolution, do you think interpolating the normals from the surrounding texels might give a smoother result?

Depending on what version of Unity you’re using, its support for 16 bit per channel formats is limited. Basically you have to use an .exr file for anything older than 2020.2. They only just fixed 16 bit .png and .tif files being imported as 8 bit, so on older versions using those source formats and setting the imported texture to RGBA Half won’t be any different than using 8 bit to begin with. At some point they did add RGBA Half as an override option for normal maps, so you don’t actually need to set the type to Default in that case. I think that was fixed in some version of 2019.

It’ll give a different result, sure. “Smoother” is subjective.

You could implement something like this, which is cheap and will soften it a bit, but won’t be great.
https://www.iquilezles.org/www/articles/texture/texture.htm
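In HLSL the core of that trick looks roughly like this; a minimal sketch, assuming the shader declares _NormalMap_TexelSize (which Unity fills with (1/width, 1/height, width, height)):

// Sketch of the article's trick: reshape the fractional texel coordinate
// with a smoothstep curve so the hardware's bilinear blend eases in and
// out instead of ramping linearly between texels
float3 SampleNormalSmooth(sampler2D normalMap, float2 uv, float4 texelSize)
{
    float2 t = uv * texelSize.zw + 0.5;
    float2 i = floor(t);
    float2 f = frac(t);

    f = f * f * (3.0 - 2.0 * f); // smoothstep the blend weights

    float2 smoothUV = (i + f - 0.5) * texelSize.xy;
    return normalize( UnpackNormal( tex2D(normalMap, smoothUV) ) );
}

The article also shows a quintic version of the same curve if the cubic isn’t smooth enough.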

There’s also a whole list of different kinds of bicubic / quadratic filtering you could look at which would help too.
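As one example, here’s a sketch of bicubic B-spline filtering built on four hardware bilinear taps (the approach from “Fast Third-Order Texture Filtering” in GPU Gems 2), with the same _TexelSize assumption as above. Note the B-spline is an approximating curve, so it deliberately blurs a little:

// Cubic B-spline weights for a fractional position a in [0, 1)
float BSplineW0(float a) { return (1.0 / 6.0) * (a * (a * (-a + 3.0) - 3.0) + 1.0); }
float BSplineW1(float a) { return (1.0 / 6.0) * (a * a * (3.0 * a - 6.0) + 4.0); }
float BSplineW2(float a) { return (1.0 / 6.0) * (a * (a * (-3.0 * a + 3.0) + 3.0) + 1.0); }
float BSplineW3(float a) { return (1.0 / 6.0) * (a * a * a); }

float4 SampleBicubic(sampler2D tex, float2 uv, float4 texelSize)
{
    // Texel-center space: the center of texel n sits at t = n
    float2 t = uv * texelSize.zw - 0.5;
    float2 i = floor(t);
    float2 f = frac(t);

    float2 w0 = float2(BSplineW0(f.x), BSplineW0(f.y));
    float2 w1 = float2(BSplineW1(f.x), BSplineW1(f.y));
    float2 w2 = float2(BSplineW2(f.x), BSplineW2(f.y));
    float2 w3 = float2(BSplineW3(f.x), BSplineW3(f.y));

    // Fold the four weights per axis into two weights and two tap positions,
    // so each axis needs only two bilinear taps instead of four point taps
    float2 g0 = w0 + w1;
    float2 g1 = w2 + w3;
    float2 h0 = i - 1.0 + w1 / g0;
    float2 h1 = i + 1.0 + w3 / g1;

    // Back to UV space (+0.5 converts texel-center coordinates to UVs)
    float2 p00 = (float2(h0.x, h0.y) + 0.5) * texelSize.xy;
    float2 p10 = (float2(h1.x, h0.y) + 0.5) * texelSize.xy;
    float2 p01 = (float2(h0.x, h1.y) + 0.5) * texelSize.xy;
    float2 p11 = (float2(h1.x, h1.y) + 0.5) * texelSize.xy;

    return g0.y * (g0.x * tex2D(tex, p00) + g1.x * tex2D(tex, p10)) +
           g1.y * (g0.x * tex2D(tex, p01) + g1.x * tex2D(tex, p11));
}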

My personal take, though, is to just not use a hard edge on the lighting. Sharpened, sure, but not a completely hard edge.
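In shader terms that just means replacing the hard cutoff with a narrow smoothstep; a minimal sketch, where edgeSoftness is a made-up tuning parameter rather than anything from the shader above:

// Sketch of a sharpened-but-not-hard cel lighting edge. edgeSoftness is a
// hypothetical tuning parameter, not something from the shader above.
float CelRamp(float3 normal, float3 lightDir, float edgeSoftness)
{
    float NdotL = dot(normal, lightDir);

    // step(0.0, NdotL) would give the completely hard edge. Instead, blend
    // over a small range around the threshold; fwidth() keeps the transition
    // roughly one pixel wide on screen regardless of distance
    float softness = max(fwidth(NdotL), edgeSoftness);
    return smoothstep(-softness, softness, NdotL);
}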