Low resolution (or bad anti-aliasing) on Normal from Height?

I was playing around with blending two textures together to create some tiles with “grout” or something between them. I figured I could make a tile map and use it to blend between two sets of textures for base color, smoothness, normal, etc. This mostly works fine, but the resulting normal looks much worse than if the texture itself contained the same normal data I’m trying to generate.

Here’s a simple example, using Normal From Height to get a normal from a shape created by a Polygon node:

I immediately noticed that the edges look very pixelated compared to using a texture that has this normal data built into it. It’s almost like Normal From Height is generating at a much lower resolution than expected, or is somehow bypassing the anti-aliasing that’s applied to other textures.

For comparison, here’s what the same sort of thing looks like if I make it in Substance Designer and export the normal as a 1K texture:

There’s still a bit of jaggedness, but the overall quality seems much higher to me.

Is AA not being applied to Shader Graph output? Is there an extra step to get it applied? Or is there some other likely reason that Normal From Height is giving me this result?

That’s expected: the Normal From Height node uses ddx/ddy to calculate the normal, which causes the pixelation.
You can implement a custom node to calculate the normal, or convert the height map into a normal map.

Well, the height “map” I’m using is procedurally generated, not an actual texture. Is there some other way to get a normal from a procedurally generated grayscale value?

The ddx and ddy functions work on a grid of 2x2 pixel groups (called pixel quads) across the screen, so if an edge runs through the middle of a pixel quad they can calculate an approximate slope from it. If the edge falls between two pixel quads, neither quad “sees” it and you get no slope at all. That’s where the aliasing above comes from.
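
In code terms, a derivative-based Normal From Height boils down to something like this (a rough sketch only; Unity’s actual node implementation may differ in details):

    // Per-pixel slope from screen-space derivatives. ddx/ddy only see
    // differences within a 2x2 pixel quad, which is where the aliasing
    // comes from.
    float3 NormalFromHeightDerivatives(float height, float strength)
    {
        float dhdx = ddx(height);
        float dhdy = ddy(height);
        return normalize(float3(-dhdx * strength, -dhdy * strength, 1.0));
    }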

Yes, by calculating the procedural shape multiple times with small offsets and getting the resulting slopes. Ideally you want to do 4 samples: slightly left & right, and slightly up & down. You can get away with 3 using a shared center sample plus one to the right and one up, but the normal will be biased slightly in that case. The built-in Normal From Texture node appears to do the 3 sample version.
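
A rough sketch of the 3 sample version as it might look in a Custom Function node. Height(uv), offset, and strength are all placeholders here: Height stands in for whatever procedural graph generates the grayscale value, and offset/strength are values you pick.

    // Stand-in for the procedural height; replace with your own shape.
    // A soft circle is used here purely as an example.
    float Height(float2 uv)
    {
        return saturate(1.0 - length(uv - 0.5) / 0.4);
    }

    // 3 samples: shared center, one to the right, one up.
    // Forward differences, so the normal is slightly biased to one side.
    float3 NormalFromHeight3(float2 uv, float offset, float strength)
    {
        float hC = Height(uv);
        float hR = Height(uv + float2(offset, 0.0));
        float hU = Height(uv + float2(0.0, offset));
        float2 slope = float2(hR - hC, hU - hC) / offset;
        return normalize(float3(-slope * strength, 1.0));
    }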

You can look at that code for a basic idea of what to do:
https://docs.unity3d.com/Packages/com.unity.shadergraph@6.9/manual/Normal-From-Texture-Node.html
Note that the first line, Offset = pow(Offset, 3) * 0.1;, is a total “wtf?” magic number bit of code; I’m not really sure what it’s there for. The hard part is knowing how much to offset by. For a texture it’s relatively easy: somewhere between 0.5 and 2.0 texels, depending on how sharp you want the normal. For procedural shapes you have to choose how wide you want the resulting normals to be. Unity’s built-in procedural shapes also don’t let you control how sharp they are, instead assuming you always want them anti-aliased down to a single screen pixel, when for normals you generally want a wider gradient than that. Especially since that means the “slope” effectively changes size as you move the camera.
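
For comparison, the 4 sample central-difference version removes the bias of the 3 sample one at the cost of an extra evaluation. Same hypothetical Height(uv) stand-in as above; for a texture, offset would be 0.5 to 2 texels’ worth of UV, and for a procedural shape it’s whatever gradient width you choose.

    // 4 samples: left/right and down/up around the center point.
    // Central differences, so the result isn't biased to one side.
    float3 NormalFromHeight4(float2 uv, float offset, float strength)
    {
        float hL = Height(uv - float2(offset, 0.0));
        float hR = Height(uv + float2(offset, 0.0));
        float hD = Height(uv - float2(0.0, offset));
        float hU = Height(uv + float2(0.0, offset));
        float2 slope = float2(hR - hL, hU - hD) / (2.0 * offset);
        return normalize(float3(-slope * strength, 1.0));
    }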


Here’s an example shader graph setup. It has two properties: a “texel width” and a “normal scale”.

This is using derivatives to scale the width of the offsets so they’re at least 1 pixel wide, but the texel width is used to clamp how small of an offset is used, so if you get really close it isn’t always just a single pixel wide. I also use it to figure out how much to fade the normal by as you get further away. Most of this could be avoided if the shape nodes let you set the sharpness somehow.

As you can probably guess, this also makes any procedural shape that’s more than a single node a huge pain in the rear. You either have all of those nodes duplicated over and over, or you put it all into a subgraph, which you can’t preview easily since you can’t define a subgraph input as a UV with sane defaults; that means a lot of back-and-forth node swapping in the subgraph to use a UV node while editing, and remembering to reconnect the input each time.
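
Roughly the same idea expressed as code instead of nodes (a sketch only: the “texel width” and “normal scale” names come from the graph above, the exact wiring here is an approximation, and Height(uv) is still a placeholder for the procedural shape):

    // texelWidth = smallest UV offset to sample with (clamps the offset
    // when you get close). normalScale = overall strength of the normal.
    float3 NormalClampedOffset(float2 uv, float texelWidth, float normalScale)
    {
        // fwidth(uv) is roughly how much UV changes per screen pixel;
        // never sample closer together than that or the slopes alias again.
        float pixelWidth = max(fwidth(uv.x), fwidth(uv.y));
        float offset = max(texelWidth, pixelWidth);

        float hC = Height(uv);
        float hR = Height(uv + float2(offset, 0.0));
        float hU = Height(uv + float2(0.0, offset));
        float2 slope = float2(hR - hC, hU - hC) / offset;

        // Fade the normal as the per-pixel footprint grows past the chosen
        // texel width, so the bump flattens out with distance.
        float fade = saturate(texelWidth / pixelWidth);
        return normalize(float3(-slope * normalScale * fade, 1.0));
    }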


I’m just always so impressed at the depth and quality of your knowledge. Thank you very much for the solid explanation and the practical example. I’ll give this approach a try. It will be interesting to find out how well it works with more complicated tilings, like a Truchet tiling pattern. Anyway, really appreciate it. Thanks.
