Thought I'd share this odd find with Normal mapping.

Hello all. For the past week I’ve been working on a framework to share among my projects, and the time came to add normal mapping. However, I didn’t want too many operations in my vertex shader (for a special-case reason), so I decided to try finding ways of avoiding tangents altogether. Silly, I know, but I had my reasons.

Anyway, it turns out you can do the following in the pixel shader and get pretty good-looking normal mapping.
I call this the normal difference function; it relies on the fact that tangent space is basically a plane perpendicular to the vertex normal. There are some cases where it doesn’t look too good, however.

// Tangent-space normal map sample, swizzled to Y-up, subtracted from the flat "up" vector.
fixed3 nd = fixed3(0, 1, 0) - UnpackNormal(tex2D(_NormalMap, i.uv)).xzy;
// Perturb the interpolated vertex normal by that difference and renormalize.
half3 n = normalize(i.normal + nd);

Now you may be asking: why are the Z and Y coordinates swapped?
This is because Z is the up axis in a tangent-space normal map, whereas a vertex normal in Unity uses Y up. So all I have to do is swap those two axes around, then subtract the swizzled normal map value from (0,1,0) to get the difference vector. I add that difference to the interpolated vertex normal and normalize per-pixel, and hey presto, we have a pseudo-normal-mapped vector we can play with!
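
For context, here’s roughly how that sits in a full fragment shader. Treat it as a sketch: _MainTex, the v2f interpolators and the simple lambert at the end are placeholder names just to show the perturbed normal being used, not the exact code from my framework, and it assumes a forward base pass with the usual Unity includes.

// Sketch of the idea in context (built-in pipeline, forward base pass).
fixed4 frag (v2f i) : SV_Target
{
    // Tangent-space normal map sample; Z is "up" (out of the surface).
    half3 tn = UnpackNormal(tex2D(_NormalMap, i.uv));

    // Swizzle to Y-up and take the difference from the flat "up" vector.
    half3 nd = half3(0, 1, 0) - tn.xzy;

    // Perturb the interpolated world-space vertex normal and renormalize.
    half3 n = normalize(i.normal + nd);

    // Simple per-pixel lambert, just to show the perturbed normal in use.
    half ndotl = saturate(dot(n, _WorldSpaceLightPos0.xyz));
    fixed4 col = tex2D(_MainTex, i.uv);
    col.rgb *= _LightColor0.rgb * ndotl;
    return col;
}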

I also experimented with evaluating the SH and the reflection vector per-pixel, and that works too, although I then went with a less accurate, cheaper method for reflections: just use the normal difference to distort the reflection vector instead, which works, but could look better.
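
For anyone curious, the per-pixel SH part is just ShadeSH9 fed the perturbed normal, and the cheap reflection trick could look something like the sketch below. The cubemap sampler (_Cube) and the world-position interpolator are assumed names here, not necessarily what my framework uses.

// Per-pixel SH: evaluate the ambient probes with the perturbed normal n.
half3 ambient = ShadeSH9(half4(n, 1.0));

// Cheap reflections: build the reflection vector from the vertex normal,
// then nudge it with the normal difference instead of recomputing it from n.
half3 viewDir = normalize(_WorldSpaceCameraPos - i.worldPos);
half3 refl = reflect(-viewDir, normalize(i.normal));
fixed4 reflCol = texCUBE(_Cube, refl + nd);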

Now, this isn’t as accurate as a true normal mapping implementation, but it can be a good compromise for platforms that are vertex shader constrained, such as mobile platforms.

EDIT: I also want to point out that I am using world-space normals as inputs; I haven’t tested much with object space. I will post further should I find anything new while I experiment. :)

You’re essentially doing one part of triplanar mapping.

https://medium.com/@bgolus/normal-mapping-for-a-triplanar-shader-10bf39dca05a

If the plane is flat and aligned to the world axes, the swizzle you’re doing is going to be just as accurate as real normal mapping. If your mesh isn’t perfectly flat, you can use something like the UDN, Whiteout, or RNM blends I mention in that article to get approximations that are nearly impossible to differentiate from real normal mapping.
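
Adapted to a mostly Y-up surface with world-space normals like the code above, the UDN and Whiteout versions look roughly like this (my variable names; for an arbitrary mesh this is an approximation rather than exact tangent-space mapping):

// tn = tangent-space normal map sample (Z up), wn = interpolated world normal.
half3 tn = UnpackNormal(tex2D(_NormalMap, i.uv));
half3 wn = normalize(i.normal);

// UDN blend: add the map's XY tilt to the world normal's horizontal
// components, keep the world normal's up component, then renormalize.
half3 nUDN = normalize(half3(tn.xy + wn.xz, wn.y).xzy);

// Whiteout blend: same idea, but the up component is scaled by the map's Z
// so steeper details bend the result a bit more.
half3 nWhiteout = normalize(half3(tn.xy + wn.xz, tn.z * wn.y).xzy);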

If the texture UVs are aligned to the object, but the object is rotated, you can still get basically perfect normals by applying the same swizzle and then the object-to-world transform. Or, more specifically, a transposed world-to-object transform via the UnityObjectToWorldNormal function. It works fine in the fragment shader, as in the sketch below.
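
A sketch of that object-space route, assuming a flat surface with UVs aligned to the object’s XZ plane:

// For a flat, object-aligned surface the swizzled map value is already
// the object-space normal (Z-up map -> Y-up object space).
half3 objN = UnpackNormal(tex2D(_NormalMap, i.uv)).xzy;

// Rotate into world space per-pixel. UnityObjectToWorldNormal applies the
// transposed world-to-object matrix, so non-uniform scale is handled too.
half3 n = normalize(UnityObjectToWorldNormal(objN));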

I’m actually using world-space normals in my inputs as well; it would be fun to experiment further with different ways of transforming the normal map. I didn’t even think of triplanar mapping; it would be useful for those situations where I’m texturing a terrain (not very often, but at some stage I did have a planet generator that used it).

Will post further findings, as yeah, some objects do look a bit odd; perhaps the transform you’re talking about will work nicely. (You wouldn’t really notice while it’s in motion, however, as the game project I have this in is a racing game.)