# How to calculate rim light for materials with object space normal map?

Rim light is usually calculated like this:

```
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
```

However, it doesn’t seem to be as simple as replacing `o.Normal` with my `o.objNormal`, probably because I don’t fully understand the math behind it; I’ve already searched the internet a lot with no luck. The normal is already applied, I just don’t know how to work with it the same way I do with regular normal data.

Going by the `IN.viewDir` and `o.Normal` values, you’re asking this question in relation to Surface Shaders. There are two issues here: object space normals, and what `IN.viewDir` actually is in a Surface Shader.

When using Surface Shaders, the value you assign to `o.Normal` must be a tangent space normal that matches the mesh’s per vertex tangent data. Assigning an object space normal to it will result in some very wild lighting, so you need to transform the object space normals into the mesh’s tangent space for them to be usable. To do that, first transform the object space normal into world space, which is simple enough:

```
half3 worldNormal = UnityObjectToWorldNormal(objectNormal);
```

Then transform the world space normal into tangent space. That’s a little more complex… The “fast” approximate version looks like this:

```
// need to add some stuff to the Input struct
struct Input {
    float2 uv_MainTex;
    // etc, whatever else you want here
    // then these last two are the important ones to add
    float3 worldNormal;
    INTERNAL_DATA // no trailing semi-colon
};

// then you need this function defined somewhere
// this will transform a world space normal into the mesh's tangent space
float3 WorldToTangentNormalVector(Input IN, float3 normal) {
    float3 worldT = WorldNormalVector(IN, float3(1,0,0));
    float3 worldB = WorldNormalVector(IN, float3(0,1,0));
    float3 worldN = WorldNormalVector(IN, float3(0,0,1));
    float3x3 w2tRotation = float3x3(worldT, worldB, worldN);
    return normalize(mul(w2tRotation, normal));
}

// and finally in your surf function
half3 objectNormal = tex2D(_ObjectNormalMap, IN.uv_MainTex).xyz * 2.0 - 1.0;
half3 worldNormal = UnityObjectToWorldNormal(objectNormal);
o.Normal = WorldToTangentNormalVector(IN, worldNormal);
```

So that covers the object space normal part. The second part is `IN.viewDir`. Now, the obvious answer is that it’s the view direction. However, if you don’t assign `o.Normal` in a Surface Shader, `IN.viewDir` is in world space. If you do assign `o.Normal`, then `IN.viewDir` is in tangent space! So the good news is that once you’ve added the code I presented above, your line will magically work again.
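For reference, those pieces combined into a `surf` function might look something like this (just a sketch; `_ObjectNormalMap`, `_RimColor`, and `_RimPower` are assumed shader property names, and the `Input` struct and `WorldToTangentNormalVector` function are as defined earlier):

```
void surf (Input IN, inout SurfaceOutputStandard o) {
    // decode the object space normal from the texture's 0..1 range
    half3 objectNormal = tex2D(_ObjectNormalMap, IN.uv_MainTex).xyz * 2.0 - 1.0;
    half3 worldNormal = UnityObjectToWorldNormal(objectNormal);
    // assigning o.Normal means IN.viewDir is now also in tangent space
    o.Normal = WorldToTangentNormalVector(IN, worldNormal);
    // both vectors are in the same space, so the usual rim term works
    half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
    o.Emission = _RimColor.rgb * pow(rim, _RimPower);
}
```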

Alternatively, if for some reason you don’t want to assign the object space normal map (or anything else) to `o.Normal`, use the world space normal calculated from the object space normal and do:

```
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), worldSpaceNormal));
```

If you want to assign something else to `o.Normal` and only use the object space normal map for the rim effect, then you’ll need to calculate the world space view direction on your own. Add `float3 worldPos;` to your `Input` struct and then do:

```
half3 worldViewDir = normalize(_WorldSpaceCameraPos.xyz - IN.worldPos);
half rim = 1.0 - saturate(dot(worldViewDir, worldSpaceNormal));
```

Now, as I mentioned, the `WorldToTangentNormalVector` function is only an approximation. It’s probably good enough most of the time. Calculating the real world to tangent matrix is a bit more expensive, but if the approximation is a problem for you, here’s the exact version:

```
float3 WorldToTangentNormalVectorExact(Input IN, float3 normal) {
    float3 worldT = WorldNormalVector(IN, float3(1,0,0));
    float3 worldB = WorldNormalVector(IN, float3(0,1,0));
    float3 worldN = WorldNormalVector(IN, float3(0,0,1));

    // build the true inverse of the tangent-to-world matrix from cofactors
    half3x3 w2tRotation;
    w2tRotation[0] = worldB.yzx * worldN.zxy - worldB.zxy * worldN.yzx;
    w2tRotation[1] = worldT.zxy * worldN.yzx - worldT.yzx * worldN.zxy;
    w2tRotation[2] = worldT.yzx * worldB.zxy - worldT.zxy * worldB.yzx;

    half det = dot(worldT.xyz, w2tRotation[0]);
    w2tRotation *= rcp(det);

    return normalize(mul(w2tRotation, normal));
}
```

Guess what, my normal map actually seems to be in world space, not object space; they look almost the same, which is awful. So my guess is that I just need to do this instead:

```
half3 worldNormal = tex2D(_WorldNormal, IN.uv_MainTex).rgb;
o.Normal = WorldToTangentNormalVector(IN, worldNormal);
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), normalize(o.Normal)));
o.Emission = _RimColor.rgb * pow (rim, _RimPower);
```

It doesn’t look correct though; from the sides and the back the model is lit with just the rim color, not smoothly falling off at the edges. Did I miss something? The model looks good with just the normal map.

The difference between object space normal maps and world space normal maps is whether or not you apply the object’s transform matrix to the data before you use it. Otherwise they are the same.
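In shader terms the distinction is roughly one matrix transform (a sketch; `_NormalMap` is an assumed property name):

```
half3 mapNormal = tex2D(_NormalMap, IN.uv_MainTex).xyz * 2.0 - 1.0;

// object space map: rotate the stored normal by the object's transform
half3 worldNormalFromObjectMap = UnityObjectToWorldNormal(mapNormal);

// world space map: the stored normal is already in world space
half3 worldNormalFromWorldMap = normalize(mapNormal);
```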

Hmm, not sure. You might try manually calculating the world space view direction and using that instead of the built in one, and see if you get better results.

Tried calculating the world view direction with no luck; it looks the same, shoot.

```
half3 albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
half3 worldNormal = tex2D(_WorldNormal, IN.uv_MainTex).rgb;
half3 worldViewDir = normalize(_WorldSpaceCameraPos.xyz - IN.worldPos);
half rim = 1.0 - saturate(dot(worldViewDir, worldNormal));

o.Albedo = albedo.rgb * _Color;
o.Normal = WorldToTangentNormalVector(IN, worldNormal);
o.Emission = _RimColor.rgb * pow (rim, _RimPower);
```

By the way, my model is using world space normals because I need to blend the character face with the rest of the body, it’s a separated object.

Is the `_WorldNormal` texture a render texture, or a Texture2D asset? And what format is it in? If it’s a texture on disk that you’re importing and using, that code isn’t going to work, since textures only store 0.0 to 1.0 range values per component, and normals generally need to be in the -1.0 to 1.0 range.

The simple fix to that, and how normal maps traditionally handle it, is to multiply the color value by 2 and subtract 1.

```
half3 worldNormal = tex2D(_WorldNormal, IN.uv_MainTex).rgb * 2.0 - 1.0;
```

This assumes the original normals were also encoded with the opposite mapping, `rgb = worldNormal.xyz * 0.5 + 0.5;`.

It’s an asset, DDS format. Thanks for the info; it’s still having the same issue with the rim light, though.

Is there any way to check it? The problem must be my normal map, since I already tried everything else.

Make sure the sRGB (Color Texture) option is unchecked.
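One quick way to sanity check the map itself is to output the decoded normal straight to the screen as a color; a correct world space normal map should stay fixed relative to the world as the camera orbits (a debug sketch, using the same `_WorldNormal` property as above):

```
// debug: visualize the decoded normal as a color (remap -1..1 back to 0..1)
half3 n = tex2D(_WorldNormal, IN.uv_MainTex).xyz * 2.0 - 1.0;
o.Albedo = 0.0;
o.Emission = normalize(n) * 0.5 + 0.5;
```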

I noticed that the texture is mirrored: it’s applied to one side of the model, and the other side reuses the same texture. Do you think that could be the problem?

Uh … yeah. Those can’t be world or object space normal maps then, because both of those essentially require unique UVs per face. The only exception would be for repeated objects within the mesh, but even then they’d never appear mirrored.

I kinda fixed it in a very simple way: just flipping the red channel on the mirrored side. I get the info about mirrored vertices from the vertex colors’ red channel, so:

```
normal.r *= (IN.color.r > 0) ? -1 : 1;
```

It seems to work fine for some angles but not all of them; I’m almost there, but still missing something. There should be a way to flip the channels that works for every angle. It breaks when I rotate the model.

Are the meshes themselves mirrored? I’m not sure I really understand the specifics of your use case here. Usually world or object space normal maps are used for things like models created through photogrammetry, or complex terrain sculpts. They’re usually highly unique like real world objects are.

“World space normal maps” only work if the mesh is completely static and never rotated from the orientation the normals were baked from. If you want to rotate the mesh, then you’d need to treat them as object space normal maps.

Can you show an example of the kind of assets you’re working with? And why are they dds source?

I’m studying shaders with some assets I got from a game. It’s a character with the face and body as separate models with separate animations, causing a visible seam between the meshes, which I guess is fixed by the normal map. Judging by the colors, the normal data seems to be a world or object space normal map. In fact, with the code I posted it produces exactly the results I’m expecting, but as you mentioned, I’m not able to rotate it. In object space the meshes don’t blend together.

I’ll show you some pics and the normal file (I converted it to PSD to be able to send it; the format makes no difference):

My custom rim shader so far (no seam, but I can’t rotate the character):

My custom rim shader with object space normal (visible seam):

Vertex colors only:

5608030–580417–normal.psd (511 KB)

Oh boy…

In the early days of normal maps no one had figured out the “right” way to use them (or at least settled on a common way to use them), so almost every game implemented them in completely different ways. Some people do weird experiments with normal maps for certain kinds of effects, but today you generally use tangent space normal maps with MikkTSpace tangents and compatible shaders. Object space normal maps are becoming more popular again for use with photogrammetry, especially since they allow for fairly clean mesh LODing without losing as many visual details on the kinds of complex objects you usually get with photogrammetry.

This is doing something completely different than what most modern games would do. If it works how I think it does, I haven’t seen anyone use this style of normal mapping since the original Xbox era of games. You’re going to have a real bear of a time getting this to work in Unity because nothing built in is going to handle it properly.

I think the closest you’ll get is if you manually override the mesh’s vertex normals and tangents to store the initial “world space” orientation, and then let Unity’s built in “tangent” to world space normal map transforms use those values. That would make skinned meshes work. You’d have to modify the meshes from C# to do this.
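A very rough sketch of that idea from C# (untested; it assumes the map’s “world space” matches the mesh’s object space at bind pose, so giving every vertex an identity tangent basis makes Unity’s tangent-to-world transform reproduce the baked orientation, and skinning then carries it along):

```
using UnityEngine;

public class OverrideTangentBasis : MonoBehaviour {
    void Start() {
        var smr = GetComponent<SkinnedMeshRenderer>();
        // work on a copy so the shared mesh asset isn't modified
        Mesh mesh = Instantiate(smr.sharedMesh);

        var normals = new Vector3[mesh.vertexCount];
        var tangents = new Vector4[mesh.vertexCount];
        for (int i = 0; i < mesh.vertexCount; i++) {
            normals[i] = Vector3.forward;          // tangent space +Z = object +Z
            tangents[i] = new Vector4(1, 0, 0, 1); // tangent space +X = object +X
        }
        mesh.normals = normals;
        mesh.tangents = tangents;
        smr.sharedMesh = mesh;
    }
}
```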

Well, that’s awful, but I was expecting something like this. Anyway, this topic has probably provided a good amount of information about normal maps; at least it has for me. How do you think it works, though? Also, do you think it’s possible to hide the seams through shaders alone?

And move stuff? No.