Symmetric models with mirrored normal maps - Shader Fix?

I have so many models with symmetry that re-use the same texture coords for both sides of the model (mirroring). Great for textures, but now I’m trying to normal-map them.

The problem with this (just in case anyone isn’t familiar with the issue) is that you end up with one side of the model having ‘inverted’ lighting behaviour. That is, everything is fine on one side but on the ‘other’ side of the model, the bumps facing the camera go dark. It also results in big ugly seams where the two sides meet.

This is a call for solutions, as many people seem to have this problem.

Initial idea (flip the normals in the shader pipeline)

I mean, how difficult can that actually be?

Since symmetric models are usually mirrored along one axis at 0, it makes sense that all the ‘good’ normals are on one side of the model (let’s say +x), and the ‘inverted, nuisance’ normals are on the other (in this case, the -x side).

So, I’m assuming there’s a simple way to effect the following in the shader pipeline: if X is < 0 then invert the normal. Presumably by normalising X to a unit value (X = -1 or +1) and using that to invert (or not invert) the broken normal perturbation with a simple operation.

I’m not sure that it is possible to know the coordinates of the model in a fragment shader though. Would this need a geometry shader too? Can a geometry shader pass a +1/-1 value to the fragment shader ?
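To make the sign-flip idea concrete, here's a minimal numeric sketch (plain Python standing in for shader code; `flip_perturbation` and its arguments are made up for illustration, with the mirror plane assumed at x = 0):

```python
# Hypothetical sketch of the 'flip if x < 0' idea, in plain Python.
# In a shader this would use the sign of the interpolated object-space x.

def flip_perturbation(normal_sample, object_x):
    """normal_sample: (x, y, z) decoded from the normal map, in [-1, 1].
    object_x: object-space x of the fragment (mirror plane at x = 0)."""
    side = 1.0 if object_x >= 0.0 else -1.0
    nx, ny, nz = normal_sample
    return (nx * side, ny, nz)

# On the +x side the sample passes through unchanged...
assert flip_perturbation((0.3, 0.1, 0.95), 1.5) == (0.3, 0.1, 0.95)
# ...on the -x side the horizontal perturbation is inverted.
assert flip_perturbation((0.3, 0.1, 0.95), -1.5) == (-0.3, 0.1, 0.95)
```

Note that, as later replies point out, the real culprit turns out to be the flipped tangents rather than the normals, so a per-fragment flip like this is at best a patch.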

So…

  • Any examples of getting local model-space coordinates from inside the pixel shader?
  • Any examples of a geometry shader passing additional parameters to a fragment shader?
  • Any examples of a working mirroring shader - I don’t mind if it is Unity, HLSL, GLSL or even a compiled FXO, just anything to start from.

Idea 2 (Splitting the task)

If there is no way for a fragment shader to know which side of the model it is processing - how about rendering the model in two passes? The first pass is non-inverting, and a simple geometry shader passes in only one side of the model; the second pass is inverting, and the geometry shader passes in the other side of the model.

Is that possible? Any idea how to have a geometry shader act as a filter? I’m sure it can be done, because I have an FXO in my collection which allows me to ‘cut away’ a mesh using an arbitrary plane.

Fallback (cheat)

Without a shader solution, the simplest compromise I can see would be to switch from an XYZ normal map to a simple one-channel bump map. I’ve seen software to create a three-channel normal map from a one-channel bump map… is there any software out there which can do this conversion in reverse?

Not sure if this is a valid solution. Detail and accuracy will be lost and I think the shader would be slower so this is just my backup plan if all else fails. Still, it would give me something to work with while I try to come up with a true shader-based solution.

I know other people have solved this. From looking at their assets I notice that a LOT of games use mirrored normals.

So, can we work on some solutions here? If we can come up with something that works, I will happily put in the time to convert the existing shaders to various ‘mirroring’ versions for those who need them.

Any hints, tips, rumours or snippets welcome.

-Gary

hello,
Maybe (if your model is really symmetrical) you can think of using half the geometry and computing the other side in the shader.

(assuming that the half model is in the +x part)
For each vertex you can compute:

  • the normal and the reflected normal (in object space, so -x)
  • the reflected position (in object space, so -x)

Compute the world-space position for each vertex and each reflected vertex, then do the magic computation for the normal map (tangent, bitangent…).

Then for each pixel you can compute:

  • the original pixel color with the normal and the RGB normal map
  • the mirrored one with the reflected normal + the (-R)GB normal map

They did this on a car racing game I worked on (the artists worked on half the hull, then the other side was computed in-game).
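The reflection described above amounts to negating x components (a minimal sketch, with `reflect_x` as a hypothetical helper; the mirror plane is assumed to be x = 0):

```python
# Reflect positions and normals across the x = 0 plane, as described above.
# The matching normal-map change is flipping the red (X) channel: "(-R)GB".

def reflect_x(v):
    x, y, z = v
    return (-x, y, z)

# A vertex on the +x half maps to its mirror on the -x half...
assert reflect_x((1.0, 2.0, 3.0)) == (-1.0, 2.0, 3.0)
# ...and its object-space normal reflects the same way.
assert reflect_x((0.6, 0.0, 0.8)) == (-0.6, 0.0, 0.8)
```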

hope that helped.

This sounds like something you should be able to fix once in the mesh, before you ship your game, rather than once per vertex per frame for the entire duration of rendering.

I don’t think this problem has anything to do with flipped normals. Mirroring geometry produces correct normals. What happens is that the tangents get flipped, and point in the wrong direction: they point as if the texture were mapped to the inside of the model. The tangents need to be inverted on the mirrored geometry.

That’s not all, though: the bitangent calculated from a mirrored normal would point in the correct direction, even though the tangent would be wrong. Fixing (inverting) the tangent results in an inverted bitangent. Luckily, Unity handles this when it calculates the bitangent:

float3 binormal = cross( v.normal, v.tangent.xyz ) * v.tangent.w;

The bitangent is computed as the cross product of the normal and tangent, as one would expect with regular normal mapping. However, the result is multiplied by the W component of the tangent. This is the key to getting a correct bitangent: tangents on mirrored geometry must have -1 in their W component (as opposed to the usual +1).

If your tangents get inverted as homogeneous vectors, their W component should already be -1. If they are inverted as 3-component vectors, then the -1 will have to be added afterwards.
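A quick numeric check of that handedness trick (plain Python standing in for Unity's `cross(v.normal, v.tangent.xyz) * v.tangent.w` line; `bitangent` here is a made-up helper, not engine code):

```python
# Reconstruct the bitangent the way Unity does: cross(normal, tangent.xyz)
# scaled by tangent.w. With w = -1 on mirrored geometry, the inverted
# tangent still yields a correctly oriented bitangent.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def bitangent(n, t4):
    tx, ty, tz, w = t4
    bx, by, bz = cross(n, (tx, ty, tz))
    return (bx * w, by * w, bz * w)

n = (0.0, 0.0, 1.0)                                  # surface normal
assert bitangent(n, (1.0, 0.0, 0.0, 1.0)) == (0.0, 1.0, 0.0)
# Mirrored side: tangent inverted, and w = -1 flips the bitangent back.
assert bitangent(n, (-1.0, 0.0, 0.0, -1.0)) == (0.0, 1.0, 0.0)
```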

Unfortunately, I don’t know what words 3D modelling packages use to describe these things, or to what level the whole business is abstracted. You’ll have to figure out how to get your 3D modelling package to output inverted homogeneous tangents for mirrored geometry. The result will be a properly normal mapped, mirrored model drawn with one regular material and no extraneous computation in the vertex shader.

Crytek and Epic support mirrored normal maps at the engine level. I’m interested to see how this works in Unity, though.

I’m pretty sure that “at the engine level” means they invert bitangents when the tangent has been inverted, just like Unity.

Also, in case it’s confusing anyone, I just noticed that Unity’s code incorrectly calls it a binormal. This is just an error in nomenclature, though: bitangents and binormals are the same vector.

Ah, yes. I think I am using the wrong terminology. The mesh’s normals are correct… it is the normal map which gets applied badly in texture space due to the flipped tangents.

Does anyone know how I’d repair that in the mesh? I have 3DS Max, Blender and DAZ3D’s Hexagon.

Is the code you posted something which should appear in the shader (my code), or something Unity does for me (behind the scenes)? Sorry, I had problems following along.

So… if I loop through the mesh, then for each X Y Z W I would need to set W to a magnitude of 1 and adopt the sign of the chosen axis (such as X)?

Ah, there’s the problem then. I believe the mesh is using three-component vectors.
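If the mesh really does only carry three-component tangents, the loop described above could look something like this (a hypothetical Python sketch assuming the mirror plane sits at x = 0; recalculating tangents properly, as shown further down the thread, is the more robust route):

```python
# Give each tangent a w of magnitude 1 whose sign follows the vertex's
# side of the mirror plane (x = 0). A heuristic, not a general solution.

def fix_tangent_w(position, tangent):
    tx, ty, tz = tangent          # three-component tangent from the mesh
    w = 1.0 if position[0] >= 0.0 else -1.0
    return (tx, ty, tz, w)

assert fix_tangent_w((2.0, 1.0, 0.0), (1.0, 0.0, 0.0)) == (1.0, 0.0, 0.0, 1.0)
assert fix_tangent_w((-2.0, 1.0, 0.0), (1.0, 0.0, 0.0)) == (1.0, 0.0, 0.0, -1.0)
```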

Wow, that is a much better response than others I’ve read elsewhere, which tend to range from ‘you can’t unless your engine supports it’ to ‘it’s bad practice to mirror the textures in the first place’.

Can anyone provide info on applying these kinds of changes to the mesh? Meanwhile I’ll stare at Blender for a while, since if anything can do this it’s probably Blender.

-Gary

Does anyone have a solution for this? I have the same problem with flipped tangents :frowning:

Yes, it turned out to be quite simple in the end. I went back to my modelling software and exported the model again, but this time in .fbx format, making sure to tell the exporter to save the normal and tangent/bitangent data into the mesh. This worked really well, but I had to change normals and tangents from ‘Calculated’ to ‘Import’ in the mesh’s importer section within Unity.

Only one model had a slight ‘lighting seam’, but the bumps were correct on both sides of the model. There are tutorials out there for repairing lighting seams, and after a little fading in one channel of the normal map the seam disappeared.
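The seam fade mentioned here can be sketched as blending one normal-map channel toward its neutral value near the seam (`fade_channel`, `dist` and `fade` are made-up names for illustration; which channel to fade depends on the mirror axis):

```python
# Sketch of the seam fade: blend one channel of the normal map toward its
# neutral value (0.5) within 'fade' texels of the seam, so both mirrored
# halves agree exactly where they meet.

def fade_channel(value, dist, fade=8.0):
    t = min(dist / fade, 1.0)     # 0 exactly on the seam, 1 outside the band
    return 0.5 + (value - 0.5) * t

assert fade_channel(0.9, 0.0) == 0.5                # flat on the seam
assert abs(fade_channel(0.9, 16.0) - 0.9) < 1e-9    # untouched far away
```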

The models all look fantastic now…

I can’t understand why this is such a big mystery on other forums, with people saying things like “This problem is difficult even for the big software houses”… pah! Use a format that stores the tangent frame with the mesh and you’re golden!

I’ve just done it in XNA too. The process was considerably different but the end result was the same : )
So don’t be put off by all those threads out there saying you can’t/shouldn’t mirror normal maps : )

Anyway, Hope that helps.

-Gary

I mirror by scaling the transform by a negative value. In that case, recalculating the tangents on the mesh with tangent.w negated does the trick. I use the script below; note the second argument, ‘mirrored’. Just pass ‘true’ for mirrored meshes, ‘false’ otherwise.

using UnityEngine;

/*
Derived from
Lengyel, Eric. “Computing Tangent Space Basis Vectors for an Arbitrary Mesh”. Terathon Software 3D Graphics Library, 2001.
http://www.terathon.com/code/tangent.html
*/

public class TangentSolver
{
	// Recalculates a mesh's tangents from its positions, normals and UVs.
	// Pass mirrored = true for meshes drawn with a negative scale: this
	// negates tangent.w, which Unity uses to reconstruct the bitangent.
	public static void Solve( Mesh theMesh, bool mirrored )
	{
		int vertexCount = theMesh.vertexCount;
		Vector3[] vertices = theMesh.vertices;
		Vector3[] normals = theMesh.normals;
		Vector2[] texcoords = theMesh.uv;
		int[] triangles = theMesh.triangles;
		int triangleCount = triangles.Length / 3;
		Vector4[] tangents = new Vector4[vertexCount];
		Vector3[] tan1 = new Vector3[vertexCount];	// accumulated U (tangent) directions
		Vector3[] tan2 = new Vector3[vertexCount];	// accumulated V (bitangent) directions
		int tri = 0;
		
		for ( int i = 0; i < triangleCount; i++ )
		{
			int i1 = triangles[tri];
			int i2 = triangles[tri + 1];
			int i3 = triangles[tri + 2];
			
			Vector3 v1 = vertices[i1];
			Vector3 v2 = vertices[i2];
			Vector3 v3 = vertices[i3];
			
			Vector2 w1 = texcoords[i1];
			Vector2 w2 = texcoords[i2];
			Vector2 w3 = texcoords[i3];
			
			// Position deltas along the triangle's two edges...
			float x1 = v2.x - v1.x;
			float x2 = v3.x - v1.x;
			float y1 = v2.y - v1.y;
			float y2 = v3.y - v1.y;
			float z1 = v2.z - v1.z;
			float z2 = v3.z - v1.z;
			
			// ...and the matching UV deltas.
			float s1 = w2.x - w1.x;
			float s2 = w3.x - w1.x;
			float t1 = w2.y - w1.y;
			float t2 = w3.y - w1.y;
			
			// Skip triangles with degenerate UVs to avoid a division by zero.
			float denom = s1 * t2 - s2 * t1;
			if ( denom == 0.0f ) { tri += 3; continue; }
			float r = 1.0f / denom;
			Vector3 sdir = new Vector3((t2 * x1 - t1 * x2) * r, (t2 * y1 - t1 * y2) * r, (t2 * z1 - t1 * z2) * r);
			Vector3 tdir = new Vector3((s1 * x2 - s2 * x1) * r, (s1 * y2 - s2 * y1) * r, (s1 * z2 - s2 * z1) * r);
			
			tan1[i1] += sdir;
			tan1[i2] += sdir;
			tan1[i3] += sdir;
			
			tan2[i1] += tdir;
			tan2[i2] += tdir;
			tan2[i3] += tdir;
			
			tri += 3;
		}
		
		for ( int i = 0; i < vertexCount; i++ )
		{
			Vector3 n = normals[i];
			Vector3 t = tan1[i];
			
			// Gram-Schmidt orthogonalize the tangent against the normal
			Vector3.OrthoNormalize( ref n, ref t );
			
			tangents[i].x = t.x;
			tangents[i].y = t.y;
			tangents[i].z = t.z;
			
			// Calculate handedness: -1 where the UVs are wound the other way
			tangents[i].w = ( Vector3.Dot( Vector3.Cross( n, t ), tan2[i] ) < 0.0f ) ? -1.0f : 1.0f;
			if ( mirrored ) tangents[i].w *= -1.0f;
		}
		
		theMesh.tangents = tangents;
	}
}

Use something like this to recalculate a mesh’s (and its mirror’s) tangents:

TangentSolver.Solve( mesh , false );
TangentSolver.Solve( mirrorMesh , true );

I use the same procedure as GaryC to import my meshes, and it’s always worked for me. I bake my normal maps with xNormal using the OBJ format. When I do, I only bake half the mesh, then fold over the UVs when I get back into Max. I also use the FBX format to go from Max to Unity.

Many companies have had problems with binormals and tangents, my old company included, because up until at least Max 2008, Max’s exported binormals and tangents were not quite orthogonal. To make matters worse, the format Max used to display in its real-time viewport was different to what it exported and rendered.

This led to all sorts of crazy workarounds and the strange rules you hear that individual studios use to get around the issue. I knew one guy who used to manually align the border UVs to be perpendicular and then flatten the details on the seams.

Follow Gary’s advice, and avoid old versions of Max if you are going to bake your maps. (By the way, I don’t actually know if Autodesk have fixed this issue yet.)

“- but I had to change normals and tangents from ‘calculated’ to ‘import’ in the mesh’s importer section within unity.”

How do you do this?

—edit —

Never mind. Found it. Thanks.

Has anyone already tried inverting the green channel in the normal map?

Inverting the green channel would break as much as it fixed for models with mirrored geometry. The only situation where it would help is if all of your model’s tangents were inverted. In that case, simply inverting your model’s tangents would give the same result.

So that means we have to re-edit our models again in other 3D software? By the way, is this only a Unity issue, or does every engine have this kind of problem?

If your model file doesn’t have tangent information, you’ll have to find some way to get it: either re-export it with tangents, or calculate your own in Unity. Both methods are detailed earlier in this thread.

Engines either provide support for mirrored tangents, or they don’t. Unity does, but that doesn’t mean it solves the problem on its own. You still have to provide correct data. As far as I can tell, the problem is with people’s model files, their modelling software, or their export settings.

Ah I see, thanks for the explanation :smile:

I know this is an old thread, but for anyone else having this problem:

Check that your Normal textures are set to “normal map” in the inspector, and make sure “Generate from greyscale” is unchecked. This did the trick for us without having to rework our models, or re-export with special settings.

Heya,

OK, I know it’s an old thread… but I just tried to texture a model with many UVs inverted, sharing the same texture (great for modular modelling!!!). But once again, I got stuck with normals not displaying correctly. I’ve done the trick of exporting as an FBX with all the binormal and tangent info, and then making sure Unity imports tangents as well. This hasn’t helped.
Any new tricks on how to deal with this? This would help a lot!
Any new tricks on how to deal with this? This would help a lot!

Thanks!

Nick.

Mirroring normals is quite a regular practice in the games industry, and it frustrates me that trying to do so in Unity gives me issues. As I mentioned, I’m trying to do a few models using a modular technique (you save a huge amount of texture space by overlapping and inverting UVs to create all sorts of panels that interconnect with each other). But all this is worthless if the engine doesn’t support inverted UVs (or mirrored UVs… whatever it is called).
I’ve been searching the forums with no satisfactory answer. If there is a shader that solves the issue, I would really love to know about it (even if it is paid).

I hope someone can help me on this… I can’t continue working in the current state :frowning:

Thanks

Nick.

If you read the thread, you’ll see it explained that Unity supports mirrored normal maps, and exactly how this support is implemented mathematically. The missing link so far appears to be that specific model formats do not contain homogeneous tangents, or they are not imported correctly. If you provide some information about your specific setup, you might get some help.