I have a lot of symmetrical models that re-use the same texture coords for both sides (mirroring). Great for textures, but now I’m trying to normal-map them.
The problem with this (just in case anyone isn’t familiar with the issue) is that you end up with one side of the model having ‘inverted’ lighting behaviour. That is, everything is fine on one side, but on the ‘other’ side of the model the bumps facing the camera go dark. It also results in big ugly seams where the two sides meet. (The root cause: mirroring flips the handedness of the tangent basis, so the tangent-space normals come out with one component reversed.)
This is a call for solutions, as many people seem to have this problem.
Initial idea (flip the normals in the shader pipeline)
I mean, how difficult can that actually be?
Since symmetrical models are usually mirrored along one axis at 0, it follows that all the ‘good’ normals are on one side of the model (let’s say +X), and the ‘inverted’, nuisance normals are on the other (in this case, -X).
So, I’m assuming there’s a simple way to do the following in the shader pipeline: if X < 0, invert the normal. Presumably by reducing X to its sign (-1 or +1) and using that to invert (or not invert) the broken normal perturbation with a simple operation.
I’m not sure it’s even possible to know the model’s coordinates in a fragment shader, though. Would this need a geometry shader too? Can a geometry shader pass a +1/-1 value to the fragment shader?
So…
Any examples of getting local X coordinates from inside the pixel shader?
Any examples of a geometry shader passing additional parameters to a fragment shader?
Any examples of a working mirroring shader? I don’t mind if it is Unity, HLSL, GLSL or even a compiled FXO, just anything to start from.
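For what it’s worth, here is a rough old-style GLSL sketch of the idea (untested; the variable names are made up, and which normal component needs flipping depends on which UV axis was mirrored, so treat the `n.x` line as an assumption to experiment with). The key point is that no geometry shader is needed: a plain varying from the vertex shader carries the object-space position down to the fragment shader.

```glsl
// --- vertex shader ---
varying vec3 vObjPos;   // object-space (model-space) position, hypothetical name

void main()
{
    vObjPos        = gl_Vertex.xyz;          // position before any transforms
    gl_TexCoord[0] = gl_MultiTexCoord0;
    gl_Position    = gl_ModelViewProjectionMatrix * gl_Vertex;
}

// --- fragment shader ---
varying vec3 vObjPos;
uniform sampler2D normalMap;

void main()
{
    // -1.0 on the mirrored (-X) half, +1.0 on the good half
    float flip = (vObjPos.x < 0.0) ? -1.0 : 1.0;

    // unpack the tangent-space normal from [0,1] to [-1,1]
    vec3 n = texture2D(normalMap, gl_TexCoord[0].st).xyz * 2.0 - 1.0;

    // invert the offending component on the mirrored half
    n.x *= flip;
    n = normalize(n);

    // ... feed n into the usual tangent-space lighting here ...
    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0);  // debug: visualise the result
}
```

Worth noting: many engines sidestep the per-axis test entirely by baking a handedness sign into the tangent’s w component and multiplying the bitangent by it, which handles arbitrary mirroring rather than just mirroring about X = 0.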
Idea 2 (Splitting the task)
If there is no way for a fragment shader to know which side of the model it is processing, how about rendering the model in two passes? The first pass is non-inverting, and a simple geometry shader passes through only one side of the model; the second pass is inverting, and the geometry shader passes through the other side.
Is that possible? Any idea how to make a geometry shader act as a filter? I’m sure it can be done, because I have an FXO in my collection which lets me ‘cut away’ a mesh using an arbitrary plane.
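A geometry shader can act as a filter simply by not emitting the triangles you want to discard. A rough GLSL 1.50 sketch of the two-pass idea (untested; `vObjPos` and the `side` uniform are names I have invented), classifying each triangle by its centroid:

```glsl
#version 150
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

in  vec3 vObjPos[];   // object-space position forwarded by the vertex shader
out vec3 gObjPos;

uniform float side;   // +1.0 for the first pass, -1.0 for the second

void main()
{
    // keep the triangle only if its centroid lies on the requested side
    vec3 c = (vObjPos[0] + vObjPos[1] + vObjPos[2]) / 3.0;
    if (c.x * side < 0.0)
        return;       // emit nothing: the triangle is filtered out

    for (int i = 0; i < 3; ++i)
    {
        gObjPos     = vObjPos[i];
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
```

Draw the mesh twice: once with `side = +1.0` and the non-inverting fragment shader, once with `side = -1.0` and the inverting one. Triangles that straddle X = 0 get assigned wholesale to one pass by the centroid test, so a mesh with faces crossing the mirror plane may need those triangles split in the asset instead.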
Fallback (cheat)
Without a shader solution, the simplest compromise I can see is switching from an XYZ normal map to a simple one-channel bump map. I’ve seen software that creates a 3-channel normal map from a 1-channel bump map… is there any software out there which can do this conversion in reverse?
I’m not sure this is a valid solution: detail and accuracy will be lost, and I think the shader would be slower, so this is just my backup plan if all else fails. Still, it would give me something to work with while I try to come up with a true shader-based solution.
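On the speed point: the cost of the height-map route comes from deriving the normal in the shader with extra texture samples. A minimal sketch of that derivation via central differences (untested; `heightMap`, `texelSize` and `bumpScale` are hypothetical uniforms you would supply):

```glsl
uniform sampler2D heightMap;   // 1-channel bump/height texture
uniform vec2  texelSize;       // 1.0 / texture resolution
uniform float bumpScale;       // strength of the effect

// derive a tangent-space normal from the height map at uv
vec3 normalFromHeight(vec2 uv)
{
    // four neighbouring height samples
    float hL = texture2D(heightMap, uv - vec2(texelSize.x, 0.0)).r;
    float hR = texture2D(heightMap, uv + vec2(texelSize.x, 0.0)).r;
    float hD = texture2D(heightMap, uv - vec2(0.0, texelSize.y)).r;
    float hU = texture2D(heightMap, uv + vec2(0.0, texelSize.y)).r;

    // central differences give the surface slope in texture space
    return normalize(vec3((hL - hR) * bumpScale,
                          (hD - hU) * bumpScale,
                          1.0));
}
```

That is four samples instead of one, which is where the slowdown lives, but a scalar height value has no handedness, so the seam and inversion problems go away by construction.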
I know other people have solved this. From looking at their assets I notice that a LOT of games use mirrored normals.
So, can we work on some solutions here ? If we can come up with something that works I will happily put in the time to convert the existing shaders to various ‘mirroring’ versions for those who need them.
Any hints, tips, rumours or snippets welcome.
-Gary