About merging meshes and their normals

Hello all,

This is my first question here ever. Sorry for the long post!

The situation: I have a room model that can have up to 4 possible doors (one for each side). When there is no door on a side, I’d like the model to look closed there. Using my modelling program of choice (Blender), I made an extra model, a wall that closes the gap exactly. This model uses a different area of the same UV texture.

As you might have already guessed, lighting there does look odd, in a way that makes it obvious the extra wall is a different object. I was expecting that. Merging the shapes improves it somewhat, but contrary to my expectations you can still distinguish it even after recalculating the normals. I have even tried using a model of the base room without the faces that would overlap, with the same results.

Now, if I combine the models in Blender and remove the doubled vertices, the model looks perfect. Since the room is symmetric on all 4 sides, assuming there are no rooms with no doors (because that’d make no sense) and assuming I can rotate but not mirror a mesh, there are really only 6 possible combinations. This isn’t much at all; only 4 extra models of ~10–20 KB each. The programming beast within me, however, growls at this solution, because there may come a time when it’s no longer enough, since I plan to add different environments later.

So, what causes this issue, and how could I fix it? Merging the vertices sounds like a good idea, but won’t that screw up the UVs? I assume that when Blender exports, it splits a vertex into several if the UV coordinates differ between the faces it belongs to. But if that’s true, then the imported room should be technically identical to the generated one, as it contains exactly the same vertices at exactly the same positions. One possible explanation is a very minor floating-point precision issue that causes the shared vertices of the two combined meshes to not actually be identical.
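A rough way to test that float-precision theory would be to compare the vertex positions of the two meshes and count the pairs that are almost, but not exactly, in the same place. Something like the sketch below (the class name, method name, and tolerance are arbitrary, and the O(n·m) loop is only meant for a one-off check):

```csharp
using UnityEngine;

public static class VertexCompare
{
    // Counts vertex pairs (one from each mesh) that are almost, but not
    // exactly, at the same position. A non-zero count would support the
    // float-precision theory.
    public static int CountNearMisses(Mesh meshA, Mesh meshB, float tolerance = 1e-4f)
    {
        Vector3[] a = meshA.vertices;
        Vector3[] b = meshB.vertices;
        int nearMisses = 0;

        foreach (Vector3 va in a)
        {
            foreach (Vector3 vb in b)
            {
                float sqrDist = (va - vb).sqrMagnitude;

                // Exclude exact matches; count only "close but not equal".
                if (sqrDist > 0f && sqrDist <= tolerance * tolerance)
                    nearMisses++;
            }
        }
        return nearMisses;
    }
}
```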

Finally, in-editor solutions (if any) will not do, because I’m using my own custom level editor, so everything is created at runtime at the beginning of each level.

~ Zoodinger

Hello Usul,

I actually fixed the problem a long time ago. The issue was that Unity calculates normals with a completely different method at import time than at runtime. The runtime version only smooths the normals of vertices shared by multiple faces, but it doesn’t take into account other vertices that sit at the same position. This matters especially when the model has UV coordinates, in which case vertices must be duplicated (as a vertex can’t have different UV coordinates for different triangles).

Edit: I finally wrote an article about it that explains in detail what is going on. I also provided a working solution. I’m updating this, even though it’s a very old question, because I know people will still find it useful.

The article is at: http://schemingdeveloper.com/2014/10/17/better-method-recalculate-normals-unity/
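If you just want the gist without reading the article: the idea is to accumulate triangle normals per position instead of per vertex index, so vertices that were split for UV seams still end up with the same smoothed normal. Here is a minimal sketch of that idea (this is not the article’s actual code; the rounding-based grouping and the tolerance value are simplifications):

```csharp
using System.Collections.Generic;
using UnityEngine;

public static class NormalSolver
{
    // Recalculates normals so that vertices sharing a position (even if
    // duplicated for UV seams) receive the same averaged normal.
    public static void RecalculateNormalsByPosition(Mesh mesh, float tolerance = 1e-4f)
    {
        Vector3[] vertices = mesh.vertices;
        int[] triangles = mesh.triangles;
        Vector3[] normals = new Vector3[vertices.Length];

        // Group vertex indices by rounded position, so near-identical
        // positions share one group.
        var groups = new Dictionary<Vector3, List<int>>();
        for (int i = 0; i < vertices.Length; i++)
        {
            Vector3 key = Round(vertices[i], tolerance);
            if (!groups.TryGetValue(key, out List<int> list))
            {
                list = new List<int>();
                groups.Add(key, list);
            }
            list.Add(i);
        }

        // Accumulate each triangle's (area-weighted) face normal onto the
        // position group of each of its corners.
        var accumulated = new Dictionary<Vector3, Vector3>();
        for (int t = 0; t < triangles.Length; t += 3)
        {
            Vector3 p0 = vertices[triangles[t]];
            Vector3 p1 = vertices[triangles[t + 1]];
            Vector3 p2 = vertices[triangles[t + 2]];
            Vector3 faceNormal = Vector3.Cross(p1 - p0, p2 - p0);

            for (int c = 0; c < 3; c++)
            {
                Vector3 key = Round(vertices[triangles[t + c]], tolerance);
                accumulated.TryGetValue(key, out Vector3 sum);
                accumulated[key] = sum + faceNormal;
            }
        }

        // Write the averaged, normalized normal back to every vertex in
        // each position group.
        foreach (var pair in groups)
        {
            if (!accumulated.TryGetValue(pair.Key, out Vector3 sum))
                continue; // vertex not referenced by any triangle

            Vector3 normal = sum.normalized;
            foreach (int index in pair.Value)
                normals[index] = normal;
        }

        mesh.normals = normals;
    }

    static Vector3 Round(Vector3 v, float tolerance)
    {
        return new Vector3(
            Mathf.Round(v.x / tolerance) * tolerance,
            Mathf.Round(v.y / tolerance) * tolerance,
            Mathf.Round(v.z / tolerance) * tolerance);
    }
}
```

You’d call NormalSolver.RecalculateNormalsByPosition(mesh) instead of mesh.RecalculateNormals() after building the combined mesh at runtime; the linked article covers the approach in more detail.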

Hi Zoodinger and welcome!

I expect that the reason they look different is the normals. You should put a shader on it that visualizes them; here’s one..
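If you don’t have a normals shader at hand, a quick alternative (not a shader, just a debug helper sketched here) is to draw every normal as a ray in the Scene view:

```csharp
using UnityEngine;

// Attach to any object with a MeshFilter to see its normals as cyan
// rays in the Scene view while the game is running.
public class NormalVisualizer : MonoBehaviour
{
    public float rayLength = 0.2f;

    void Update()
    {
        var meshFilter = GetComponent<MeshFilter>();
        if (meshFilter == null || meshFilter.mesh == null)
            return;

        Vector3[] vertices = meshFilter.mesh.vertices;
        Vector3[] normals = meshFilter.mesh.normals;

        for (int i = 0; i < vertices.Length; i++)
        {
            // Transform from local to world space before drawing.
            Vector3 worldPos = transform.TransformPoint(vertices[i]);
            Vector3 worldNormal = transform.TransformDirection(normals[i]);
            Debug.DrawRay(worldPos, worldNormal * rayLength, Color.cyan);
        }
    }
}
```

With that in place you can see exactly where the normals of the two meshes disagree. Then, some options: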

  • Edit the normals in Blender and fix them there (as far as I remember this is not possible).
  • Change the import settings on the models and make Unity recalculate the normals instead of importing them.
  • As a programmer, you can manually replace/modify any/all normals through the Mesh class and “fix” them that way.
  • You can also add very small verts along the edges and use Mesh.RecalculateNormals.

But start by visualizing it :) I hope my suggestions help!