# Procedural Mesh and Normals - CLOSED - Solved

Hi guys,
I'm trying to import an object at runtime in Unity. To do that, I read its file line by line and build the mesh inside Unity triangle by triangle.
I've managed to create the mesh and assign the UVs so far, but calculating the normals has been a real headache.

In the attached image, you can see the object on the left looks fine, with no problems. That object was imported into Unity manually (dragged and dropped from the desktop).
The one on the right looks messy and sloppy. This is the one I create with code.

I'm not 100% sure the problem is just the normals, but I'm open to suggestions about what the problem may be and/or how to calculate the normals.

There is the built-in method Mesh.RecalculateNormals, but this may not work if you have unique vertices for each triangle.
There are many ways to calculate normals yourself, for example:

• Set vertex normal to surface normal of the triangle it is part of.
• Set vertex normal to an average of all the triangles that vertex shares.
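The two strategies above can be sketched in plain Python (illustrative only, not Unity API; all function names here are mine), for an indexed triangle mesh where `tris` holds three vertex indices per triangle:

```python
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def sub(a, b):
    return (a[0]-b[0], a[1]-b[1], a[2]-b[2])

def normalize(v):
    l = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0]/l, v[1]/l, v[2]/l)

def face_normal(v0, v1, v2):
    # Strategy 1: the surface normal of the triangle itself (flat shading).
    return normalize(cross(sub(v1, v0), sub(v2, v0)))

def averaged_normals(verts, tris):
    # Strategy 2: each vertex normal is the normalized sum of the normals
    # of every triangle that shares the vertex (smooth shading).
    normals = [(0.0, 0.0, 0.0)] * len(verts)
    for i in range(0, len(tris), 3):
        i0, i1, i2 = tris[i], tris[i+1], tris[i+2]
        n = cross(sub(verts[i1], verts[i0]), sub(verts[i2], verts[i0]))
        for idx in (i0, i1, i2):
            a = normals[idx]
            normals[idx] = (a[0]+n[0], a[1]+n[1], a[2]+n[2])
    return [normalize(n) for n in normals]
```

Note that the second strategy only smooths anything if triangles actually share vertex indices.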
It’s hard to tell if your issue is because of normals or uv coordinates, since you don’t share what kind of shader the model is rendered with.

Both objects are rendered with default shaders. I don't know whether they differ from each other, but it's very likely.
I didn't write any code related to shaders, so my created object must be using a default one.

Mesh.RecalculateNormals() gives me a very bad result, so I ended up calculating the normals 'by hand'.
To calculate them, I iterate over each triangle, use its 3 points to get a plane, and then calculate a vector perpendicular to that plane. That perpendicular vector is my normal. Is this correct?

Sounds good. This will only yield flat normals, though. In order to smooth you will need to average and align vertices occupying the same space.
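"Average and align vertices occupying the same space" can be sketched like this (plain Python, illustrative only; the function name and the rounding tolerance are my own choices): given flat per-vertex normals, find vertices that sit at the same position, even if they are separate entries in the vertex array, and give them one averaged normal.

```python
def smooth_coincident(verts, normals):
    # Group vertex indices by (rounded) position so duplicated vertices
    # at the same location end up sharing one averaged normal.
    groups = {}
    for i, v in enumerate(verts):
        key = (round(v[0], 5), round(v[1], 5), round(v[2], 5))
        groups.setdefault(key, []).append(i)
    out = list(normals)
    for indices in groups.values():
        sx = sum(normals[i][0] for i in indices)
        sy = sum(normals[i][1] for i in indices)
        sz = sum(normals[i][2] for i in indices)
        length = (sx*sx + sy*sy + sz*sz) ** 0.5 or 1.0
        avg = (sx/length, sy/length, sz/length)
        for i in indices:
            out[i] = avg
    return out
```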

I have no idea what you mean by that.

``````
static void CalculateNormalsManaged(Vector3[] verts, Vector3[] normals, int[] tris)
{
    for (int i = 0; i < tris.Length; i += 3) {
        int tri0 = tris[i];
        int tri1 = tris[i + 1];
        int tri2 = tris[i + 2];
        Vector3 vert0 = verts[tri0];
        Vector3 vert1 = verts[tri1];
        Vector3 vert2 = verts[tri2];
        // Vector3 normal = Vector3.Cross(vert1 - vert0, vert2 - vert0);
        Vector3 normal = new Vector3()
        {
            x = vert0.y * vert1.z - vert0.y * vert2.z - vert1.y * vert0.z + vert1.y * vert2.z + vert2.y * vert0.z - vert2.y * vert1.z,
            y = -vert0.x * vert1.z + vert0.x * vert2.z + vert1.x * vert0.z - vert1.x * vert2.z - vert2.x * vert0.z + vert2.x * vert1.z,
            z = vert0.x * vert1.y - vert0.x * vert2.y - vert1.x * vert0.y + vert1.x * vert2.y + vert2.x * vert0.y - vert2.x * vert1.y
        };
        normals[tri0] += normal;
        normals[tri1] += normal;
        normals[tri2] += normal;
    }

    for (int i = 0; i < normals.Length; i++) {
        // normals[i] = Vector3.Normalize(normals[i]);
        Vector3 norm = normals[i];
        float invlength = 1.0f / (float)System.Math.Sqrt(norm.x * norm.x + norm.y * norm.y + norm.z * norm.z);
        normals[i].x = norm.x * invlength;
        normals[i].y = norm.y * invlength;
        normals[i].z = norm.z * invlength;
    }
}
``````

This is my somewhat crude method from my own mesh implementation, since the built-in one can't be used from other threads. It should replicate Unity's built-in normal generation. It uses manual inlining, as that boosts performance by quite a bit (the commented-out code is the 'naive' approach).

It could of course be an issue with your triangle generation, but let’s hope it’s not that.


That doesn't look that crappy at all. Though if each triangle has its own set of unique vertices, this will not yield any smooth-looking surfaces, just like the RecalculateNormals method built into the runtime. The editor importer has more sophisticated algorithms that recalculate normals based on a smoothing angle.
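The smoothing-angle idea can be approximated like this (a sketch in plain Python, illustrative only; the function name and default 60° threshold are mine, and Unity's actual importer algorithm is more sophisticated): for each triangle corner, average the normals of neighbouring faces sharing that position, but only those that meet this face at less than the smoothing angle, so sharp edges stay hard.

```python
import math

def smoothing_angle_normals(verts, tris, angle_deg=60.0):
    # Returns one normal per triangle corner, so output implies
    # split (per-corner) vertices. `verts` must contain tuples.
    cos_limit = math.cos(math.radians(angle_deg))
    # Unit face normals, one per triangle.
    faces = []
    for i in range(0, len(tris), 3):
        v0, v1, v2 = verts[tris[i]], verts[tris[i+1]], verts[tris[i+2]]
        e1 = (v1[0]-v0[0], v1[1]-v0[1], v1[2]-v0[2])
        e2 = (v2[0]-v0[0], v2[1]-v0[1], v2[2]-v0[2])
        n = (e1[1]*e2[2] - e1[2]*e2[1],
             e1[2]*e2[0] - e1[0]*e2[2],
             e1[0]*e2[1] - e1[1]*e2[0])
        l = (n[0]*n[0] + n[1]*n[1] + n[2]*n[2]) ** 0.5 or 1.0
        faces.append((n[0]/l, n[1]/l, n[2]/l))
    # Which faces touch each position.
    by_pos = {}
    for f in range(len(faces)):
        for k in range(3):
            by_pos.setdefault(verts[tris[3*f + k]], []).append(f)
    # Per-corner normal: average over neighbours within the angle limit.
    corners = []
    for f in range(len(faces)):
        nf = faces[f]
        for k in range(3):
            acc = [0.0, 0.0, 0.0]
            for g in by_pos[verts[tris[3*f + k]]]:
                ng = faces[g]
                if nf[0]*ng[0] + nf[1]*ng[1] + nf[2]*ng[2] >= cos_limit:
                    acc[0] += ng[0]; acc[1] += ng[1]; acc[2] += ng[2]
            l = (acc[0]*acc[0] + acc[1]*acc[1] + acc[2]*acc[2]) ** 0.5 or 1.0
            corners.append((acc[0]/l, acc[1]/l, acc[2]/l))
    return corners
```

With two faces meeting at 90°, a 60° threshold keeps the edge hard, while a 120° threshold smooths it.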

It didn’t do anything for me. My normals stay the same with or without this code

*shrug* It's not the normals, then.

Let me upload my library. Maybe you could take a look at the code. Give me a few mins.

This is my library-in-construction, that I use to create my 3D objects.
In order to use it, just copy the folders into an empty Unity project and call this in a Start method: `GameObject example = Load_DXF.Load("C:/my_path_example_location/example.dxf", 100);`

Let me know what you think and where/how I can improve this.

2296191–154482–Example.rar (194 KB)

So, I had a quick look.
The main issue you have, which also renders the normal-calculating algorithm @Zuntatos shared ineffective, is that you reuse vertices to create back-facing polygons. Some of the manually created triangles that close the drawing are also reversed.
In my opinion, the first thing you should do is stop creating backfaces, and make sure all the triangles you generate use a consistent winding order. In your example file, you will need to reverse the order on the inside extrusion.
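Reversing a triangle's facing is just swapping two of its three indices; a minimal sketch (plain Python, illustrative only, name is mine):

```python
def flip_winding(tris):
    # Flip each triangle's facing by swapping its 2nd and 3rd index,
    # which reverses the winding order (and thus the face normal).
    out = list(tris)
    for i in range(0, len(out), 3):
        out[i+1], out[i+2] = out[i+2], out[i+1]
    return out
```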
Here’s a simple shader that lets you visualize the normals on a mesh:

``````
Shader "Custom/Normal"
{
    Properties
    {
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            struct vertexInput
            {
                float4 vertex : POSITION;
                float3 normal : NORMAL;
            };
            struct vertexOutput
            {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD0;
            };

            vertexOutput vert(vertexInput input)
            {
                vertexOutput output;
                output.normal = input.normal;
                output.pos = mul(UNITY_MATRIX_MVP, input.vertex);
                return output;
            }

            float4 frag(vertexOutput input) : COLOR
            {
                // Remap the [-1, 1] normal into the [0, 1] color range.
                return float4((input.normal * 0.5) + 0.5, 1);
            }
            ENDCG
        }
    }
}
``````

There are probably some things I overlooked, but here's the result I was able to achieve. Modified file below.


Going to start analysing your changes, but at first glance, it looks like you saved my life.
You said my main issue was that I was reusing vertices? I thought that was the proper way of creating a mesh, and that it would let me save on performance.

No, the issue is not that you reuse vertices directly; you should, wherever possible. Note that vertex properties like normals, UVs, colors and tangents have a one-to-one ratio: there's only ever one of each per vertex. Consider two triangles built using only 3 vertices, essentially one triangle and its opposite. You can only have 3 normals because you only have 3 vertices, but you would actually need 6 normals for smoothing to render properly. This means you can't share vertices between front- and back-facing triangles, because each needs its own unique normals to render properly. It also means you can't share vertices where you need a sharp edge between triangles. Take a smooth cube, for example: it needs only 8 vertices and 12 triangles to render. If you want sharp, flat normals on the cube, you still only have 12 triangles, but you need 24 vertices.
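Splitting shared vertices so each triangle corner can carry its own normal is mechanical; a sketch in plain Python (illustrative only, names are mine):

```python
def split_vertices(verts, tris):
    # Duplicate vertices so every triangle corner is a unique vertex.
    # Each corner can then carry its own normal, which is what hard
    # edges and backfaces require.
    new_verts = [verts[i] for i in tris]
    new_tris = list(range(len(tris)))
    return new_verts, new_tris
```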


Now I see what you mean. That explains a lot of stuff!
Thanks for the help, everyone.


I also have this "problem": at render time I need to shade each face/triangle/polygon with 1 normal, not 3 normals…

Wouldn't it be possible to compute 3 unique normals for each triangle in a shader? I cannot find anything about it with Google so far…

I can use 4x more vertices and have separate vertices for each triangle… but that is more expensive to render.