Calculating the normal of a Triangle

Hello, I have a simple problem but I can’t figure out how to fix it.

I have a plane in my scene where I want to display the normals of the vertices and the normal at the center of each triangle. The normals on the vertices are displayed correctly, but somehow I am miscalculating the normal of the triangles.
Here is an image. The yellow lines indicate the normal of each vertex, which points in the right direction.
The blue line indicates the normal of the triangle from its center, but somehow these lines point in the wrong direction (I guess). The start position is correct (triangleCenter), but the direction of the normal seems wrong. Any idea what I am doing wrong here? The blue lines should point from the center of the triangle in the same direction as the normals of the vertices.

In the code below, the direction from the center of the triangle is drawn:

 faceNormalWP = (n1wp + n2wp + n3wp) / 3;
[...]
Debug.DrawLine(centerWP, centerWP + faceNormalWP.normalized, Color.blue, 999);

Even if I try to calculate the normal via the cross product I get the same result.

faceNormalWP = transform.TransformPoint(Vector3.Cross(n2 - n1, n3 - n1)).normalized;
private void ShowVerticeNormals()
    {
        Vector3 p1, p2, p3, n1, n2, n3, center, centerWP, faceNormal, faceNormalWP, n1wp,n2wp,n3wp;
        for (int j = 0; j < triangles.Length; j += 3)
        {
            p1 = vertices[triangles[j]];
            p2 = vertices[triangles[j + 1]];
            p3 = vertices[triangles[j + 2]];

            // p1 = transform.TransformPoint(vertices[triangles[j]]);
            // p2 = transform.TransformPoint(vertices[triangles[j + 1]]);
            // p3 = transform.TransformPoint(vertices[triangles[j + 2]]);

            center = (p1 + p2 + p3) / 3;
            centerWP = transform.TransformPoint((p1 + p2 + p3) / 3);

            n1 = normals[triangles[j]];
            n2 = normals[triangles[j + 1]];
            n3 = normals[triangles[j + 2]];
            n1wp = transform.TransformPoint(n1);
            n2wp = transform.TransformPoint(n2);
            n3wp = transform.TransformPoint(n3);

            faceNormal = ((n1 + n2 + n3) / 3);

            faceNormalWP = (n1wp + n2wp + n3wp) / 3;

            // faceNormalWP = transform.TransformPoint(Vector3.Cross(n2 - n1, n3 - n1)).normalized;

            // Debug.DrawLine(centerWP, centerWP + faceNormalWP * 0.4f, Color.blue, 999);

            // Debug.DrawLine(transform.TransformPoint(p1), transform.TransformPoint(p1 + n1 * 3.4f), Color.blue, 999);
            Debug.DrawLine(transform.TransformPoint(p1), transform.TransformPoint(p1 + n1), Color.yellow, 999);
            Debug.DrawLine(transform.TransformPoint(p2), transform.TransformPoint(p2 + n2), Color.yellow, 999);
            Debug.DrawLine(transform.TransformPoint(p3), transform.TransformPoint(p3 + n3), Color.yellow, 999);
            // Debug.DrawLine(centerWP, centerWP + transform.TransformPoint(n1).normalized, Color.blue, 999);
            // Debug.DrawLine(centerWP, centerWP + transform.TransformPoint(n2).normalized, Color.blue, 999);
            // Debug.DrawLine(centerWP, centerWP + transform.TransformPoint(n3).normalized, Color.blue, 999);
            Debug.DrawLine(centerWP, centerWP + faceNormalWP.normalized, Color.blue, 999);
            Debug.DrawLine(transform.position, transform.position + transform.up, Color.cyan, 999);

            surfaceNormals[j / 3] = faceNormal;

        }
    }


Make a one-triangle mesh and debug what you have going. There’s a ton of duplicated code above.

Remember that everything in a Mesh will be in local coordinates.

You must find a way to get the information you need in order to reason about what the problem is.

Once you understand what the problem is, you may begin to reason about a solution to the problem.

What is often happening in these cases is one of the following:

  • the code you think is executing is not actually executing at all
  • the code is executing far EARLIER or LATER than you think
  • the code is executing far LESS OFTEN than you think
  • the code is executing far MORE OFTEN than you think
  • the code is executing on another GameObject than you think it is
  • you’re getting an error or warning and you haven’t noticed it in the console window

To help gain more insight into your problem, I recommend liberally sprinkling Debug.Log() statements through your code to display information in realtime.

Doing this should help you answer these types of questions:

  • is this code even running? which parts are running? how often does it run? what order does it run in?
  • what are the values of the variables involved? Are they initialized? Are the values reasonable?
  • are you meeting ALL the requirements to receive callbacks such as triggers / colliders (review the documentation)

Knowing this information will help you reason about the behavior you are seeing.

You can also supply a second argument to Debug.Log() and when you click the message, it will highlight the object in scene, such as Debug.Log("Problem!",this);

If your problem would benefit from in-scene or in-game visualization, Debug.DrawRay() or Debug.DrawLine() can help you visualize things like rays (used in raycasting) or distances.

You can also call Debug.Break() to pause the Editor when certain interesting pieces of code run, and then study the scene manually, looking for all the parts, where they are, what scripts are on them, etc.

You can also call GameObject.CreatePrimitive() to emplace debug-marker-ish objects in the scene at runtime.

You could also just display various important quantities in UI Text elements to watch them change as you play the game.

If you are running a mobile device you can also view the console output. Google for how on your particular mobile target, such as this answer or iOS: How To - Capturing Device Logs on iOS or this answer for Android: How To - Capturing Device Logs on Android

If you are working in VR, it might be useful to make your own onscreen log output, or integrate one from the asset store, so you can see what is happening as you operate your software.

Another useful approach is to temporarily strip out everything besides what is necessary to prove your issue. This can simplify and isolate compounding effects of other items in your scene or prefab.

Here’s an example of putting in a laser-focused Debug.Log() and how that can save you a TON of time wallowing around speculating what might be going wrong:

When in doubt, print it out!™

Note: the print() function is an alias for Debug.Log() provided by the MonoBehaviour class.

Your problem is this:

            n1wp = transform.TransformPoint(n1);
            n2wp = transform.TransformPoint(n2);
            n3wp = transform.TransformPoint(n3);

TransformPoint calculates the world space “position” of the provided local space position. Position means it includes the object’s offset and orientation in space. However, each of those normals would then be relative to the object’s pivot point and not to the individual vertices. So your normal would be pointing out from the pivot.

Your faceNormal calculation is correct, using the local space normals. However, using TransformPoint doesn’t make much sense here. You may want to use TransformDirection, or use the object’s rotation Quaternion to rotate the normal into world space.

Keep in mind that a normal vector has no relation to a position. It’s just a direction vector. You can add it to a position when you want to visualize it.
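To illustrate the difference, here is a minimal Python sketch (not Unity code; the 90° rotation and the offset are made up) modeling what TransformDirection and TransformPoint do to the same local normal. Transforming the normal as a point bakes the object’s position into the result, while transforming it as a direction applies only the rotation:

```python
# Hypothetical object transform: rotated 90 degrees around the Z axis, moved to x = 5.
# rotate() models TransformDirection; to_world_point() models TransformPoint.
OFFSET = (5.0, 0.0, 0.0)

def rotate(v):
    """Rotate a vector 90 degrees around Z: (x, y, z) -> (-y, x, z)."""
    return (0.0 - v[1], v[0], v[2])

def to_world_point(p):
    """Rotate, then add the object's world offset (a point transform)."""
    r = rotate(p)
    return (r[0] + OFFSET[0], r[1] + OFFSET[1], r[2] + OFFSET[2])

local_normal = (0.0, 1.0, 0.0)

as_direction = rotate(local_normal)      # correct world space normal
as_point = to_world_point(local_normal)  # wrong: the offset is baked in

print(as_direction)  # (-1.0, 0.0, 0.0)
print(as_point)      # (4.0, 0.0, 0.0) -- points somewhere else entirely
```

Used as a direction, the second result points from the world origin toward the tip of the normal placed at the object’s pivot, which is exactly the bug described above.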

Thank you for the detailed response.
Yes, there is plenty of code duplication because I was trying out different approaches.
The code execution is fine. This is just a simple method call to display the normals, which works for the vertices of the triangles (yellow lines). The problem is that I somehow miscalculated the normal for the triangle itself.
Here is a cleaned-up example which finally works (it took longer than it should have):

private void ShowVerticeNormals()
    {
        Vector3 p1, p2, p3, n1, n2, n3, center, faceNormal;
        for (int j = 0; j < triangles.Length; j += 3)
        {
            // Get Vertices for each triangle from the Mesh
            p1 = vertices[triangles[j]];
            p2 = vertices[triangles[j + 1]];
            p3 = vertices[triangles[j + 2]];

            // Corresponding normals
            n1 = normals[triangles[j]];
            n2 = normals[triangles[j + 1]];
            n3 = normals[triangles[j + 2]];

            // Draw a line from each vertex in world space along the vertex normal
            Debug.DrawLine(transform.TransformPoint(p1), transform.TransformPoint(p1 + n1), Color.yellow, 999);
            Debug.DrawLine(transform.TransformPoint(p2), transform.TransformPoint(p2 + n2), Color.yellow, 999);
            Debug.DrawLine(transform.TransformPoint(p3), transform.TransformPoint(p3 + n3), Color.yellow, 999);

            // Draw the exact same direction, but from the center of the triangle instead of the vertex
            // Calculate the center of the triangle to use as the start position for the ray
            center = (p1 + p2 + p3) / 3;

            // Calculate the triangle normal via the cross product of its edges, then normalize it
            faceNormal = Vector3.Cross(p2 - p1, p3 - p1).normalized;

            // Blue line starts at the center of the triangle (WorldSpace) towards the triangle normal
            Debug.DrawLine(transform.TransformPoint(center), transform.TransformPoint(center + faceNormal), Color.blue, 999);
        }
    }

One error was that I was calculating the triangle normals from the normals array instead of from the vertices array.
However, I don’t understand why this won’t work:

            centerWP = transform.TransformPoint((p1 + p2 + p3) / 3);
            faceNormalWP = transform.TransformPoint(Vector3.Cross(p2 - p1, p3 - p1).normalized);
            Debug.DrawLine(centerWP, centerWP + faceNormalWP, Color.blue, 999);

but this does

            center = (p1 + p2 + p3) / 3;
            faceNormal = Vector3.Cross(p2 - p1, p3 - p1).normalized;
            Debug.DrawLine(transform.TransformPoint(center), transform.TransformPoint(center + faceNormal), Color.red, 999);

The transformation to world space has to be done after adding faceNormal to the center coordinate.
It won’t work if the center and the normal are each transformed separately with TransformPoint.
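Why the working version works can be checked with a little arithmetic. Below is a minimal Python sketch (not Unity code; `R` and `T` are a made-up rotation and offset standing in for the object’s transform). Drawing from TransformPoint(center) to TransformPoint(center + n) gives the segment R(center + n) + T − (R(center) + T) = R(n), so the offset T cancels; transforming the normal as its own point leaves an extra T in the direction:

```python
# Model TransformPoint as p -> R(p) + T, with R a 90-degree rotation around Z
# and T a made-up world offset. Pure Python, no Unity required.
T = (5.0, 0.0, 0.0)

def R(v):
    return (0.0 - v[1], v[0], v[2])

def transform_point(p):
    r = R(p)
    return (r[0] + T[0], r[1] + T[1], r[2] + T[2])

def add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

center = (1.0, 2.0, 0.0)
n = (0.0, 0.0, 1.0)

# Working version: transform (center + n) as one point, then subtract the start.
start = transform_point(center)
end = transform_point(add(center, n))
print(sub(end, start))  # equals R(n): the offset T cancels out

# Broken version: transform the normal as its own point and use it as an offset.
print(sub(add(start, transform_point(n)), start))  # R(n) + T: off by the offset
```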

Nevertheless, the case should be closed now.

Again, you used TransformPoint on the normal vector, which makes no sense at all. A normal is not a point in space but a direction. TransformPoint rotates it and adds the object’s offset to it. You then treat the result like a direction again, which gives a completely wrong direction, because what you actually get is a vector from the world origin (0,0,0) to the tip of the normal when placed at the object’s origin. You use that position as a direction.

As I said, you should either:

  • Calculate the averaged normal from the vertex normals in object / local space and then use TransformDirection on the averaged local space normal to get a world space normal vector.
  • Transform the vertex normals into world space by using TransformDirection and then average them in world space.
  • Calculate the normal from the actual world space positions using the cross product. Since we are already in world space, the resulting vector is also in world space.
  • Calculate the local space normal from the local space corner positions using the cross product. After that use TransformDirection to transform it into world space.

Instead of TransformDirection you can also use transform.rotation to rotate a direction from local space into world space. Note that TransformDirection only takes the rotation into account; it is not affected by the object’s scale or position, so it does literally the same as using the rotation Quaternion. TransformVector, on the other hand, is affected by the object’s scale but not its position: a direction vector gets rotated and scaled, but not moved around. This is only necessary in very rare special cases, usually when you have a local offset vector that you need to keep relative but transformed into world space. Finally there’s TransformPoint, which you already used: it transforms an actual position from local space to world space and is affected by all three, rotation, scale and position.
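The three methods can be summarized in a few lines. Here is a Python sketch (not Unity code; the scale, rotation and offset are made-up example values) modeling them as TransformDirection ≈ R·v, TransformVector ≈ R·S·v, and TransformPoint ≈ R·S·v + T:

```python
# Model a transform with uniform scale 2, a 90-degree rotation around Z,
# and a world position of (5, 0, 0).
SCALE = 2.0
T = (5.0, 0.0, 0.0)

def R(v):
    """90-degree rotation around Z: (x, y, z) -> (-y, x, z)."""
    return (0.0 - v[1], v[0], v[2])

def transform_direction(v):  # rotation only
    return R(v)

def transform_vector(v):     # rotation + scale (uniform scale, so order is irrelevant)
    return R((v[0] * SCALE, v[1] * SCALE, v[2] * SCALE))

def transform_point(v):      # rotation + scale + position
    w = transform_vector(v)
    return (w[0] + T[0], w[1] + T[1], w[2] + T[2])

v = (1.0, 0.0, 0.0)
print(transform_direction(v))  # (0.0, 1.0, 0.0): rotated only
print(transform_vector(v))     # (0.0, 2.0, 0.0): rotated and scaled
print(transform_point(v))      # (5.0, 2.0, 0.0): rotated, scaled and offset
```

For a unit normal you want only the rotation, which is why TransformDirection (or `transform.rotation * v`) is the right tool here.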

So this should work just fine:

            n1 = normals[triangles[j]];
            n2 = normals[triangles[j + 1]];
            n3 = normals[triangles[j + 2]];

            Vector3 localSpaceNormal = (n1+n2+n3).normalized;
            Vector3 worldSpaceNormal = transform.TransformDirection(localSpaceNormal);

As I said, instead of the last line where we used TransformDirection, we could have used the rotation Quaternion like this:

            Vector3 worldSpaceNormal = transform.rotation * localSpaceNormal;

Note that dividing by 3 is unnecessary since you should normalize the result anyway. The average is not 1 unit long in most cases. Just think of two unit-length vectors that are 90° apart: adding them and dividing by 2 does not give you a length of 1. The sum has a length of sqrt(2) (about 1.414), and dividing by 2 gives about 0.707, not 1. So normalizing is almost always necessary, at least when you need a unit normal vector.
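The arithmetic is easy to verify in plain Python (vectors as tuples, no Unity needed):

```python
import math

a = (1.0, 0.0, 0.0)  # two unit vectors, 90 degrees apart
b = (0.0, 1.0, 0.0)

s = (a[0] + b[0], a[1] + b[1], a[2] + b[2])  # their sum: (1, 1, 0)
length = math.sqrt(s[0] ** 2 + s[1] ** 2 + s[2] ** 2)

print(length)      # ~1.414 (sqrt(2)), not 2
print(length / 2)  # ~0.707: averaging does not give unit length

normalized = tuple(c / length for c in s)
print(math.sqrt(sum(c * c for c in normalized)))  # ~1.0 after normalizing
```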

Note that in the working case your faceNormal is in local space, not in world space. You then use TransformPoint when drawing the line between the world space points, which of course draws the line correctly; however, the normal vector itself is still in local space that way.

If you have trouble understanding coordinate systems and how they relate to each other, you may want to brush up on your linear algebra :slight_smile: Working with coordinate systems requires some spatial thinking, and some people (like my mum) have difficulties with that, at least when it comes to pure mental visualization; when it’s drawn out it’s usually easier. So I recommend taking an actual pen and paper and drawing it out if you have trouble. If you want to better understand the math behind all this, I can recommend the 3b1b series on linear algebra. Unity essentially abstracts the matrix transformations away behind the Transform component, but the series generally helps with the understanding.

Hey, sorry for the late response, and big thanks for the explanation. I worked through your comment; that was really helpful. There are still some gaps, but most of it makes sense now.

Yep, I had visualized the transformation of the direction wrong.
As you said, the normal is a direction, not a point.

I am already re-watching the 3b1b linear algebra series.