Complete Information on Mesh Scripting

I can create meshes from scratch; Unity basically gives you all you need. However, I find it difficult to texture them and such, and I don’t quite understand all the elements of a mesh. I tried to Google this information, but nothing specific really came up. Also, I just want to be clear that I don’t want to be told how to texture what I make. I want to understand what I am doing, so I can do it myself.

My question is: can anyone explain the elements (what exactly are normals, tangents, etc.), or point me in the direction of a good resource for all this information? Preferably one source for everything (a single source usually keeps things more consistent, making it easier to understand).

Thanks in advance.

My usage pattern for making my own meshes in Unity is this:

Make some lists: (the code below is just typed, sorry for errors)

var verts = new List<Vector3>();
var tris = new List<int>();
var uvs = new List<Vector2>();

Then I iterate and fabricate the geometry.

Finally I get the mesh and fill it up:

var mesh = gameObject.GetComponent<MeshFilter>().mesh;
mesh.vertices = verts.ToArray();
mesh.triangles = tris.ToArray();
mesh.uv = uvs.ToArray();

And finally I call .RecalculateNormals() on the mesh.
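Putting the pieces above together, here is a minimal sketch of the whole pattern. The class and method names are my own invention, and the quad dimensions are arbitrary:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: builds a single quad (two triangles) at runtime.
// Attach to a GameObject that also has a MeshRenderer with a material.
[RequireComponent(typeof(MeshFilter))]
public class QuadBuilder : MonoBehaviour
{
    void Start()
    {
        var verts = new List<Vector3>();
        var tris  = new List<int>();
        var uvs   = new List<Vector2>();

        // Four corners of a 1x1 quad in the XY plane.
        verts.Add(new Vector3(-0.5f, -0.5f, 0f)); // 0: bottom-left
        verts.Add(new Vector3( 0.5f, -0.5f, 0f)); // 1: bottom-right
        verts.Add(new Vector3(-0.5f,  0.5f, 0f)); // 2: top-left
        verts.Add(new Vector3( 0.5f,  0.5f, 0f)); // 3: top-right

        // Two triangles; indices wind clockwise as seen from the front.
        tris.AddRange(new[] { 0, 2, 1,   1, 2, 3 });

        // One UV per vertex, in the same order as the vertex list.
        uvs.Add(new Vector2(0f, 0f));
        uvs.Add(new Vector2(1f, 0f));
        uvs.Add(new Vector2(0f, 1f));
        uvs.Add(new Vector2(1f, 1f));

        var mesh = GetComponent<MeshFilter>().mesh;
        mesh.Clear();
        mesh.vertices  = verts.ToArray();
        mesh.triangles = tris.ToArray();
        mesh.uv        = uvs.ToArray();
        mesh.RecalculateNormals();
    }
}
```

Changing the values in `verts` and `uvs` and watching what happens is exactly the kind of experiment that makes all this click.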

Use that setup and experiment: just tinker with a simple quad (two triangles) and see how changing the values changes what you see, and it will soon be apparent what is going on. You will learn more that way than from any explanation you read anywhere. :slight_smile:

Sorry your Google search didn’t work. Try:

Normal. A triangle’s normal is the vector perpendicular to its surface. If you compute the normal for each triangle in a mesh and then average the normals at all the shared vertices (so, if six triangles all meet at a vertex, add the six triangle normals together, divide by 6, then normalise), you get a smooth per-vertex normal.
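That averaging can be sketched in code; this is a hand-rolled version of what Unity’s `Mesh.RecalculateNormals()` does for you (the helper name is mine):

```csharp
using UnityEngine;

public static class NormalUtil
{
    // Sketch of smooth vertex normals: each vertex normal is the
    // normalised sum of the face normals of every triangle that uses it.
    public static Vector3[] ComputeSmoothNormals(Vector3[] verts, int[] tris)
    {
        var normals = new Vector3[verts.Length];
        for (int i = 0; i < tris.Length; i += 3)
        {
            int a = tris[i], b = tris[i + 1], c = tris[i + 2];
            // Face normal: cross product of two edges of the triangle.
            Vector3 faceNormal = Vector3.Cross(verts[b] - verts[a],
                                               verts[c] - verts[a]);
            // Accumulate onto each of the triangle's three vertices.
            normals[a] += faceNormal;
            normals[b] += faceNormal;
            normals[c] += faceNormal;
        }
        // Normalising at the end is equivalent to
        // "divide by the count, then normalise".
        for (int i = 0; i < normals.Length; i++)
            normals[i] = normals[i].normalized;
        return normals;
    }
}
```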

Tangent. A point on a sphere doesn’t have a single tangent; it can have lots of tangents. So often two tangent axes are computed, using the texture coordinates to define the two directions.

UV mapping.

Thanks Graham. I usually tend to avoid Wikipedia when I am looking for a clear understanding of something. To put it a better way, I find it has too much information clustered together, making it difficult to follow, or it’s just not the information I am looking for. So I try to find other sources that describe things in layman’s terms but still in depth, or at least something broken down a bit better, with images. I know this may sound silly, but I learn better with images than I do with pure text.

Anyway, let’s see my understanding so far after reading the wiki.

Normal:

  • used to determine a color from the angle between the normal and a light source (known as flat shading)
  • used for Phong shading (makes a flat-looking object appear more rounded)

Basically it’s a vector used for multiple types of calculations involving its orientation?

Tangent: As far as I can tell, it is just used to calculate the normal? I can see why you mentioned spheres don’t have one specifically, since every tri would have a different one, if I understand this correctly.

UV: This one still eludes me. Take a quad, for example. (I’ll use 2D for this example.)

h = height, w = width
Vectors:
(-w, -h)
(w, -h)
(-w, h)
(w, h)

and UV’s:
(0, 0)
(1, 0)
(0, 1)
(1, 1)

respectively.
The image shows up correctly, but if I swap the UVs to be
(1, 0)----swapped
(0, 0)----swapped
(0, 1)
(1, 1)

I get some real funky results. I was hoping I could get a better explanation as to how the UV maps a standard image to a Quad. What is each UV coord actually doing? This would allow me to texture more intricate meshes.

Thanks.

Yep. This is correct.

I bet! You’ve taken just one side of the quad (the bottom side, the first two vertices in your list) and swapped the UVs there. What a mess that must be!

Picture it this way: the texture is an infinitely stretchable sheet that you pin to the geometry of the quad. The UV coordinates tell it what part of the texture to pin to each vertex. In the original mapping, the bottom-left corner of the quad is pinned to the bottom-left corner of the texture (U=0 means the left side of the texture; V=0 means the bottom, since Unity’s UV origin is the bottom-left corner). The bottom-right corner of the quad is pinned to the bottom-right corner, and so on. Everything nice and neat.

But then you took and swapped the UV coordinates on the bottom side. So now, the bottom-left corner of the quad is pinned to the bottom-right corner of the texture, and vice versa. If you had also swapped things on the top, your texture would simply be mirrored. But you didn’t, so the texture gets stretched and squashed, because it’s flipped on the bottom but not on the top.

But there’s nothing mysterious going on here… the UV coordinates simply tell Unity: for each vertex in the mesh, what part of the texture should be there? And these texture coordinates go from 0 to 1, both horizontally and vertically. (You can go outside this range, and the texture will either wrap around or be pinned to whatever color is on the edge, depending on the material’s texture wrap mode: Repeat or Clamp.)
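One way to make the 0-to-1 idea concrete: to find which texel a UV coordinate lands on, just scale it by the texture size. A sketch (plain arithmetic, not a real Unity API; the helper name is mine):

```csharp
static class UvMath
{
    // Map a normalized UV coordinate onto a width x height texture.
    // In Unity's convention, (0,0) is the bottom-left texel
    // and (1,1) the top-right.
    public static (int x, int y) UvToTexel(float u, float v,
                                           int width, int height)
    {
        int x = (int)(u * (width - 1));
        int y = (int)(v * (height - 1));
        return (x, y);
    }
}
// e.g. on a 256x256 texture, UV (0.5f, 0.5f) lands on texel (127, 127),
// and UV (1f, 1f) lands on (255, 255).
```

The GPU does the same thing per pixel, interpolating the UVs between vertices across each triangle, which is why swapping two corner UVs smears the texture instead of just moving it.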

HTH,

  • Joe

Oh, I see, so it’s 0 to 1; I didn’t realize it was a normalized value. So if I understand correctly, for a rectangle with 6 vertices, the 2 middle verts would be (0, 0.5) on the left side and (1, 0.5) on the right side, to have the texture stretch across the entire mesh, right?

Yep, I think you’ve got it!
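For reference, that six-vertex rectangle could be laid out like this (a sketch, assuming `using UnityEngine;` and arbitrary dimensions):

```csharp
// A tall rectangle built from two stacked quads:
// 6 vertices, 4 triangles, texture stretched once over the whole mesh.
Vector3[] verts =
{
    new Vector3(-0.5f, -1f, 0f), // 0: bottom-left
    new Vector3( 0.5f, -1f, 0f), // 1: bottom-right
    new Vector3(-0.5f,  0f, 0f), // 2: middle-left
    new Vector3( 0.5f,  0f, 0f), // 3: middle-right
    new Vector3(-0.5f,  1f, 0f), // 4: top-left
    new Vector3( 0.5f,  1f, 0f), // 5: top-right
};
Vector2[] uvs =
{
    new Vector2(0f, 0f),   // bottom-left of the texture
    new Vector2(1f, 0f),   // bottom-right
    new Vector2(0f, 0.5f), // middle-left, just as you guessed
    new Vector2(1f, 0.5f), // middle-right
    new Vector2(0f, 1f),   // top-left
    new Vector2(1f, 1f),   // top-right
};
// Four triangles, wound clockwise as seen from the front.
int[] tris = { 0, 2, 1,   1, 2, 3,   2, 4, 3,   3, 4, 5 };
```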