How to remap a generated mesh's UVs based on a texture atlas in code

Hello all,

I’m combining meshes into one big mesh. This works. However, each mesh has a different material. So I combined each material into one texture atlas. This works.

However, now I have a big mesh with a texture that represents the combined materials of the original meshes' GameObjects.

How do I remap the UV of my combined mesh?

Here is my code for combining the meshes and building the texture atlas:

 void CombineMeshes()
{
     GameObject meshContainer = new GameObject("Level");
     meshContainer.AddComponent<MeshFilter>();
     meshContainer.AddComponent<MeshCollider>();
     meshContainer.AddComponent<MeshRenderer>();

   
     GameObject[] objectsInScene = FindObjectsByType<GameObject>(FindObjectsSortMode.None);
     foreach (GameObject obj in objectsInScene)
     {
         if(obj.layer == 3 && obj.transform.parent != null && obj.tag != "Entity")
         {
             obj.transform.parent = meshContainer.transform;
         }  
     }
     //////////////////////////////make texture atlas///////////////////////////////
     List<Texture2D> listOfTextures = new List<Texture2D> ();
     int t = 0;
     while(t < meshContainer.transform.childCount)
     {
         listOfTextures.Add(meshContainer.transform.GetChild(t).GetComponent<Renderer>().material.mainTexture as Texture2D);
         t++;  
     }



     Texture2D[] atlasTextures = listOfTextures.ToArray();

     Texture2D atlas = new Texture2D(8192, 8192);
     Rect[] rects = atlas.PackTextures(atlasTextures, 2, 8192);

     print(rects.Length);
    
     Material levelMaterial = new Material(meshContainer.transform.GetChild(0).GetComponent<Renderer>().material);
     levelMaterial.mainTexture = atlas;

     ///////////////////////////////////////////////////////////////////////////////////
    
     MeshFilter[] meshFilters = meshContainer.GetComponentsInChildren<MeshFilter>();
     CombineInstance[] combine = new CombineInstance[meshFilters.Length];

      int i = 1; // index 0 is the container's own (still empty) MeshFilter
     while (i < meshFilters.Length)
     {
         combine[i].mesh = meshFilters[i].sharedMesh;
         combine[i].transform = meshFilters[i].transform.localToWorldMatrix;
         meshFilters[i].gameObject.SetActive(false);

         i++;
     }


     Mesh mesh = new Mesh();
     mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
     mesh.CombineMeshes(combine, true);
     mesh.Optimize();
     //mesh.RecalculateBounds();

    


     meshContainer.transform.GetComponent<MeshFilter>().sharedMesh = mesh;
     meshContainer.GetComponent<MeshCollider>().sharedMesh = mesh;
   
    
     meshContainer.GetComponent<Renderer>().material = levelMaterial;
     meshContainer.transform.gameObject.SetActive(true);




     int m = 0;
     while(m < meshContainer.transform.childCount)
     {
         Destroy(meshContainer.transform.GetChild(m).gameObject);
         m++;
     }

     foreach(GameObject obj in objectsInScene)
     {
         if(obj.layer == 3 && obj.transform.parent == null && obj.tag != "Entity")
         {
             Destroy(obj.gameObject);
         }
     }

}

I’ve never really worked with meshes and uvs before, so this is new territory for me.

When using an atlas you can’t use certain texturing tricks. In particular, you can no longer rely on the texture wrap mode: since you no longer use a single texture, texture coordinates outside the 0 to 1 range, which would usually wrap around, would now reach into other sections of the atlas. So you cannot have a single large quad with a certain texture “tiled” across it without a custom shader.

Generally when you use a single texture, the bottom left corner has the UV coordinate (0, 0) while the top right corner has (1, 1). Your vertices are mapped to some position in that range to map certain parts of the texture to the geometry.

When you use an atlas, you just need to remap the 0 to 1 range on each axis (U and V) to the sub-area that your texture occupies in the atlas. For example, when the texture you used starts at (0.2, 0.3) and ends at (0.4, 0.5), it has a “size” of (0.2, 0.2) in UV space, so it’s still a square. Every “old” UV coordinate that was in the 0 to 1 range now needs to be mapped into the 0.2 to 0.4 range for U and the 0.3 to 0.5 range for V.

That’s simply done by taking the old coordinate, for example (0.123, 0.5), multiplying it by the “size” of your target tile ((0.2, 0.2) in our example), and finally adding the offset / start of the tile ((0.2, 0.3) in our example). So the mapped UV coordinate would be (0.2246, 0.4). In other words, we calculated

U = LowerU + U * (UpperU - LowerU)
V = LowerV + V * (UpperV - LowerV)

This is essentially just a Lerp with the lower and higher bounds as “A” and “B” and your original coordinate would be “t”.
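That mapping can be written as a small helper (a sketch; the method name is mine, and it assumes the incoming UVs already lie in the 0 to 1 range):

```csharp
using UnityEngine;

// Hypothetical helper: remaps a UV coordinate from the 0-1 range into the
// sub-rectangle ("tile") that one texture occupies inside the atlas.
static Vector2 RemapToAtlas(Vector2 uv, Rect tile)
{
    // Mathf.Lerp(a, b, t) == a + t * (b - a), exactly the formula above.
    return new Vector2(
        Mathf.Lerp(tile.xMin, tile.xMax, uv.x),
        Mathf.Lerp(tile.yMin, tile.yMax, uv.y));
}

// The worked example from above:
// RemapToAtlas(new Vector2(0.123f, 0.5f), new Rect(0.2f, 0.3f, 0.2f, 0.2f))
// yields (0.2246, 0.4)
```

Note that `Mathf.Lerp` clamps `t` to the 0 to 1 range; if your source UVs can fall outside that range, use `Mathf.LerpUnclamped` instead.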

If you need something like “tiling” / wrapping, you either need to split your geometry into those tiles and map each one to its tile texture, or you need some kind of custom shader. Here things get a lot more complicated: it requires extra information and may impose certain requirements / limitations on what the atlas can look like. When each tile in the atlas has the same size (so it’s a regular grid, like the classic Minecraft atlas) you could in theory write a shader that still uses the original UV coordinates, with the actual tile index stored in some other vertex information like the color channel or another UV channel. The shader would extract the tile index from that information, calculate the bounds of the tile that should be used, and do the mapping and “wrapping” directly in the shader on the fly. Of course the shader needs to know how many tiles are in the atlas (row and column count) in order to calculate the dimensions properly. Alternatively, when not using a regular grid, you could store the start and end UV of the tile in two additional UV coordinates (or one float4) and do the same thing I just mentioned.
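For the regular-grid case, the bounds the shader would derive from a tile index look roughly like this (written CPU-side in C# just for illustration; `tilesX` / `tilesY` are the assumed column and row counts of the atlas):

```csharp
using UnityEngine;

// Illustration only: given a tile index into a regular grid atlas,
// compute the UV rectangle of that tile. A shader would do the same
// math, plus frac() on the incoming UV, to get per-tile wrapping.
static Rect TileBounds(int tileIndex, int tilesX, int tilesY)
{
    float w = 1f / tilesX;        // width of one tile in UV space
    float h = 1f / tilesY;        // height of one tile in UV space
    int col = tileIndex % tilesX; // column in the grid
    int row = tileIndex / tilesX; // row in the grid
    return new Rect(col * w, row * h, w, h);
}
```

In the shader, sampling would then amount to something like `tileMin + frac(uv) * tileSize`, which wraps the coordinate inside the tile instead of spilling into neighbours.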

Of course using a custom shader means it’s a lot more difficult to integrate into the usual rendering pipeline. It all boils down to your requirements and which approach makes more sense. If you just have a few textures that rely on “wrapping”, the easiest solution may be to actually tile the geometry so the UVs can be remapped statically beforehand. If your game does a lot of dynamic stuff or needs actual texture wrapping, a custom shader may be the way to go.

It’s hard to tell how your game and geometry are set up from that screenshot. Though since it looks kind of blocky / Minecraft-like, I would assume the geometry is already tiled?

If you have trouble understanding how UV mapping works in general, I made this small WebGL example that shows how the 3 visible faces of a cube are mapped to the given texture. When you “select” a face you can actually change the UV coordinates of the 4 corners and directly see the result. Note that the actual texture has a 4x4 pattern. The dimmer area around the texture is outside the 0 to 1 range, and since the texture’s wrap mode is set to “repeat”, the texture simply repeats outside the 0 to 1 range. A texture’s wrap mode is either “repeat” or “clamp”. Clamp would simply clamp any values outside the 0 to 1 range to that range: any value smaller than 0 becomes 0 and any value larger than 1 becomes 1, so the outer border of the texture would be stretched. (Now that I think about it, it’s a pity that I didn’t include a toggle for the wrap mode ^^)
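For reference, the wrap mode is just a per-texture setting in Unity:

```csharp
using UnityEngine;

// Repeat: UVs outside 0-1 wrap around (the texture tiles).
// Clamp:  UVs outside 0-1 are clamped, stretching the border pixels.
Texture2D tex = new Texture2D(256, 256);
tex.wrapMode = TextureWrapMode.Repeat;  // or TextureWrapMode.Clamp

// The axes can also be controlled independently:
tex.wrapModeU = TextureWrapMode.Repeat;
tex.wrapModeV = TextureWrapMode.Clamp;
```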

Of course, when you build complex geometry in a modelling tool like Blender, Maya, etc., the unwrapping is usually done inside that software. “Unwrapping” is just the term for mapping each single triangle of the geometry to a certain section of the 2D texture. Since geometry is usually 3D, the triangles are usually cut at a UV seam and flattened out. Though as I said, how geometry is mapped to a texture completely depends on the texture and the geometry itself.


So, I need to get each UV coordinate in my new mesh and rearrange the positions? I’m a bit confused about where the 0.2 comes from.

Yes, you would have to remap the UVs yourself. In my example I mapped to an arbitrary area inside the texture. Here’s a visual representation of that area:
9798351--1406337--UVMap.png
As you may be able to see, the red area starts at (0.2, 0.3) at the lower left corner and goes up to (0.4, 0.5) at the top right corner. It’s still a square area, and that area has a side length of 0.2. You get that 0.2 by subtracting the lower bound from the upper bound: 0.4 - 0.2 == 0.2 and 0.5 - 0.3 == 0.2.
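In Unity’s `Rect` terms, that side length is just the width / height (a quick sketch using the example bounds):

```csharp
using UnityEngine;

// Rect takes (x, y, width, height). The example tile starts at
// (0.2, 0.3) and ends at (0.4, 0.5), so width = 0.4 - 0.2 = 0.2.
Rect tile = new Rect(0.2f, 0.3f, 0.2f, 0.2f);

// tile.xMin == 0.2f, tile.xMax == 0.4f, tile.width  == 0.2f
// tile.yMin == 0.3f, tile.yMax == 0.5f, tile.height == 0.2f
```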

As I said, for the remapping (assuming your original mesh used 0 to 1 coordinates) you could simply use Mathf.Lerp like this:

Rect tileArea; // assuming this is your tile UVs in your atlas
Vector2 uv; // assuming this is one of your original UV coordinates in 0-1 range

Vector2 newUV;
newUV.x = Mathf.Lerp(tileArea.xMin, tileArea.xMax, uv.x);
newUV.y = Mathf.Lerp(tileArea.yMin, tileArea.yMax, uv.y);

So if the original uv value for x / u was 0, you would map to xMin. If the original value was 1, you would map to xMax. Anything between 0 and 1 maps accordingly to a position between xMin and xMax.

As zulu3d mentioned, there are many ready-to-use solutions out there. You just need to understand their limits.

If you don’t know how your mesh(es) are actually unwrapped: a long time ago I made a UVViewer that can be used inside the Unity editor to view and inspect the UV maps of a mesh. Unity now has similar functionality built into the preview section of the mesh inspector, so when you inspect just the mesh, you can switch the preview to UVLayout. It just shows the basic layout; my viewer also lets you identify individual faces in 3D space and 2D texture space. This is how Unity’s preview shows the second UV channel of the default capsule mesh:
9798351--1406346--upload_2024-4-26_18-45-24.png


This helps a lot. I’m trying to use:

        Vector2[] UVs = mesh.uv;

        for(int u = 0;  u < UVs.Length; u++)
        {
            Rect tileArea;
            Vector2 uv = UVs[u];

            Vector2 newUV;
            newUV.x = Mathf.Lerp(tileArea.xMin, tileArea.xMax, uv.x);
            newUV.y = Mathf.Lerp(tileArea.yMin, tileArea.yMax, uv.y);
        }

But I don’t know what to set tileArea to.

And my mesh looks like this:
9798390--1406349--upload_2024-4-26_10-9-51.png

Well, this is way too small to see anything, especially with a mesh that has 36k triangles. That bright square in the middle is essentially all your triangles, and it looks like they are all mapped to the 0-1 range, which is a good first step :slight_smile:

Well, that depends on how you create your atlas texture. You just need to know “where” in the atlas you put that texture. That “where” is your rect. Keep in mind that texture coordinates for the atlas are also in the 0 to 1 range. However since the texture you want to show for this mesh only fills a smaller portion of that whole atlas, you have to map the UV coordinates to that smaller portion. That’s your “tile area”.

Keep in mind you have to do this for each mesh you want to combine individually since each sub mesh essentially belongs to a different texture in the atlas.

If you still don’t get the “concept”, you may have another look at my WebGL toy: press the “2” or “1” button and see where the face is mapped to. Just imagine that each of the 4x4 sections in the texture is a separate texture in an atlas. So when you want to map a face to one of those sub-sections, you have to change the UV coordinates of the quad corners so they are mapped to the region you want to show on the quad / triangles.

Of course more complex meshes usually have a more complex UV map. That’s why we have to map all coordinates in proportion to the bounding coordinates.

Note that the “PackTextures” method you’re using returns exactly those regions where the textures have been “packed”. Each Rect in the returned array matches the corresponding texture in the textures array you pass in, so you have to remember which texture belongs to which mesh. You probably have to fix the UVs before you call CombineMeshes, because after they’re combined you will have trouble distinguishing which vertices belonged to which mesh.
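Putting those pieces together, the remap step would look roughly like this (a sketch, not your exact code; it assumes `meshFilters[i]`’s material supplied `atlasTextures[i]`, so `rects[i]` is its tile):

```csharp
using UnityEngine;

// Sketch: remap each child mesh's UVs into its atlas rect *before*
// combining, using the Rect array returned by Texture2D.PackTextures.
// Assumes meshFilters and rects are index-aligned (same texture order).
static void RemapAllUVs(MeshFilter[] meshFilters, Rect[] rects)
{
    for (int i = 0; i < meshFilters.Length; i++)
    {
        Mesh m = meshFilters[i].mesh; // per-object instance, safe to modify
        Rect tile = rects[i];         // where this mesh's texture landed
        Vector2[] uvs = m.uv;
        for (int u = 0; u < uvs.Length; u++)
        {
            uvs[u].x = Mathf.Lerp(tile.xMin, tile.xMax, uvs[u].x);
            uvs[u].y = Mathf.Lerp(tile.yMin, tile.yMax, uvs[u].y);
        }
        m.uv = uvs;                   // write the remapped UVs back
    }
}
```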

I’ve done a fair amount of fiddling and no matter how good your intentions and pre-calculations are, these things always take some trial and error.

I suggest switching between various “test textures,” such as ABCD textures like these:

8159999--1060814--abcd.png

9798663--1406403--abcd_transparent.png

You can get lots of other test UV patterns to help you figure out what is going on.

But I like the murky blue motif of whatever you got going on up there… :slight_smile:

True, but in the end he just uses Unity’s simple PackTextures packer to create the atlas, so it’s all just a matter of keeping track of the right references.


Ok! It works!

        for (int childMesh = 0;  childMesh < meshContainer.transform.childCount; childMesh++)
        {
            Vector2[] UVs = meshContainer.transform.GetChild(childMesh).gameObject.GetComponent<MeshFilter>().mesh.uv;
           
            Rect tileArea = rects[childMesh]; //pick the rect that is the correct UV for the original child
            for(int u = 0; u < UVs.Length; u++)
            {
                Vector2 uv = UVs[u];

                Vector2 newUV;
                newUV.x = Mathf.Lerp(tileArea.xMin, tileArea.xMax, uv.x);
                newUV.y = Mathf.Lerp(tileArea.yMin, tileArea.yMax, uv.y);
             
                UVs[u] = newUV;
            }

            meshContainer.transform.GetChild(childMesh).gameObject.GetComponent<MeshFilter>().mesh.uv = UVs;


        }

Thank you! A million Thank yous! This knowledge will be very useful in the future!

This is the entire code for future reference if I ever have this problem again or anyone else does:

    void CombineMeshes()
    {
        GameObject meshContainer = new GameObject("Level");
        meshContainer.AddComponent<MeshFilter>();
        meshContainer.AddComponent<MeshCollider>();
        meshContainer.AddComponent<MeshRenderer>();

      
        GameObject[] objectsInScene = FindObjectsByType<GameObject>(FindObjectsSortMode.None);
        foreach (GameObject obj in objectsInScene)
        {
            if(obj.layer == 3 && obj.transform.parent != null && obj.tag != "Entity")
            {
                obj.transform.parent = meshContainer.transform;
            }  
        }
        //////////////////////////////make texture atlas///////////////////////////////
        List<Texture2D> listOfTextures = new List<Texture2D> ();
        int t = 0;
        while(t < meshContainer.transform.childCount)
        {
            listOfTextures.Add(meshContainer.transform.GetChild(t).GetComponent<Renderer>().material.mainTexture as Texture2D);
            t++;  
        }



        Texture2D[] atlasTextures = listOfTextures.ToArray();

        Texture2D atlas = new Texture2D(8192, 8192);
        Rect[] rects = atlas.PackTextures(atlasTextures, 2, 8192);

        print(rects.Length);
       
        Material levelMaterial = new Material(meshContainer.transform.GetChild(0).GetComponent<Renderer>().material);
        levelMaterial.mainTexture = atlas;

        ///////////////////////////////////////////////////////////////////////////////////

        for (int childMesh = 0;  childMesh < meshContainer.transform.childCount; childMesh++)
        {
            Vector2[] UVs = meshContainer.transform.GetChild(childMesh).gameObject.GetComponent<MeshFilter>().mesh.uv;
           
            Rect tileArea = rects[childMesh]; //pick the rect that is the correct UV for the original child
            for(int u = 0; u < UVs.Length; u++)
            {
                Vector2 uv = UVs[u];

                Vector2 newUV;
                newUV.x = Mathf.Lerp(tileArea.xMin, tileArea.xMax, uv.x);
                newUV.y = Mathf.Lerp(tileArea.yMin, tileArea.yMax, uv.y);
              
                UVs[u] = newUV;
            }

            meshContainer.transform.GetChild(childMesh).gameObject.GetComponent<MeshFilter>().mesh.uv = UVs;


        }





        MeshFilter[] meshFilters = meshContainer.GetComponentsInChildren<MeshFilter>();
        CombineInstance[] combine = new CombineInstance[meshFilters.Length];

        int i = 1; // index 0 is the container's own (still empty) MeshFilter
        while (i < meshFilters.Length)
        {
            combine[i].mesh = meshFilters[i].sharedMesh;
            combine[i].transform = meshFilters[i].transform.localToWorldMatrix;
            meshFilters[i].gameObject.SetActive(false);

            i++;
        }


        Mesh mesh = new Mesh();
        mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;
        mesh.CombineMeshes(combine, true);
        mesh.Optimize();
        //mesh.RecalculateBounds();





        meshContainer.transform.GetComponent<MeshFilter>().sharedMesh = mesh;
        meshContainer.GetComponent<MeshCollider>().sharedMesh = mesh;
      
       
        meshContainer.GetComponent<Renderer>().material = levelMaterial;
        meshContainer.transform.gameObject.SetActive(true);




        int m = 0;
        while(m < meshContainer.transform.childCount)
        {
            Destroy(meshContainer.transform.GetChild(m).gameObject);
            m++;
        }

        foreach(GameObject obj in objectsInScene)
        {
            if(obj.layer == 3 && obj.transform.parent == null && obj.tag != "Entity")
            {
                Destroy(obj.gameObject);
            }
        }

    }

9798864--1406478--upload_2024-4-26_15-54-29.jpg
