Memory leak with procedural generation

Hi, I’m creating a procedurally generated world with chunks.
In my Update method, I start a coroutine for each new chunk:

private void Update()
{
    if (newChunkList.Count != 0)
    {
        // Hand pending chunks off to coroutines while worker slots are free.
        foreach (Chunk chunk in newChunkList.ToList())
        {
            if (threadCount < maxThreads)
            {
                newChunkList.Remove(chunk);
                StartCoroutine(LoadChunk(chunk));
            }
        }
    }
}

And in this coroutine, I start a thread in which the chunk is generated:

IEnumerator LoadChunk(Chunk chunk)
{
    threadCount++;

    bool done = false;
    Thread thread = new Thread(() =>
    {
        chunk.CreateShape();
        done = true;
    })
    {
        Priority = System.Threading.ThreadPriority.BelowNormal
    };

    thread.Start();

    while (!done)
        yield return 0;

    threadCount--;
    chunk.UpdateMesh();
}
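A side note on this pattern: `done` is written by the worker thread and polled by the main thread with no synchronization. Unity's runtime tends to be forgiving here in practice, but in standard .NET the field should be `volatile` so the polling loop is guaranteed to see the write. Below is a minimal, self-contained sketch of the same handshake; `ChunkJob` and the stand-in work are illustrative names, not from the project:

```csharp
using System.Threading;

class ChunkJob
{
    // Written by the worker thread, polled by the caller. `volatile`
    // guarantees the polling loop observes the write; without it the
    // JIT is allowed to cache the field in a register.
    private volatile bool done;
    private int result;

    public int Run()
    {
        var thread = new Thread(() =>
        {
            result = 42;  // stand-in for chunk.CreateShape()
            done = true;  // publish completion last
        })
        {
            Priority = ThreadPriority.BelowNormal
        };
        thread.Start();

        // Equivalent of the coroutine's `while (!done) yield return ...;` poll
        while (!done)
            Thread.Yield();

        return result;    // safe to read: written before `done` was set
    }
}
```

Because the `volatile` write to `done` happens after `result` is written, and the poll reads `done` before `result`, the result is guaranteed to be visible once the loop exits.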

Here is the code from the Chunk class:

public class Chunk : MonoBehaviour
{
    public int size, seed;
    public int[,] chunkData;
    public Vector3 pos;

    public FastNoise noiseBase, noiseCraters, noiseColor;

    private List<Vector3> vertices = new List<Vector3>();
    private List<int> triangles = new List<int>();

    MeshFilter meshFilter;
    MeshCollider meshCollider;
    MeshRenderer meshRenderer;
    Mesh mesh;

    private void Awake()
    {
        meshFilter = GetComponent<MeshFilter>();
        meshCollider = GetComponent<MeshCollider>();
        meshRenderer = GetComponent<MeshRenderer>();
        mesh = new Mesh();
    }

    public void CreateShape()
    {
        vertices.Clear();
        triangles.Clear();

        for (int x = 0; x < size + 1; x++)
        {
            for (int z = 0; z < size + 1; z++)
            {
                float ncx = (x + pos.x + seed) / size;
                float ncz = (z + pos.z + seed) / size;
                chunkData[x, z] = Mathf.RoundToInt(noiseBase.GetNoise(ncx, 0, ncz) * 90);
                chunkData[x, z] = -Mathf.Max(chunkData[x, z], Mathf.RoundToInt(noiseCraters.GetNoise(ncx, 0, ncz) * -50) + 30);
            }
        }

        for (int x = 0; x < size; x++)
        {
            for (int z = 0; z < size; z++)
            {
                int index = x * size + z;

                vertices.Add(new Vector3(x, chunkData[x, z], z));
                vertices.Add(new Vector3(x, chunkData[x, z], 1 + z));
                vertices.Add(new Vector3(1 + x, chunkData[x, z], z));
                vertices.Add(new Vector3(1 + x, chunkData[x, z], 1 + z));

                triangles.Add(0 + (index * 4));
                triangles.Add(1 + (index * 4));
                triangles.Add(2 + (index * 4));
                triangles.Add(2 + (index * 4));
                triangles.Add(1 + (index * 4));
                triangles.Add(3 + (index * 4));
            }
        }
    }

    public void UpdateMesh()
    {
        mesh.Clear();

        mesh.vertices = vertices.ToArray();
        mesh.triangles = triangles.ToArray();

        mesh.Optimize();

        mesh.RecalculateNormals();

        meshFilter.mesh = mesh;
        meshCollider.sharedMesh = mesh;
        meshRenderer.enabled = true;
    }
}

When I build the game, memory usage keeps increasing as I walk around and new chunks are generated.

Here are the results from the profiler: I notice that some variables have increased in size, but I can't find out where.

Here is a screenshot from the memory profiler (Imgur link). The first snapshot is taken at the start, without moving; the second is taken after 3 minutes of chunk generation.

Does anyone have an idea where this memory usage comes from?

Thank you.

Do you have an actual memory leak, or are you just generating a lot of garbage? In pure managed C# it's practically impossible to produce a classic memory leak: the garbage collector reclaims anything that is no longer referenced. What you can leak are engine-tracked objects, so maybe you create things like Meshes or Materials and don't destroy them once they are no longer used?

Apart from that, you should get rid of this:

yield return 0;

and replace it with

yield return null;

Your line allocates garbage on every yield, while the second does not. An IEnumerator "enumerates" object values, so yielding a value type like an int means that value has to be boxed (allocated on the heap) in order to be yielded as an object. Unity's coroutine scheduler treats any yielded value that has no special meaning as "wait one frame", so whenever you just want to wait one frame you should yield null, which allocates nothing.
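To see the boxing directly, here is a plain C# sketch with no Unity APIs (the names are illustrative): an iterator that yields an int exposes a freshly boxed Int32 through IEnumerator.Current on each step, while one that yields null exposes nothing:

```csharp
using System.Collections;

class YieldBoxingDemo
{
    // Yields the int 0, as in the original coroutine. Because
    // IEnumerator.Current is typed object, each yielded 0 is boxed
    // into a fresh heap object.
    public static IEnumerator YieldZero()
    {
        yield return 0;
    }

    // Yields null: nothing is allocated per yield.
    public static IEnumerator YieldNull()
    {
        yield return null;
    }

    // Reports the runtime type of the first yielded value.
    public static string CurrentTypeName(IEnumerator e)
    {
        e.MoveNext();
        return e.Current == null ? "null" : e.Current.GetType().Name;
    }
}
```

`CurrentTypeName(YieldBoxingDemo.YieldZero())` reports "Int32" — the 0 arrived as a heap object — while the null version reports "null".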

We can’t really tell if there are other issues since the code you posted doesn’t really contain much. As I said, maybe you don’t clean up tracked UnityEngine.Object derived objects or if you use some kind of pooling / caching system, that may be flawed. The Profiler should give you some insight where memory is allocated.