Problems optimizing vertex attributes with the Advanced Mesh API

The idea is to optimize mesh data at runtime using the Advanced Mesh API, but I’m running into issues when I try it. The mesh ends up distorted, and I’m getting this error:

SkinnedMeshRenderer: Mesh has been changed to one which is not compatible with the expected mesh data size and vertex stride. Aborting rendering

This is what I’m doing:

  • Clone the original mesh
  • Get its VertexAttributes by using GetVertexAttributes()
  • Loop through each existing attribute
  • If it is Position, Normal, Tangent, or BlendWeight, I set its format to Float16 and the dimension to 4.
  • If it is TexCoord0, I set its format to Float16 and dimension to 2
  • If it is BlendIndices, I change its format to UInt16
  • Apply the new layout using mesh.SetVertexBufferParams()
  • Set the mesh to the Skinned Mesh Renderer.

If I remove the code that modifies the Position, Normal, and Tangent attributes, I don’t get the distortion, and the other attributes get optimized. So I think I’m doing something wrong with those…

The mesh looks like it was successfully optimized.

But in the scene, I get the error I quoted at the beginning, and the mesh is distorted:

And, here is the code:

    private Mesh Optimize(Mesh original)
    {
        var mesh = Instantiate(original);
        var attributes = mesh.GetVertexAttributes();

        for (int i = 0; i < attributes.Length; i++)
        {
            var attribute = attributes[i];

            if (attribute.attribute == VertexAttribute.Position
                || attribute.attribute == VertexAttribute.Normal
                || attribute.attribute == VertexAttribute.Tangent
                || attribute.attribute == VertexAttribute.BlendWeight)
            {
                attribute.format = VertexAttributeFormat.Float16;
                attribute.dimension = 4;
            }

            if (attribute.attribute == VertexAttribute.TexCoord0)
            {
                attribute.format = VertexAttributeFormat.Float16;
                attribute.dimension = 2;
            }

            if (attribute.attribute == VertexAttribute.BlendIndices)
            {
                attribute.format = VertexAttributeFormat.UInt16;
            }

            // Save the modified attribute back into the array
            attributes[i] = attribute;
        }

        mesh.SetVertexBufferParams(mesh.vertexCount, attributes);

        return mesh;
    }

Btw, I also tried using mesh.SetVertexBufferData(), but it usually doesn’t change anything, and sometimes I get more errors with it.

Thanks for reading all these!

@MartinTilo
@richardkettlewell

Y’all are always really great at helping out, and I appreciate it. This has been an issue we’ve been trying to solve off and on for a couple of months now. Can you help, or do you know who can?

Just changing the VertexAttributeDescriptors without converting the underlying vertex data to the format you’re specifying will cause the GPU to interpret the old vertex data with your new (and mismatched) descriptors.

You’ll have to define your new vertex format:

    [System.Runtime.InteropServices.StructLayout(System.Runtime.InteropServices.LayoutKind.Sequential)]
    public struct CustomVertex
    {
      public Half posX, posY, posZ, posW;
      public Half normX, normY, normZ, normW;
      public Half tanX, tanY, tanZ, tanW;
      public Half u, v;
    };

Half needs to be a struct that implements an IEEE-754 16-bit half-float value (see Small Float Formats - OpenGL Wiki). .NET 5 has native support for it as System.Half, but there are open-source implementations available.

Then you need to create an array of them, set each attribute to the correct value, and finally upload the array using Unity - Scripting API: Mesh.SetVertexBufferData.
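A minimal sketch of that step, assuming the CustomVertex struct above and a hypothetical FloatToHalf helper provided by whatever Half implementation you pick:

```csharp
// Sketch only: CustomVertex is the struct defined above; FloatToHalf is a
// placeholder for your Half implementation's float -> Half conversion.
var verts = new CustomVertex[mesh.vertexCount];

// Cache the arrays once; mesh.vertices etc. allocate a copy on every access.
var positions = mesh.vertices;
var normals = mesh.normals;
var tangents = mesh.tangents;
var uvs = mesh.uv;

for (int i = 0; i < verts.Length; i++)
{
    verts[i].posX = FloatToHalf(positions[i].x);
    verts[i].posY = FloatToHalf(positions[i].y);
    verts[i].posZ = FloatToHalf(positions[i].z);
    verts[i].posW = FloatToHalf(1f);
    // ...same pattern for normals, tangents, and UVs...
}

// Stream 0, starting at element 0, covering every vertex.
mesh.SetVertexBufferData(verts, 0, 0, verts.Length, 0);
```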

Thank you so much for your help and time @mabulous .

So, I tried a few things, but I couldn’t make it work…
First, I tried the Unity.Mathematics half type, but it didn’t work. So I searched for an open-source Half (we are using Unity 2020.3.19f, which doesn’t have .NET 5), which didn’t work either.

What I did was make the struct as you said, but I made three of them, because the stream for Position, Normal, and Tangent is 0, TexCoord0 is 1, and BlendWeight and BlendIndices is 2 (I got this from the vertex attributes).

    [System.Runtime.InteropServices.StructLayout(System.Runtime.InteropServices.LayoutKind.Sequential)]
    public struct VertexData0
    {
        public Half posX, posY, posZ, posW;
        public Half normX, normY, normZ, normW;
        public Half tanX, tanY, tanZ, tanW;

        public VertexData0(Mesh mesh, int index)
        {
            var vertex = mesh.vertices[index];
            posX = HalfHelper.SingleToHalf(vertex.x);
            posY = HalfHelper.SingleToHalf(vertex.y);
            posZ = HalfHelper.SingleToHalf(vertex.z);
            posW = HalfHelper.SingleToHalf(0f);

            var normal = mesh.normals[index];
            normX = HalfHelper.SingleToHalf(normal.x);
            normY = HalfHelper.SingleToHalf(normal.y);
            normZ = HalfHelper.SingleToHalf(normal.z);
            normW = HalfHelper.SingleToHalf(0f);

            var tangent = mesh.tangents[index];
            tanX = HalfHelper.SingleToHalf(tangent.x);
            tanY = HalfHelper.SingleToHalf(tangent.y);
            tanZ = HalfHelper.SingleToHalf(tangent.z);
            tanW = HalfHelper.SingleToHalf(tangent.w);
        }
    }

    [System.Runtime.InteropServices.StructLayout(System.Runtime.InteropServices.LayoutKind.Sequential)]
    public struct VertexData1
    {
        public Half uvX, uvY;

        public VertexData1(Mesh mesh, int index)
        {
            uvX = HalfHelper.SingleToHalf(mesh.uv[index].x);
            uvY = HalfHelper.SingleToHalf(mesh.uv[index].y);
        }
    }

    [System.Runtime.InteropServices.StructLayout(System.Runtime.InteropServices.LayoutKind.Sequential)]
    public struct VertexData2
    {
        public Half blendWeightX, blendWeightY, blendWeightZ, blendWeightW;
        public ushort blendIndex0, blendIndex1, blendIndex2, blendIndex3;

        public VertexData2(Mesh mesh, int index)
        {
            var boneData = mesh.boneWeights[index];
            blendWeightX = HalfHelper.SingleToHalf(boneData.weight0);
            blendWeightY = HalfHelper.SingleToHalf(boneData.weight1);
            blendWeightZ = HalfHelper.SingleToHalf(boneData.weight2);
            blendWeightW = HalfHelper.SingleToHalf(boneData.weight3);

            blendIndex0 = (ushort) boneData.boneIndex0;
            blendIndex1 = (ushort) boneData.boneIndex1;
            blendIndex2 = (ushort) boneData.boneIndex2;
            blendIndex3 = (ushort) boneData.boneIndex3;
        }
    }

And using them like this after mesh.SetVertexBufferParams:

    // mesh.vertexCount avoids the full-array copy that mesh.vertices makes
    var vertices0 = new VertexData0[mesh.vertexCount];
    for (int i = 0; i < vertices0.Length; i++) {
        vertices0[i] = new VertexData0(mesh, i);
    }
    mesh.SetVertexBufferData(vertices0, 0, 0, vertices0.Length, 0);

    var vertices1 = new VertexData1[mesh.vertexCount];
    for (int i = 0; i < vertices1.Length; i++) {
        vertices1[i] = new VertexData1(mesh, i);
    }
    mesh.SetVertexBufferData(vertices1, 0, 0, vertices1.Length, 1);

    var vertices2 = new VertexData2[mesh.vertexCount];
    for (int i = 0; i < vertices2.Length; i++) {
        vertices2[i] = new VertexData2(mesh, i);
    }
    mesh.SetVertexBufferData(vertices2, 0, 0, vertices2.Length, 2);

But I’m getting exactly the same results as in my first post. The same deformation and the same error:
SkinnedMeshRenderer: Mesh has been changed to one which is not compatible with the expected mesh data size and vertex stride. Aborting rendering.

Again, thank you for all your help. I would like to know what I’m doing wrong.

First: cool, I wasn’t aware Unity had a half-float implementation in its public namespace. I’m sure that one should work fine.

Second: if you set the W component of the position to 0.0 instead of 1.0 and you write custom shaders, make sure they read position as float3 and not as float4 (otherwise it would be treated as a homogeneous direction rather than a homogeneous position). With default shaders this shouldn’t be an issue, though.

Then, rather than relying on code that ‘patches’ the current vertex descriptors, define them as a completely new layout, so you can be sure it matches your custom vertex attributes (mainly, the correct order is not ensured in your current implementation).

Otherwise your code looks good, so I suspect your patched vertex description doesn’t fully match your data; better to create a new one instead of patching the existing one.

Not fully sure whether it’s required, but you might also need to add the following after setting everything else:

        SubMeshDescriptor mainMeshDescriptor = new SubMeshDescriptor();
        mainMeshDescriptor.baseVertex = 0;
        mainMeshDescriptor.bounds = meshBounds;
        mainMeshDescriptor.firstVertex = 0;
        mainMeshDescriptor.vertexCount = vertexCount;
        mainMeshDescriptor.indexStart = 0;
        mainMeshDescriptor.indexCount = indexCount;
        mainMeshDescriptor.topology = MeshTopology.Triangles;
        mesh.SetSubMesh(0, mainMeshDescriptor, MeshUpdateFlags.DontRecalculateBounds);
        mesh.UploadMeshData(false);

On a separate note, for your normal and tangent components you should use VertexAttributeFormat.SNorm16 instead of Float16, since you’ll get more precision in the [-1,1] range at the same component size than you get from Float16. (On your custom vertex side, make sure your normal and tangent vectors are normalized, then convert the float components to short by multiplying by 32767 and casting to short.)

For the UV components, if all your UV coordinates are constrained to the [0,1]x[0,1] square, you’re better off using VertexAttributeFormat.UNorm16 instead of Float16 for the same reason (multiply your float UVs by 65535 and cast to ushort).

Blend weights, afaik, are also always between 0.0 and 1.0, so I bet you’ll even get away with VertexAttributeFormat.UNorm8 (just make sure when converting that the original float values really are clamped to [0.0,1.0], otherwise weird things will happen).

And since your mesh has fewer than 256 bones, you should be able to use UInt8 instead of UInt16 for the bone indices.
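A hedged sketch of the packing rules described above (the helper names are mine, not Unity API; the clamping mirrors the caveats mentioned):

```csharp
// Packs a float in [-1, 1] into a 16-bit signed normalized value (SNorm16).
static short PackSNorm16(float v)
{
    return (short) Mathf.RoundToInt(Mathf.Clamp(v, -1f, 1f) * 32767f);
}

// Packs a float in [0, 1] into a 16-bit unsigned normalized value (UNorm16).
static ushort PackUNorm16(float v)
{
    return (ushort) Mathf.RoundToInt(Mathf.Clamp01(v) * 65535f);
}

// Packs a float in [0, 1] into an 8-bit unsigned normalized value (UNorm8).
static byte PackUNorm8(float v)
{
    return (byte) Mathf.RoundToInt(Mathf.Clamp01(v) * 255f);
}
```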

Again, thank you so much for your time, @mabulous ; I appreciate it.

So, I tried what you suggested, and while it improves things, there are still problems.
I’m using a simple layout first to get it working; I can experiment with different formats later.

First, I created an empty mesh and set the VertexAttributes.

            //Create Mesh base
            var mesh = new Mesh();
            mesh.subMeshCount = 1;
          
          
            //Set VertexAttribute Layout
            var layout = new[]
            {
                new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float16, 4, 0),
                new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float16, 4, 0),
                new VertexAttributeDescriptor(VertexAttribute.Tangent, VertexAttributeFormat.Float16, 4, 0),
                new VertexAttributeDescriptor(VertexAttribute.TexCoord0, VertexAttributeFormat.Float16, 2, 1),
                new VertexAttributeDescriptor(VertexAttribute.BlendWeight, VertexAttributeFormat.Float16, 4, 2),
                new VertexAttributeDescriptor(VertexAttribute.BlendIndices, VertexAttributeFormat.UInt16, 4, 2),
            };
          
            mesh.SetVertexBufferParams(original.vertexCount, layout);

I set the Buffer Data exactly as before.

If I stop here, I get this result:

There is no preview, no triangles, and no bones.
So, next, I use SetIndexBufferParams and SetIndexBufferData.

            mesh.SetIndexBufferParams((int) original.GetIndexCount(0), IndexFormat.UInt16);

            // Cache the indices once; GetIndices() allocates a new array on every call.
            var indices = original.GetIndices(0);
            var indexBuffer = new NativeArray<ushort>(indices.Length, Allocator.Temp);
            for (var i = 0; i < indices.Length; i++)
                indexBuffer[i] = (ushort) indices[i];

            mesh.SetIndexBufferData(indexBuffer, 0, 0, indices.Length);

And, I have to also set the SubMeshDescriptor:

            //Create submesh
            var originalSubMesh = original.GetSubMesh(0);
            SubMeshDescriptor mainMeshDescriptor = new SubMeshDescriptor();
            mainMeshDescriptor.baseVertex = originalSubMesh.baseVertex;
            mainMeshDescriptor.bounds = originalSubMesh.bounds;
            mainMeshDescriptor.firstVertex = originalSubMesh.firstVertex;
            mainMeshDescriptor.vertexCount = originalSubMesh.vertexCount;
            mainMeshDescriptor.indexStart = originalSubMesh.indexStart;
            mainMeshDescriptor.indexCount = originalSubMesh.indexCount;
            mainMeshDescriptor.topology = MeshTopology.Triangles;
            mesh.SetSubMesh(0, mainMeshDescriptor, MeshUpdateFlags.DontRecalculateBounds);
            mesh.UploadMeshData(true);

And I got this result:


No preview, but the triangles are OK. Sadly, still no bones.
And surprisingly, Position, Normal, and Tangent show up as Float32 instead of Float16.

Having no bones also gives me this result:

The model is rotated, and there’s an error in the Inspector saying that the bones are missing.

So, why are Position, Normal, and Tangent Float32? Well, after some testing, it seems that removing this line fixes it:
mesh.UploadMeshData(false);


Still, the bones problem is there, and the error comes back:
SkinnedMeshRenderer: Mesh has been changed to one which is not compatible with the expected mesh data size and vertex stride. Aborting rendering.
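One Unity detail that may be relevant here (an assumption on my part, not something confirmed in this thread): a brand-new Mesh carries no skinning setup at all, so besides the vertex data, the bindposes (and possibly the bone weights) have to be copied over explicitly. A minimal sketch, assuming original is still in scope:

```csharp
// Sketch: copy the skinning data a freshly created Mesh does not inherit.
// Without bindposes, the SkinnedMeshRenderer cannot map its bones to the mesh.
mesh.bindposes = original.bindposes;

// Unity 2019.1+ exposes per-vertex bone weights through NativeArrays; whether
// this coexists cleanly with BlendWeight/BlendIndices written straight into
// the vertex buffer is exactly the open question in this thread.
var bonesPerVertex = original.GetBonesPerVertex();
var boneWeights = original.GetAllBoneWeights();
mesh.SetBoneWeights(bonesPerVertex, boneWeights);
```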

Really weird…
Thank you so much for your time.

I just checked some of my code, and here’s code that does something very similar to what you’re doing (I didn’t deal with bones, though) and that works for me. (Note that I’m not creating a new mesh; instead I call mesh.Clear() on the original mesh. Not sure whether that makes a difference.)

      mesh.Clear();
      VertexAttributeDescriptor[] layout = new[]
      {
        new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.SNorm16, 2, 0),
        new VertexAttributeDescriptor(VertexAttribute.TexCoord0, VertexAttributeFormat.UInt8, 4, 0),
        new VertexAttributeDescriptor(VertexAttribute.TexCoord1, VertexAttributeFormat.UNorm16, 2, 1)
      };
      mesh.SetIndexBufferParams(indices.Length, UnityEngine.Rendering.IndexFormat.UInt32);  // <-- I don't recall why, but I think for some reasons I had to do this before setting the vertex buffer params rather than after it.

      mesh.SetVertexBufferParams(compressedVertices.Count, layout);
      mesh.SetVertexBufferData(compressedVertices.ToArray(), 0, 0, compressedVertices.Count, 0, MeshUpdateFlags.DontValidateIndices);
      mesh.SetVertexBufferData(compressedUVs.ToArray(), 0, 0, compressedUVs.Count, 1, MeshUpdateFlags.DontValidateIndices);

      // Set proper index format based on number of vertices added.
      if (compressedVertices.Count > 65535)
      {
        mesh.SetIndexBufferData(indices, 0, 0, indices.Length, MeshUpdateFlags.DontValidateIndices);
      }
      else
      {
        mesh.SetIndexBufferParams(indices.Length, UnityEngine.Rendering.IndexFormat.UInt16);
        ushort[] indices16 = new ushort[indices.Length];
        for (int i = 0; i < indices.Length; i++)
        {
          indices16[i] = (ushort)indices[i];
        }
        mesh.SetIndexBufferData(indices16, 0, 0, indices.Length, MeshUpdateFlags.DontValidateIndices);
      }

      mesh.subMeshCount = 1;
      SubMeshDescriptor subMeshDescriptor = new SubMeshDescriptor();
      subMeshDescriptor.baseVertex = 0;
      subMeshDescriptor.bounds = meshBounds;
      subMeshDescriptor.indexCount = indices.Length;
      subMeshDescriptor.indexStart = 0;
      subMeshDescriptor.topology = MeshTopology.Triangles;

      mesh.SetSubMesh(0, subMeshDescriptor, MeshUpdateFlags.DontRecalculateBounds);

      mesh.name = mesh.name + "_compressed";
      mesh.bounds = meshBounds;
      mesh.UploadMeshData(false);

Thanks again!
I tried a few things based on your code, but the bones still aren’t correctly set.

Hopefully, someone else can give us some insight into this issue.

You should monitor this thread and maybe talk directly with this guy; I think he might be struggling with the same thing you are: assigning bone weights and indices directly to a vertex buffer.

Also, did you try whether things work if you only compress position, normal, tangent, and UV, but leave bone indices and weights in the original format?

Thank you once again!! :slight_smile:

I tried what you suggested, compressing only position, normal, tangent, and UV, and leaving the bone data in the original format. But the result is exactly the same, as if the bone data is not correctly set.

I’ll check that thread. Thanks!!

Been working on this too, thanks for sharing ideas and code.

#region MeshFormatUtils

    const MeshUpdateFlags DontUpdate = MeshUpdateFlags.DontRecalculateBounds | MeshUpdateFlags.DontResetBoneBounds | MeshUpdateFlags.DontValidateIndices;
    const MeshUpdateFlags DontUpdateOrNotify = DontUpdate | MeshUpdateFlags.DontNotifyMeshUsers;
    public static int GetIndexCount(Mesh.MeshData data)
    {
        int count = 0;
        for (int i = 0; i < data.subMeshCount; i++)
            count += data.GetSubMesh(i).indexCount;
        return count;
    }

    public static void CopyMeshData(Mesh.MeshDataArray sourceArray, Mesh.MeshDataArray destinationArray, VertexAttributeDescriptor[] attributes)
    {
        for (int i = 0; i < sourceArray.Length; i++)
        {
            Mesh.MeshData source = sourceArray[i];
            Mesh.MeshData output = destinationArray[i];

            VertexAttributeDescriptor[] newAttributes = new VertexAttributeDescriptor[attributes.Length];

            for (int index = 0; index < attributes.Length; index++) {
                var attribute = attributes[index];
                VertexAttributeDescriptor vertexAttributeDescriptor = attribute;
                //Debug.Log (vertexAttributeDescriptor.dimension);
                if (vertexAttributeDescriptor.attribute == VertexAttribute.Position || vertexAttributeDescriptor.attribute == VertexAttribute.Normal) {
                    vertexAttributeDescriptor.format = VertexAttributeFormat.Float16;
                    vertexAttributeDescriptor.dimension = 4;
                }
                newAttributes[index] = vertexAttributeDescriptor;
            }

            output.SetVertexBufferParams(source.vertexCount, newAttributes);
            output.SetIndexBufferParams(GetIndexCount(source), source.indexFormat);

            for (int s = 0; s < source.vertexBufferCount; s++) {
                /*NativeArray<float3> vertices = source.GetVertexData<float3> (s);
                output.GetVertexData<byte> (s).CopyFrom (vertices.Reinterpret<byte> (12));*/
               
                NativeArray<float3> vertices = source.GetVertexData<float3> (s);
                NativeArray<half4> compressedVertices = GetCompressedVertices (vertices);

                //Debug.Log ("expected " + output.GetVertexData<byte>(s).Length);
                //Debug.Log ("actual " +compressedVertices.Reinterpret<byte>(8).Length);
                output.GetVertexData<byte>(s).CopyFrom(compressedVertices.Reinterpret<byte>(8));
            }
               
            output.GetIndexData<byte>().CopyFrom(source.GetIndexData<byte>());
            output.subMeshCount = source.subMeshCount;
            for (int m = 0; m < source.subMeshCount; m++)
                output.SetSubMesh(m, source.GetSubMesh(m), DontUpdateOrNotify);
        }
    }

    private static NativeArray<half4> GetCompressedVertices (NativeArray<float3> vertices) {
        NativeArray<half4> tmp = new NativeArray<half4> (vertices.Length, Allocator.Temp);
        for (int index = 0; index < vertices.Length; index++) {
            tmp[index] = ToHalf4(vertices[index]);
        }
        return tmp;
    }

    public static void CopyTo(Mesh source, Mesh destination) {
        using var sourceArray = Mesh.AcquireReadOnlyMeshData(source);
        Mesh.MeshDataArray destinationArray = Mesh.AllocateWritableMeshData(1);
        CopyMeshData(sourceArray, destinationArray, source.GetVertexAttributes());
        Mesh.ApplyAndDisposeWritableMeshData(destinationArray, destination, DontUpdate);
    }

    public static half4 ToHalf4 (float3 h) {
        return new half4 ((half) h.x, (half) h.y, (half) h.z, half.zero);
    }
#endregion

Editor’s note: this code is highly unoptimized, especially memory-wise. Tested on a 6-million-vertex scene.

Using the CopyTo function on object meshes to replace them works (my meshes only have positions and normals in Float32, converted to Float16). There are no visible changes, so that’s good.

The thing is, it does nothing performance-wise. Worse, recalculating bounds afterwards (static batching utility, or raycasting using bounds) doesn’t work due to the mesh vertex format (it expects Float32).

Same behaviour happens on macOS and android.

I wonder if this optimization lead is a dead end, due to Unity relying on Float32 everywhere and optimizing based on that assumption?


So you are seeing lower memory usage if you click on your mesh and look in the inspector?

Also, the major improvement/point of all this is improved memory usage. There will also possibly be a performance improvement on the GPU due to decreased memory pressure. But that is more dependent on a number of variables.

Before :
8003696--1029359--upload_2022-3-29_16-13-38.png
After :
8003696--1029365--upload_2022-3-29_16-14-43.png

Yeah, I do see an improvement: Float32 × 3 → Float16 × 4 (× 2 because I have positions and normals).
Before, with 889 vertices: 12 × 2 × 889 = 21336 bytes.
After, with 889 vertices: 8 × 2 × 889 = 14224 bytes.

Note that Float16 × 3 is not accepted by Unity 2020.3, so the added fourth Float16 component does nothing.

Performance testing was done using a homemade FPS meter, and it doesn’t change anything.
I could have used Snapdragon Profiler or Arm Mobile Studio to look into GPU stress, but if the FPS doesn’t improve while I’m GPU- and vertex-bound, that means it probably didn’t change anything GPU-wise.

“Note that Float16 × 3 is not accepted by Unity 2020.3, so the added Float16 does nothing.” What do you mean?

It throws an error if you set the vertex attribute descriptor dimension to 3 (which you would expect to work, since a position is a float3):
ArgumentException: Invalid vertex attribute format+dimension value (Float16 x 3, data size must be multiple of 4)

Oh, that’s just because GPUs often expect data to be X-byte aligned, where X is usually 4 bytes.
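Concretely (a sketch of the constraint, not new API): Float16 × 3 is 6 bytes per attribute, which is not a multiple of 4, so Unity rejects it; padding to four components makes it 8 bytes and passes validation:

```csharp
// Rejected: Float16 x 3 is 6 bytes per vertex, not a multiple of 4.
// new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float16, 3, 0);

// Accepted: Float16 x 4 is 8 bytes; the fourth component is effectively padding.
var position = new VertexAttributeDescriptor(
    VertexAttribute.Position, VertexAttributeFormat.Float16, 4, 0);
```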


Bump.
Having the same issue.
It seems to be a bug with SkinnedMeshRenderer.

Hey, sorry to revive this thread after such a long time. I’ve run into the same problem as you: when I invoke mesh.UploadMeshData(false); or mesh.UploadMeshData(true);, the rendering is OK but the mesh vertex format is Float32 instead of Float16. However, when I remove mesh.UploadMeshData(); like you did, the rendering is abnormal but the vertex format is Float16. This is really weird and troubles me. Have you solved this, or do you have any information about it? Thank you!