SetVertexBufferParams and separate arrays for Vertices, Normals, UVs, etc.

I have my mesh laid out in memory like so:

Vector3[] Positions;
Vector3[] Normals;
Vector2[] UVs;

And indices, obviously.
Can I somehow feed this data to a Mesh class and render my mesh WITHOUT reorganizing data in memory?

Documentation suggests I MUST create an array of structs, each holding
Vector3 Position;
Vector3 Normals;
Vector2 UVs;

Which would mean copying data in memory, which I am trying to avoid.
Is there another way?

You could use the legacy API and assign them to Mesh.vertices, Mesh.normals and Mesh.uv.

However, the data will most likely be copied internally, because all graphics APIs expect the data to be an array of structs. You can get around that by using multiple vertex buffers internally (that’s how UE does it with its VertexFactory), but as far as I am aware Unity uses the classic approach with one vertex buffer per mesh.
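For that legacy path, here is a minimal sketch (assuming the Vector3[]/Vector2[] arrays from the question plus an int[] triangle index list):

```csharp
using UnityEngine;

// Minimal sketch of the legacy Mesh API path. Unity copies and
// interleaves these arrays internally; nothing here avoids the copy.
Mesh BuildMeshLegacy(Vector3[] positions, Vector3[] normals, Vector2[] uvs, int[] indices)
{
    var mesh = new Mesh();
    mesh.vertices  = positions;
    mesh.normals   = normals;
    mesh.uv        = uvs;
    mesh.triangles = indices; // assumes a triangle list
    mesh.RecalculateBounds();
    return mesh;
}
```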

@c0d3_m0nk3y this is very important to me: so are you sure that all graphics APIs expect an array of structs and there is no other way? Where can I read more about it AND about the approach UE uses?

I didn’t say there was no other way. You can bind multiple vertex buffers to simulate SoA layouts if you have full control of the engine. There is a way to interleave multiple vertex buffers; it’s usually used for instancing: one vertex buffer holds the instance positions and the other holds the vertex positions. Normally you’d configure it so that the instance buffer advances by one element per instance, but you can just step through both buffers at the same rate to simulate SoA.

See InstanceDataStepRate and InputSlot

(Disclaimer: I haven’t tried this personally)

This is a PIX capture from Unreal Engine. As you can see, it is binding 4 vertex buffers.
[PIX capture: four vertex buffers bound at the input assembler stage]

You’ll find a lot of debate about whether that actually increases performance, but it does allow you to skip certain attributes when you don’t need them (like in a depth pre-pass).

You can find UE’s VertexFactory implementation here (you have to sign up for GitHub access first):
https://github.com/EpicGames/UnrealEngine/blob/463443057fb97f1af0d2951705324ce8818d2a55/Engine/Source/Runtime/RenderCore/Private/VertexFactory.cpp#L22



@c0d3_m0nk3y thanks for a comprehensive reply, it was very helpful.


Each vertex buffer, or stream, is interleaved with the attributes assigned to that stream on the mesh object. That’s why it makes sense to use a struct: the layout is the same. But you could manually write the vertex buffer byte by byte if you know the right stride.

Using the simple Mesh API (.vertices, .normals, etc.) automatically writes the interleaved data for you. The advanced Mesh API and the MeshData API let you write to those buffers directly, and if you know your stride, you can write directly to the correct offsets. If you really want each of those attributes to be contiguous, you can assign them to separate streams using MeshData.

But if the original question is about getting data that is already in a certain layout into a mesh, why isn’t it already written and, presumably, serialized as a mesh object?

@kenamis because we are receiving data from a different program in the form of a number of separate arrays in unmanaged memory. Thanks for suggesting MeshData! Does it work with NativeArrays?

At runtime though? Yes, the MeshData API works with NativeArrays:
https://docs.unity3d.com/2020.1/Documentation/ScriptReference/Mesh.MeshData.GetVertexData.html
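To make the separate-streams idea concrete, here is a hedged sketch using the writable MeshData API (assuming Unity 2020.1+ and that `positions`/`normals` are NativeArray views over the unmanaged memory, e.g. created with NativeArrayUnsafeUtility.ConvertExistingDataToNativeArray):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch: write SoA data into separate vertex streams via the MeshData API.
Mesh BuildMeshSoA(NativeArray<Vector3> positions, NativeArray<Vector3> normals, NativeArray<int> indices)
{
    Mesh.MeshDataArray dataArray = Mesh.AllocateWritableMeshData(1);
    Mesh.MeshData data = dataArray[0];

    // Declare all attributes in one call; each attribute gets its own stream.
    data.SetVertexBufferParams(positions.Length,
        new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, stream: 0),
        new VertexAttributeDescriptor(VertexAttribute.Normal,   VertexAttributeFormat.Float32, 3, stream: 1));

    // Each stream is contiguous, so the copies are straight blits.
    data.GetVertexData<Vector3>(stream: 0).CopyFrom(positions);
    data.GetVertexData<Vector3>(stream: 1).CopyFrom(normals);

    data.SetIndexBufferParams(indices.Length, IndexFormat.UInt32);
    data.GetIndexData<int>().CopyFrom(indices);

    data.subMeshCount = 1;
    data.SetSubMesh(0, new SubMeshDescriptor(0, indices.Length));

    var mesh = new Mesh();
    Mesh.ApplyAndDisposeWritableMeshData(dataArray, mesh);
    mesh.RecalculateBounds();
    return mesh;
}
```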

@kenamis hey, I saw the ability to set the stream index, but I can’t quite find a way to tell Unity which stream should be used for normals and which for positions, for instance. Here is how I do it:

if (meshToCreate.VerticesLength > 0)
{
    NativeArray<Vector4> vertices = ParseData<Vector4>((void*)meshToCreate.Vertices, meshToCreate.VerticesLength);

    MeshToRender.SetVertexBufferParams(vertices.Length, new VertexAttributeDescriptor(VertexAttribute.Position,
        VertexAttributeFormat.Float32, 4, stream: 0));

    MeshToRender.SetVertexBufferData(vertices, 0, 0, vertices.Length, stream: 0,
        MeshUpdateFlags.DontValidateIndices);
}

if (meshToCreate.NormalsLength > 0)
{
    NativeArray<Vector4> normals = ParseData<Vector4>((void*)meshToCreate.Normals, meshToCreate.NormalsLength);

    MeshToRender.SetVertexBufferParams(normals.Length, new VertexAttributeDescriptor(VertexAttribute.Normal,
        VertexAttributeFormat.Float32, 4, stream: 1));

    MeshToRender.SetVertexBufferData(normals, 0, 0, normals.Length, stream: 1,
        MeshUpdateFlags.DontValidateIndices);
}

Sadly with this code the mesh won’t render at all.

IIRC you need to declare all the attributes in one function call (each SetVertexBufferParams call replaces the previous layout). For example:

MeshToRender.SetVertexBufferParams(vertices.Length,
  new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 4, stream: 0),
  new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 4, stream:1));
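Putting the fix together, a hedged sketch of the full sequence, reusing the (assumed) names from the snippet above and its Vector4-per-attribute layout:

```csharp
// Declare both streams in ONE SetVertexBufferParams call,
// then upload each stream's data separately.
MeshToRender.SetVertexBufferParams(vertices.Length,
    new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 4, stream: 0),
    new VertexAttributeDescriptor(VertexAttribute.Normal,   VertexAttributeFormat.Float32, 4, stream: 1));

MeshToRender.SetVertexBufferData(vertices, 0, 0, vertices.Length, stream: 0);
MeshToRender.SetVertexBufferData(normals,  0, 0, normals.Length,  stream: 1);
```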

@kdchabuk wow that actually worked! Thanks!
