RenderMeshSystemV2 not working

Hello,

I updated to 2019.3 and am having problems getting my mesh to render. I can see the entity in RenderMeshSystemV2, but it just isn't rendering. Any ideas? Even when I scale my mesh and toggle receive shadows, I can see the components update.


What material / shader / render pipeline?

I'm using:
Unity 2019.3
Hybrid Renderer 0.3.1-preview.10
Entities 0.4.0-preview.10

I have an empty GameObject with a Material and a Mesh component.

My conversion code, inside a GameObjectConversionSystem:

...

Entities.ForEach((PlanetMeshProxy data) =>
{

    Entity ePrimary = GetPrimaryEntity(data);
 
    DstEntityManager.AddComponentData(ePrimary, new TagHexMeshComponent { });

    var renderMesh = new RenderMesh
    {
        mesh = data.mesh,
        material = data.material
    };
    DstEntityManager.AddSharedComponentData(ePrimary, renderMesh);

...

And a system that procedurally creates a mesh and calls SetSharedComponentData:

     protected override JobHandle OnUpdate(JobHandle inputDeps)
        {
            // TODO: job scheduling! Currently everything runs single-threaded with lots of sync points
            //
            // Get Planet Data 
            var subdivisions = new NativeArray<int>(1, Allocator.TempJob);
            var radius = new NativeArray<float>(1, Allocator.TempJob);
            var vertexCount = new NativeArray<int>(1, Allocator.TempJob);
            var triangleCount = new NativeArray<int>(1, Allocator.TempJob);

            inputDeps = new GetPlanetSizeData
            {
                SubdivisionCount = subdivisions,
                Radius = radius,
                VertexCount = vertexCount,
                TriangleCount = triangleCount
            }.ScheduleSingle(_qPlanetSize, inputDeps);

            inputDeps.Complete();

            //
            // Get Mesh Positions
            var meshTrianglePositions = new NativeArray<float3>(triangleCount[0] * 3, Allocator.TempJob);

            inputDeps = new GetMeshTrianglePositions
            {
                MeshTrianglePositions = meshTrianglePositions,
                TileBuffer = GetBufferFromEntity<TilesBufferComponent>(true)
            }.ScheduleSingle(_qTrianglesBuffer, inputDeps);

            //
            // Get Mesh Normals
            var meshTriangleNormals = new NativeArray<float3>(triangleCount[0] * 3, Allocator.TempJob);

            inputDeps = new GetMeshTriangleNormals
            {
                MeshTriangleNormals = meshTriangleNormals,
                TileBuffer = GetBufferFromEntity<TilesBufferComponent>(true)
            }.ScheduleSingle(_qTrianglesBuffer, inputDeps);

            //
            // Get Mesh Tangents
            var meshTriangleTangents = new NativeArray<float4>(triangleCount[0] * 3, Allocator.TempJob);

            inputDeps = new GetMeshTriangleTangent
            {
                MeshTriangleTangents = meshTriangleTangents,
                TileBuffer = GetBufferFromEntity<TilesBufferComponent>(true)
            }.ScheduleSingle(_qTrianglesBuffer, inputDeps);

            //
            // Get Mesh UVs
            var meshTriangleUVs = new NativeArray<float2>(triangleCount[0] * 3, Allocator.TempJob);

            inputDeps = new GetMeshTriangleUv
            {
                MeshTriangleUVs = meshTriangleUVs,
                TileBuffer = GetBufferFromEntity<TilesBufferComponent>(true)
            }.ScheduleSingle(_qTrianglesBuffer, inputDeps);

            //
            // Get Mesh Indices 
            var meshTriangleIndices = new NativeArray<int>(triangleCount[0] * 3, Allocator.TempJob);

            inputDeps = new GetMeshTriangleIndices
            {
                MeshTriangleIndex = meshTriangleIndices
            }.ScheduleSingle(_qTrianglesBuffer, inputDeps);


            //
            // Set 
            inputDeps.Complete();
            
            // TODO : Hack
            if (_qTrianglesBuffer.CalculateEntityCount() > 0)
            {
                // Warning: above 6 subdivisions we need 32-bit mesh indexing
                _hexMesh.indexFormat = IndexFormat.UInt32;
                _hexMesh.name = "HexSphere";
                _hexMesh.SetVertices(meshTrianglePositions);
                _hexMesh.SetNormals(meshTriangleNormals); 
                _hexMesh.SetTangents(meshTriangleTangents);
                _hexMesh.SetUVs(0, meshTriangleUVs );
                
                var meshTriangleColors = new NativeArray<Color32>(triangleCount[0] * 3, Allocator.TempJob);
                _hexMesh.SetColors(meshTriangleColors);
                meshTriangleColors.Dispose();
                
                
                EntityQuery qHexMesh = GetEntityQuery
                (
                    ComponentType.ReadOnly<TagHexMeshComponent>()
                );
                Entity eHexMesh = qHexMesh.GetSingletonEntity();
                var m = EntityManager.GetSharedComponentData<RenderMesh>(eHexMesh);
                EntityManager.SetSharedComponentData(eHexMesh, new RenderMesh
                {
                    material = m.material,
                    mesh = _hexMesh,
                    castShadows = m.castShadows,
                    receiveShadows = true  //m.receiveShadows
                });
            }

            //
            // Cleanup
            meshTrianglePositions.Dispose();
            meshTriangleNormals.Dispose();
            meshTriangleTangents.Dispose();
            meshTriangleUVs.Dispose();
            meshTriangleIndices.Dispose();
            subdivisions.Dispose();
            radius.Dispose();
            vertexCount.Dispose();
            triangleCount.Dispose();

            return inputDeps;
        }

It looks like the entity is in RenderMeshSystemV2, but nothing renders. Any ideas?

Wait, what is VertexAttributeDescriptor, and do I need it with the new Mesh API?

You only need to set that if you use SetVertexBufferData, which does not seem to be the case, looking at your example.

If you can post an example project of what you are trying to do I can have a look in the next few days.

I've made a mess trying to get this to work. Do you have a simple example that just creates a triangle? Maybe I can work my way back from that.

OK, I think I found the problem: Mesh.SetTriangles does not take a NativeArray … why is this?

You have to use SetIndexBufferData instead when setting data from native containers.
Note that before setting the index buffer data you must set its parameters using
SetIndexBufferParams.
If you're still unable to generate the mesh with it, I'll post sample code once I'm at a computer.
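As a minimal sketch of that call order (the class and method names here are made up for illustration):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.Rendering;

public static class IndexBufferSketch {
    // Sketch: SetIndexBufferParams must be called before SetIndexBufferData.
    public static void FillIndices(Mesh mesh, NativeArray<int> indices) {
        mesh.SetIndexBufferParams(indices.Length, IndexFormat.UInt32); // declare count and format first
        mesh.SetIndexBufferData(indices, 0, 0, indices.Length);        // then copy from the native array
    }
}
```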

Here is an example of how to create a mesh using the new Mesh API.
This example can be extended with more properties, but right now it only sets vertex positions and normals and has a single submesh.

using Unity.Collections;
using Unity.Mathematics;
using UnityEngine;
using UnityEngine.Rendering;

public class GenerateMeshUtility {

  public struct VertexData {
    public float3 Position;
    public float3 Normal;
  }

  public struct VertexIndexData {
    public int Value;
    public static implicit operator VertexIndexData(int v) { return new VertexIndexData { Value = v }; }
    public static implicit operator int(VertexIndexData v) { return v.Value; }
  }

  public static Mesh CreateMeshFrom(Mesh mesh) {
    // Cache these: mesh.vertices and mesh.normals allocate a new managed copy on every access.
    var vertices = mesh.vertices;
    var normals = mesh.normals;
    var vertexData = new NativeArray<VertexData>(mesh.vertexCount, Allocator.Temp);
    for (var i = 0; i < mesh.vertexCount; ++i)
      vertexData[i] = new VertexData {
        Position = vertices[i],
        Normal = normals[i]
      };

    var triangles = mesh.triangles;
    var vertexIndexData = new NativeArray<VertexIndexData>(triangles.Length, Allocator.Temp);
    for (var i = 0; i < triangles.Length; ++i)
      vertexIndexData[i] = triangles[i];

    return CreateMesh(vertexData, vertexIndexData);
  }

  public static Mesh CreateMesh(NativeArray<VertexData> vertexArray, NativeArray<VertexIndexData> vertexIndexArray) {
    var mesh = new Mesh();

    // This descriptor must match the VertexData struct layout.
    // Add more attributes if you need them (e.g. colors, tangents, etc.)
    var attributeDescriptor = new VertexAttributeDescriptor[] {
        new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, 0),
        new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 3, 0)
      };

    var vertexCount = vertexArray.Length;
    // Tell the mesh what format the vertex data will arrive in
    mesh.SetVertexBufferParams(vertexCount, attributeDescriptor);
    mesh.SetVertexBufferData(vertexArray, 0, 0, vertexCount, 0);

    // The format must match the VertexIndexData struct.
    // You could supply a native array of ints here;
    // using a struct instead lets you use it in DynamicBuffers.
    mesh.SetIndexBufferParams(vertexIndexArray.Length, IndexFormat.UInt32);
    mesh.SetIndexBufferData(vertexIndexArray, 0, 0, vertexIndexArray.Length);

    // This is hardcoded to a single submesh, but you can adapt this part to your use case
    mesh.subMeshCount = 1;
    var descr = new SubMeshDescriptor() {
      baseVertex = 0,
      bounds = default,
      indexCount = vertexIndexArray.Length,
      indexStart = 0,
      topology = MeshTopology.Triangles
    };
    mesh.SetSubMesh(0, descr);

    // If we want the correct bounds we must do this
    mesh.RecalculateBounds();
    return mesh;
  }
}

This lets you create a mesh with the new API either from another mesh, or from vertex data and vertex index data supplied via native arrays.
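For the "simple triangle" request above, a minimal sketch of calling CreateMesh from the utility (the positions and normals here are made up for illustration):

```csharp
using Unity.Collections;
using Unity.Mathematics;
using UnityEngine;

public static class TriangleExample {
    // Builds a single triangle with GenerateMeshUtility; all vertex values are illustrative.
    public static Mesh CreateTriangle() {
        var vertices = new NativeArray<GenerateMeshUtility.VertexData>(3, Allocator.Temp);
        vertices[0] = new GenerateMeshUtility.VertexData { Position = new float3(0, 0, 0), Normal = new float3(0, 0, -1) };
        vertices[1] = new GenerateMeshUtility.VertexData { Position = new float3(0, 1, 0), Normal = new float3(0, 0, -1) };
        vertices[2] = new GenerateMeshUtility.VertexData { Position = new float3(1, 1, 0), Normal = new float3(0, 0, -1) };

        // The implicit int conversion on VertexIndexData makes assignment painless.
        var indices = new NativeArray<GenerateMeshUtility.VertexIndexData>(3, Allocator.Temp);
        indices[0] = 0; indices[1] = 1; indices[2] = 2;

        var mesh = GenerateMeshUtility.CreateMesh(vertices, indices);
        vertices.Dispose();
        indices.Dispose();
        return mesh;
    }
}
```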


Thanks heaps for the example. Digging into it now.

Your example was super helpful, can't thank you enough. I got the mesh up and running with the exception of UVs. I can't see any obvious reason it's not working, but when I run Vector2[] t = _hexMesh.uv it's all zeros for some reason. Tracking the data, it appears the loss happens at SetVertexBufferData, or maybe UVs need something special.

public struct VertexData
{
    public float3 Position;
    public float3 Normal;
    public float4 Tangent;
    public float2 UVs;
    public int4 Color; 
};

...


var vertexAttrDescriptor = new VertexAttributeDescriptor[]
{
    new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, 0),
    new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 3,0),
    new VertexAttributeDescriptor(VertexAttribute.Tangent, VertexAttributeFormat.Float32, 4,0),
    new VertexAttributeDescriptor(VertexAttribute.TexCoord0, VertexAttributeFormat.Float32, 2,0),
    new VertexAttributeDescriptor(VertexAttribute.Color, VertexAttributeFormat.UInt32, 4,0)
};


...


_hexMesh.SetVertexBufferParams(vertexArray.Length, vertexAttrDescriptor);
_hexMesh.SetVertexBufferData(vertexArray, 0, 0, vertexArray.Length, 0);

_hexMesh.SetIndexBufferParams(vertexIndexData.Length, IndexFormat.UInt32);
_hexMesh.SetIndexBufferData(vertexIndexData, 0, 0,vertexIndexData.Length);

_hexMesh.subMeshCount = 1;
var descr = new SubMeshDescriptor() {
    baseVertex = 0,
    bounds = default,
    indexCount = vertexArray.Length,
    indexStart = 0,
    topology = MeshTopology.Triangles
};
_hexMesh.SetSubMesh(0, descr);
_hexMesh.RecalculateBounds();

// BUT
Vector2[] t = _hexMesh.uv; // all zero for some reason

How are you creating your vertex array? I don't have any problems with UVs.

Seems like you have to define Color before the UVs. The required attribute order, quoted from the Unity docs, is:
Position, Normal, Tangent, Color, TexCoord0–7, BlendWeight, BlendIndices.
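Applied to the struct above, a sketch of the corrected layout, with Color moved ahead of the UVs in both the struct and the descriptor array (same field types as in the original snippet):

```csharp
using Unity.Mathematics;
using UnityEngine.Rendering;

// Sketch: same fields as before, but Color now precedes UVs,
// matching the attribute order Unity requires.
public struct VertexData {
    public float3 Position;
    public float3 Normal;
    public float4 Tangent;
    public int4 Color;   // moved before the UVs
    public float2 UVs;
}

public static class VertexLayout {
    // The descriptor array must list attributes in the same required order as the struct.
    public static readonly VertexAttributeDescriptor[] Descriptors = {
        new VertexAttributeDescriptor(VertexAttribute.Position, VertexAttributeFormat.Float32, 3, 0),
        new VertexAttributeDescriptor(VertexAttribute.Normal, VertexAttributeFormat.Float32, 3, 0),
        new VertexAttributeDescriptor(VertexAttribute.Tangent, VertexAttributeFormat.Float32, 4, 0),
        new VertexAttributeDescriptor(VertexAttribute.Color, VertexAttributeFormat.UInt32, 4, 0),
        new VertexAttributeDescriptor(VertexAttribute.TexCoord0, VertexAttributeFormat.Float32, 2, 0),
    };
}
```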

Interesting, thanks again. I'll give this another try tonight.

Yup, this was the problem. Got it all working!

https://www.youtube.com/watch?v=Xyr_rFcEAoE


I don't understand the term sub-mesh. Is it equivalent to Maya mesh shells?