New Graphics.DrawProcedural does not work

Hello,
in my project I’m creating procedural meshes with a ComputeShader, and the vertex data is stored in a ComputeBuffer on the GPU. Now I need to render the mesh somehow.

I’ve searched for examples of how to render directly on the GPU without using a MeshRenderer component, and found this: GitHub - keijiro/NoiseBall3: A Unity example that shows how to use the new implementation of DrawProcedural.
It says that starting from Unity 2019.1, Graphics.DrawProcedural was improved and allows draw calls inside the standard render pipeline, with support for lights and shadows.
So I wrote a simple shader just to make the mesh visible, assigned it to a material, and put the Graphics.DrawProcedural call inside a function that runs every Update of my terrain chunk:

public void DrawMesh()
{
    m_meshMaterial.SetBuffer("_MeshBuffer", m_meshBuffer);
    Graphics.DrawProcedural(m_meshMaterial,
                            new Bounds(Vector3.zero, Vector3.one * 500),
                            MeshTopology.Triangles,  m_maxVertices, 1, null, null,
                            UnityEngine.Rendering.ShadowCastingMode.TwoSided, true,
                            0);
}

m_meshBuffer contains vertices with world-space coordinates previously calculated by the ComputeShader.
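For reference, the buffer setup on the C# side looks roughly like this (a minimal sketch; the Vertex struct layout, kernel name, and thread-group size are assumptions chosen to match the shader below, not the project’s exact code):

```csharp
// Matches the shader-side struct: float4 position + float3 normal = 28-byte stride.
struct Vertex
{
    public Vector4 position;
    public Vector3 normal;
}

void InitMeshBuffer()
{
    m_meshBuffer = new ComputeBuffer(m_maxVertices, 28);

    // Bind the output buffer and run the (hypothetical) "GenerateMesh" kernel.
    int kernel = m_computeShader.FindKernel("GenerateMesh");
    m_computeShader.SetBuffer(kernel, "_MeshBuffer", m_meshBuffer);
    m_computeShader.Dispatch(kernel, m_maxVertices / 64, 1, 1);
}
```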

m_meshMaterial is assigned this shader:

Shader "Materials/World/MyDrawMesh"
{
    SubShader
    {
        Cull off
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct Vertex
            {
                float4 position;
                float3 normal;
            };
          
            StructuredBuffer<Vertex> _MeshBuffer;

            struct v2f
            {
                float4 vertex : SV_POSITION;
                float3 color : COLOR;
              
            };

            v2f vert (uint id : SV_VertexID)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(float4(_MeshBuffer[id].position.xyz,
                                                       1.0f));
                o.color = dot(float3(0, 1, 0), _MeshBuffer[id].normal) * 0.5 + 0.5;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                return float4(i.color, 1.0f);
            }
            ENDCG
        }
    }
}

Unfortunately this approach draws nothing, even though I do see the draw calls in the render pipeline.

Now, if I use the old implementation of Graphics.DrawProcedural (Graphics.DrawProceduralNow), called in the OnPostRender method of my main camera with the same shader, I get results:

private void OnPostRender()
{
    m_meshMaterial.SetBuffer("_MeshBuffer", m_meshBuffer);
    m_meshMaterial.SetPass(0);
    Graphics.DrawProceduralNow(MeshTopology.Triangles, m_maxVertices);
}

But this is not the approach I want, because the mesh is rendered outside the render pipeline and therefore misses lights and shadows.

How can I render the mesh using the new DrawProcedural method? The author of the NoiseBall example uses a surface shader, but during compilation that is converted into a standard vertex/fragment shader anyway. Maybe my shader is missing some pragmas or tags. I’m new to shaders, so that wouldn’t be surprising.

OK, after several attempts I found the solution.
When I call m_meshMaterial.SetBuffer("_MeshBuffer", m_meshBuffer); I set the buffer on a material shared by all draw calls, and my last terrain chunk, the one that sets the buffer last, is empty (contains no vertices). All draws therefore share that data and draw nothing. The key is to use a MaterialPropertyBlock:

public void DrawMesh()
{
    MaterialPropertyBlock block = new MaterialPropertyBlock();
    block.SetBuffer("_MeshBuffer", m_meshBuffer);
    Graphics.DrawProcedural(m_meshMaterial,
                            new Bounds(Vector3.zero, Vector3.one * 500),
                            MeshTopology.Triangles, m_maxVertices, 1, null, block);
}

This allows setting specific material properties (in my case, the vertex buffer) for individual draw calls.


Hi, when providing a ComputeBuffer to a shader I faced a problem like yours: only the last object is drawn. DrawProceduralNow didn’t solve it, and a MaterialPropertyBlock yields the same result. I’ve looked everywhere and can’t find a solution.
Using a copy of the material (via new Material(original)) for each object also didn’t solve it.

Hope that someone passing by knows how to solve this.

Here is the draw call:

MaterialPropertyBlock propertyBlock = new MaterialPropertyBlock();
propertyBlock.SetBuffer("_verts3d", buffer_tris3d);

Graphics.DrawProcedural(_material,
                        bounds,
                        MeshTopology.Triangles, tris3d_count, 1, null, propertyBlock,
                        ShadowCastingMode.On, true, 0);

And my Shader

Shader "Custom/slice4DSurf"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
    }
    SubShader
    {
        Cull off
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        // Physically based Standard lighting model, and enable shadows on all light types
        #pragma surface surf Standard vertex:vert addshadow

        // Use shader model 3.0 target, to get nicer looking lighting
        #pragma target 5.0


        struct Vertex3d
        {
            float3 p;
            float3 n;
        };

#ifdef SHADER_API_D3D11
        StructuredBuffer<Vertex3d> _verts3d;
#endif

        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex;
        };

        struct mydata_full
        {
            float4 vertex    : POSITION;  // The vertex position in model space.
            float3 normal    : NORMAL;    // The vertex normal in model space.
            float4 texcoord  : TEXCOORD0; // The first UV coordinate.
            float4 texcoord1 : TEXCOORD1; // The second UV coordinate.
            float4 texcoord2 : TEXCOORD2; // The third UV coordinate.
            float4 tangent   : TANGENT;   // The tangent vector in Model Space (used for normal mapping).
            float4 color     : COLOR;     // Per-vertex color
            uint id : SV_VertexID;
        };

        half _Glossiness;
        half _Metallic;
        fixed4 _Color;

        // Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
        // See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
        // #pragma instancing_options assumeuniformscaling
        UNITY_INSTANCING_BUFFER_START(Props)
            // put more per-instance properties here

        UNITY_INSTANCING_BUFFER_END(Props)



        void vert(inout mydata_full v) {
            #ifdef SHADER_API_D3D11
            v.vertex = float4(_verts3d[v.id].p,1.0f);
            v.normal = _verts3d[v.id].n;
            #endif
            //v.vertex.xyz += v.normal * 0.1f;
        }

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            // Albedo comes from a texture tinted by color
            fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
            o.Albedo = c.rgb;
            // Metallic and smoothness come from slider variables
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}

Posting my solution here, because there is little about this on the internet.

The problem wasn’t in the shader, nor in the DrawProcedural call; it was in the process of filling the ComputeBuffer with a ComputeShader. I was using the same ComputeShader for each object, but the buffers were set only at initialization, which made the last buffer the only one bound to the shader. I solved this by calling SetBuffer each time before dispatching the ComputeShader.
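In other words, the fix amounts to something like this (a sketch; the collection, field, and kernel names are placeholders, not the actual project code):

```csharp
// Rebind each object's buffer to the shared ComputeShader right before
// dispatching, instead of binding once at initialization.
foreach (var obj in objects)
{
    m_computeShader.SetBuffer(m_kernel, "_verts3d", obj.vertexBuffer);
    m_computeShader.Dispatch(m_kernel, obj.threadGroups, 1, 1);
}
```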

Also, if anyone is interested in using a counter without reading buffers back on the CPU: I used ComputeBuffer.CopyCount to copy the count of the append buffer into an indirect-arguments buffer and passed that to DrawProceduralIndirect. Now I get more than 60 FPS in a scene with realtime GPU-generated geometry.
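That setup might look roughly like this (a sketch; the buffer names are illustrative, and it assumes the append buffer counts single vertices — if it appends whole triangles, the copied count would need to be multiplied by 3 in a small compute pass first):

```csharp
// Indirect args layout for DrawProceduralIndirect:
// vertex count, instance count, start vertex, start instance.
var argsBuffer = new ComputeBuffer(1, 4 * sizeof(uint),
                                   ComputeBufferType.IndirectArguments);
argsBuffer.SetData(new uint[] { 0, 1, 0, 0 });

// Copy the append buffer's element count into slot 0 (byte offset 0)
// without any CPU readback.
ComputeBuffer.CopyCount(appendBuffer, argsBuffer, 0);

Graphics.DrawProceduralIndirect(_material, bounds, MeshTopology.Triangles,
                                argsBuffer, 0, null, propertyBlock,
                                ShadowCastingMode.On, true, 0);
```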
