I can’t seem to find working examples of how to use ComputeShader.SetConstantBuffer. An attempt was made in this thread, but I’m afraid no one saw it because it was posted in a different forum.
Well, I can’t get it to work either. Testing in Unity 2020.3.
Am I doing it wrong?
In the example below I get “Result is: 0. Should be: 3”.
using UnityEngine;
using System.Runtime.InteropServices;

public class ConstantBufferTest : MonoBehaviour
{
    [SerializeField] ComputeShader _cs;

    struct OnceConstants
    {
        public int value1;
        public int value2;
    }

    void Start()
    {
        int executeKernel = _cs.FindKernel( "Execute" );

        // Create and set buffers.
        int onceSize = Marshal.SizeOf( typeof( OnceConstants ) );
        var onceBuffer = new ComputeBuffer( 1, onceSize, ComputeBufferType.Constant );
        var resultBuffer = new ComputeBuffer( 1, sizeof( int ) );
        _cs.SetBuffer( executeKernel, "_ResultBuffer", resultBuffer );
        _cs.SetConstantBuffer( "Once", onceBuffer, offset: 0, onceSize );

        // Upload constants.
        onceBuffer.SetData(
            new OnceConstants[]{
                new OnceConstants() {
                    value1 = 1,
                    value2 = 2,
                }
            }
        );

        // Execute shader.
        _cs.Dispatch( executeKernel, 1, 1, 1 );

        // Read back and log.
        int[] resultData = new int[ 1 ];
        resultBuffer.GetData( resultData );
        Debug.Log( "Result is: " + resultData[0] + ". Should be: " + ( 1 + 2 ) );

        // Clean up after the party.
        onceBuffer.Release();
        resultBuffer.Release();
    }
}
And the ComputeShader:
#include "UnityCG.cginc"
#pragma kernel Execute
RWStructuredBuffer<int> _ResultBuffer;
CBUFFER_START( Once )
int value1;
int value2;
CBUFFER_END
[numthreads(1,1,1)]
void Execute()
{
_ResultBuffer[ 0 ] = value1 + value2;
//_ResultBuffer[ 0 ] = 1 + 2; // This works
}
Thank you for the compiled shader, @grizzly. It is identical to the one I get in 2021.1.28f1. But it does not work.
I have updated my graphics driver (RTX2080). Updated Windows. Restarted. Reinstalled Unity. Clean project. No packages, no nothing. I am really out of ideas at this point.
EDIT:
I also tried using Visual Studio Code instead of Visual Studio Editor.
That is strange. Tested here on an old Win 7 DX11 1060 machine using Unity 2021.1.3f1. Previous versions have also worked for me, so I would suggest filing a bug.
Testing the script above in the newly published Unity 2021.2.17f1 on Windows, Editor runtime.
- Target platform “Windows, Mac, Linux”, Graphics API DX11: not working.
- Target platform “Windows, Mac, Linux”, Graphics API Vulkan: not working.
- Target platform “Android”, Editor (“Windows, Mac, Linux”) Graphics API DX11, Android Graphics API OpenGLES3: not working.
- Target platform “Android”, Editor (“Windows, Mac, Linux”) Graphics API DX11, Android Graphics API Vulkan: not working, and sometimes crashing.
- Target platform “Android”, Editor (“Windows, Mac, Linux”) Graphics API Vulkan (note: Oculus XR requires DX11), Android Graphics API Vulkan: not working.
Same here, it does not work with Unity 2021.2.10. Constant buffer values are just 0.
Cbuffers do work in a standalone application outside of Unity, though.
Is there an alternative way to set a block of data in a compute shader? In OpenGL, for example, you can use structs as uniforms and upload the struct data. Is this also possible with compute shaders in D3D11, or only with cbuffers?
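For example, something along these lines is what I have in mind: uploading a struct per element through a regular structured ComputeBuffer instead of a constant buffer. This is only an untested sketch; the _Params name and the ParamsData struct are made up for illustration.

using UnityEngine;
using System.Runtime.InteropServices;

public class StructuredParamsTest : MonoBehaviour
{
    [SerializeField] ComputeShader _cs;

    [StructLayout( LayoutKind.Sequential )]
    struct ParamsData
    {
        public int value1;
        public int value2;
    }

    void Start()
    {
        int kernel = _cs.FindKernel( "Execute" );

        // One element, stride = struct size, default (structured) buffer type.
        var paramsBuffer = new ComputeBuffer( 1, Marshal.SizeOf( typeof( ParamsData ) ) );
        paramsBuffer.SetData( new ParamsData[]{ new ParamsData(){ value1 = 1, value2 = 2 } } );

        // The shader side would declare: StructuredBuffer<ParamsData> _Params;
        // and read _Params[ 0 ].value1 etc.
        _cs.SetBuffer( kernel, "_Params", paramsBuffer );
        _cs.Dispatch( kernel, 1, 1, 1 );

        paramsBuffer.Release();
    }
}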
Yep, same here. Constant buffers in compute shaders do not seem to work. Tried declaring them with cbuffer as well as the CBUFFER_START/CBUFFER_END macros, but their data is always zero. (Unity 2021.3.0f1, macOS)
Update: I got this case to work in 2021 LTS by calling SetConstantBuffer( "ConstantBuffer", constantBuffer, 0, sizeof( int ) * 4 ).
My compute buffer itself is declared with the same count and stride.
I think you need to bind a multiple of 16 bytes, so the 2 ints require padding, likely because of this: Packing rules for constant variables - Win32 apps | Microsoft Learn
Hope this helps people trying to make this work.
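Roughly, the working setup looks like this on my end. It mirrors the original example, with explicit padding fields added so the struct and the bound size come out to 16 bytes (the pad0/pad1 names are just placeholders):

using UnityEngine;
using System.Runtime.InteropServices;

public class PaddedConstantBufferTest : MonoBehaviour
{
    [SerializeField] ComputeShader _cs;

    [StructLayout( LayoutKind.Sequential )]
    struct OnceConstants
    {
        public int value1;
        public int value2;
        public int pad0; // Padding so the struct is 16 bytes,
        public int pad1; // per the D3D constant buffer packing rules.
    }

    void Start()
    {
        int executeKernel = _cs.FindKernel( "Execute" );

        int onceSize = Marshal.SizeOf( typeof( OnceConstants ) ); // 16 bytes.
        var onceBuffer = new ComputeBuffer( 1, onceSize, ComputeBufferType.Constant );
        var resultBuffer = new ComputeBuffer( 1, sizeof( int ) );
        _cs.SetBuffer( executeKernel, "_ResultBuffer", resultBuffer );

        // Bind the full 16 byte range; smaller sizes are what failed for me.
        _cs.SetConstantBuffer( "Once", onceBuffer, 0, onceSize );

        onceBuffer.SetData( new OnceConstants[]{ new OnceConstants(){ value1 = 1, value2 = 2 } } );
        _cs.Dispatch( executeKernel, 1, 1, 1 );

        int[] resultData = new int[ 1 ];
        resultBuffer.GetData( resultData );
        Debug.Log( "Result is: " + resultData[ 0 ] + ". Should be: 3" );

        onceBuffer.Release();
        resultBuffer.Release();
    }
}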
For the people at Unity: rather than the fix that is apparently in 2022.2.0a14, wouldn’t it be better to check that the size parameter is a multiple of 16 and give a warning or error if it isn’t, and/or pad the data automatically? (The latter might not be desirable; I’m sure you have a lot of additional constraints on your side.)
I just returned to the issue, and it seems my original example works in 2021.3.37f1 if I only change Unity’s CBUFFER_START/CBUFFER_END constant buffer declaration macros to a plain cbuffer {} declaration, like in the example by @Przemyslaw_Zaworski.
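For reference, the shader-side change is just the declaration; the kernel body stays the same as in my original example:

#pragma kernel Execute

RWStructuredBuffer<int> _ResultBuffer;

// Plain cbuffer declaration instead of the CBUFFER_START/CBUFFER_END macros.
cbuffer Once
{
    int value1;
    int value2;
};

[numthreads(1,1,1)]
void Execute()
{
    _ResultBuffer[ 0 ] = value1 + value2;
}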
I also noted that the 16-byte alignment is not necessary on the macOS platform, if that is the only platform you are targeting. I didn’t test the performance difference though.