I can’t seem to get Buffers to work in WebGL 2.0. I know compute shaders and StructuredBuffers aren’t supported in WebGL 2.0, but regular buffers seem like they should work.
I have a shader that procedurally places cubes with the following code:
cubeMat.SetBuffer("positionBuffer", positionsBuffer);
Graphics.DrawMeshInstancedProcedural(cube, 0, cubeMat, bounds, CubeCount);
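For context, the buffer itself is created along these lines (a minimal sketch; the positions array and exact names are placeholders for what my project actually does):

```csharp
// Hypothetical setup: one float3 (12 bytes) per instance.
ComputeBuffer positionsBuffer = new ComputeBuffer(CubeCount, sizeof(float) * 3);

// positions is a Vector3[] of length CubeCount, filled elsewhere.
positionsBuffer.SetData(positions);
```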
The shader code is as follows:
Shader "ProceduralShader" {
    Properties {
        _Size("Size", Float) = 1
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float _Size;
            Buffer<float3> positionBuffer;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float3 basePos : TEXCOORD0;
            };

            v2f vert(appdata_base v, uint instanceID : SV_InstanceID)
            {
                float3 data = positionBuffer[instanceID];
                float3 worldPosition = data + v.vertex.xyz * _Size;
                v2f o;
                o.pos = mul(UNITY_MATRIX_VP, float4(worldPosition, 1.0f));
                o.basePos = v.vertex.xzy;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                return float4(i.basePos, 1);
            }
            ENDCG
        }
    }
}
In the Unity Editor, the system works as expected. However, the shader fails in a WebGL player:
The browser console (in both Chrome and Edge on a Windows PC) outputs the following error:
ERROR: 0:2: 'GL_EXT_texture_buffer' : extension is not supported
ERROR: 0:20: 'samplerBuffer' : Illegal use of reserved word
ERROR: 0:20: 'samplerBuffer' : syntax error
This is all very mysterious to me. Any ideas what I’m doing wrong?
My project is available here if you’d like to reproduce the issue: