How to sample a Texture2D<uint2> on Web

I am trying to use a custom shader to perform the fullscreen blit in URP on Web.
I have an unsigned-integer texture, defined like this:

Texture2D<uint2> _CameraStencil;

and this exact line of shader code:

uint st = LOAD_TEXTURE2D(_CameraStencil, int2(floor(input.texcoord.x * _CameraColorTexture_TexelSize.z), floor(input.texcoord.y * _CameraColorTexture_TexelSize.w))).STENCIL_CHANNEL;

generates this browser error:
GL_INVALID_OPERATION: Mismatch between texture format and sampler type (signed/unsigned/float/shadow).
and makes the screen stay black.

How does that even compile? Texture2D is not a generic type.

This shader code compiles and works in Editor and also on Android and Windows, but not in WebGL.

Oh, that Texture2D is inside the shader? I thought it was your C# definition.

I see it now! :slight_smile:

The Texture2D is declared as uint2 (unsigned), but in the macro you use int2 (signed). Hence the mismatch.
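For context, here is what that error means on the WebGL side. This is only a sketch of the GLSL ES 3.00 pairing the driver is complaining about; the uniform name is carried over from the shader above, and everything else is illustrative, not what Unity's cross-compiler actually emits:

```glsl
#version 300 es
// An integer texture (e.g. R8UI/RG8UI) may only be read through a
// matching unsigned sampler; binding it through a sampler of the wrong
// signedness raises GL_INVALID_OPERATION at draw time.
uniform highp usampler2D _CameraStencil;

uvec4 fetchStencil(ivec2 coord)
{
    // texelFetch on a usampler2D returns uvec4. Note the coordinate
    // itself is a signed ivec2 regardless of the sampler's type.
    return texelFetch(_CameraStencil, coord, 0);
}
```

In other words, the error is about how the texture's format pairs with the sampler type the compiled shader declares, which is why it only shows up once the GL driver validates the draw call.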

Thank you, but you don’t know how to fix it, do you?

This is an obstacle on WebGL only.

I have also tried this, but the result is the same:

uint st = _CameraStencil.Load(int3(floor(input.texcoord.x * _CameraColorTexture_TexelSize.z), floor(input.texcoord.y * _CameraColorTexture_TexelSize.w), 0)).STENCIL_CHANNEL;

Does this work?

uint st = LOAD_TEXTURE2D(_CameraStencil, uint2(floor(input.texcoord.x * _CameraColorTexture_TexelSize.z), floor(input.texcoord.y * _CameraColorTexture_TexelSize.w))).STENCIL_CHANNEL;

I only changed it from int2 to uint2.
You may also want to try the other way round: declare it as Texture2D.
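For reference, the "other way round" variant might look like this on the HLSL side. This is a sketch only: it assumes the Core RP texture macros are already included, that _CameraColorTexture_TexelSize and STENCIL_CHANNEL are defined as in the posts above, and that URP actually binds the stencil as a float-readable texture on Web, which is unverified:

```hlsl
// Assumed to be declared elsewhere, as in the original shader:
// float4 _CameraColorTexture_TexelSize;  // zw = texture width/height

// Sketch: declare the stencil texture without the integer template
// parameter, so the generated GLSL uses a plain (float) sampler2D.
Texture2D _CameraStencil;

uint SampleStencil(float2 texcoord)
{
    int2 coord = int2(floor(texcoord.x * _CameraColorTexture_TexelSize.z),
                      floor(texcoord.y * _CameraColorTexture_TexelSize.w));
    // LOAD_TEXTURE2D now yields float data, so cast the channel back.
    return (uint)LOAD_TEXTURE2D(_CameraStencil, coord).STENCIL_CHANNEL;
}
```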

Thanks again.
I have tried various int/uint combinations, and it always works in the Editor but never on Web.