Can't pass an integer to a shader

I am passing a generated RGBA32 texture with Point filter mode to my shader, packing 8 bits per channel (32 bits total), but the values don't seem to come through accurately. For instance, if I write the bit pattern 00000001 into the red channel as new Color32(1, 0, 0, 0), then tex2D(_MainTex, uv0) * 255 in the shader doesn't give me 1. The same happens with any other bit I pack, and the errors seem random: almost every pattern comes back with the first bit set to 1, which is totally unexpected. I understand that every 0-255 byte gets mapped to a 0-1 float, but I can't see how that would lose information. Is there any workaround to read exact color values from a texture?


It turns out the issue is that I can't pass an integer into a shader property that is used for masking :c
The texture channels themselves seem fine.

Make sure the texture is set to use linear color space and not sRGB when you're creating it. After that, floating-point math can be funny, so I'd recommend rounding rather than flooring when converting back to an integer.

int4 values = (int4)(tex2D(_MainTex, uv) * 255.0 + 0.5);
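The round trip (byte → normalized float → byte) can be simulated outside the shader. Here's a minimal Python sketch using NumPy's float32 to approximate GPU single precision (an assumption; actual GPU rounding may differ), showing why adding 0.5 before truncation recovers every byte exactly:

```python
import numpy as np

# Simulate a linear (non-sRGB) texture read: byte b is stored as b / 255
# in single precision, then scaled back up in the shader.
for b in range(256):
    stored = np.float32(b) / np.float32(255.0)   # what the sampler returns
    scaled = stored * np.float32(255.0)          # shader-side * 255.0
    # Truncation alone can land on b - 1 when `scaled` ends up fractionally
    # below b; adding 0.5 first (i.e. rounding) always recovers b, because
    # the accumulated float32 error is far smaller than 0.5.
    assert int(scaled + np.float32(0.5)) == b
```

The key point is that the float32 error after dividing and re-multiplying by 255 is on the order of 1e-5, so rounding is always safe, while truncation is one tiny negative error away from being off by one.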

Awesome, thank you! But it still seems like I can't pass an integer into a shader property to use as a mask and compare against it

// in C#
mat.SetInt("_MyInt", myInt);
// this just casts the integer to a float; it's equivalent to
mat.SetFloat("_MyInt", (float)myInt);

// shader property
_MyInt ("My Integer", Int) = 1.0
// in the inspector this is treated the same as a float and doesn't force integer values

// shader uniform defined as uint; the native C++ side casts the float passed around in C# to an integer
uint _MyInt;
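To tie the two halves together: once the channel byte has been recovered with rounding, the mask test itself is ordinary integer bitwise logic. A sketch of the decode-then-mask flow in Python (the function name and values are illustrative, not Unity API):

```python
import numpy as np

def decode_channel(sampled: float) -> int:
    """Recover the original 0-255 byte from a normalized linear-texture
    sample, rounding to dodge float truncation errors."""
    return int(np.float32(sampled) * np.float32(255.0) + np.float32(0.5))

# Pack bit 2 (00000100) into a "red channel" the way Color32 would store it.
red_byte = 0b00000100
sampled = np.float32(red_byte) / np.float32(255.0)  # what tex2D would return

mask = 0b00000100
value = decode_channel(sampled)
assert value == red_byte          # the byte survives the round trip
assert (value & mask) != 0        # so the bitwise mask check is reliable
```

The same shape works in HLSL once the uniform actually arrives as an integer: decode the channel with the rounded cast, then apply `&` against the mask uniform.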

Awesome, thank you! Seems to be working now