I’m passing an array of bytes into a compute shader buffer. As far as I can tell, shaders can’t work with bytes directly, so I’m treating the data as an array of unsigned integers in the shader and pulling the individual bytes out of the 32-bit values when I need them (rough sketch of that at the end of this post). But that’s beside the point; my problem arises early in the script. I pass in an array that’s fifty zeros followed by fifty threes, and I’ve double-checked that this is correct on the CPU C# side of the exchange. However, when it arrives in the shader, I get:
0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 11000000110000000000000000, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011, 11000000110000001100000011
Reading those values left to right as packed bytes, that’s 48 bytes’ worth of zeros, then two threes, then two zeros, then 48 threes. What in the world could be causing this? Any help is appreciated.
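For reference, the shader-side byte extraction looks roughly like this. It’s a trimmed sketch rather than my exact code; the buffer and function names (_PackedBytes, GetByte) are placeholders, and it assumes the bytes sit inside each uint in little-endian order.

StructuredBuffer<uint> _PackedBytes;          // placeholder name: the raw byte data viewed as 32-bit uints

uint GetByte(uint byteIndex)
{
    uint word  = _PackedBytes[byteIndex / 4]; // four bytes are packed into each uint
    uint shift = (byteIndex % 4) * 8;         // byte offset within that word (little-endian)
    return (word >> shift) & 0xFF;            // mask off the single byte I want
}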