We do have Material.SetFloatArray(), but we do not have Material.SetIntArray(). We have Material.SetVector(), but we don't have Material.SetVectorInt(). Also, we have ComputeShader.SetBool() and ComputeShader.SetBools(), but we don't have Material.SetBool().
A more complete API would be so nice.
I recently wrote a shader that needed a Vector4Int as a parameter, and I had to make four separate int parameters for it. As Obi-Wan Kenobi would say: "So uncivilized."
Unity’s Material class does not actually support ints or bools. All values are internally stored as floats.
That means Material.SetInt() is internally implemented as:
void SetInt(int id, int intValue)
{
    SetFloat(id, (float)intValue);
}
Similarly Material.GetInt() is implemented as:
int GetInt(int id)
{
    return (int)GetFloat(id);
}
If a shader uses an int or bool uniform, the internal rendering code converts the floats to the appropriate format when sending them to the GPU. Similarly, Unity's Material class only supports four-component float vectors, but it will convert the values to the appropriate format and vector size when sending them to the GPU.
Yes. And as I said, Unity converts the floats to the appropriate format when it passes them along to the GPU. It does mean that integers are limited to the precision of a 32-bit float when handled this way. This all exists for legacy reasons: material values are serialized, and because the Material class predates GPU integer support, Unity chose to add integers in this automated way rather than change the Material class itself once GPUs did support them.
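To make that precision limit concrete, here's a minimal sketch (the "_Count" property and the Unlit/Color shader are just placeholders) showing an int above 2^24 failing to round-trip:

Material mat = new Material(Shader.Find("Unlit/Color"));
mat.SetInt("_Count", 16777217);   // 2^24 + 1 is not exactly representable as a 32-bit float
int back = mat.GetInt("_Count");  // comes back as 16777216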
The way around that limitation, if you want more direct control over the bytes being sent to the shader, is to use structured buffers via material.SetBuffer() or to set entire constant buffers via material.SetConstantBuffer(). Both use a ComputeBuffer on the C# side and pass the assigned bytes to the GPU as-is.
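As a sketch of the structured-buffer approach (the struct layout and the "_IntParams" property name are assumptions here; both must match a StructuredBuffer declared in the shader):

using System.Runtime.InteropServices;
using UnityEngine;

public class IntParamsExample : MonoBehaviour
{
    // Assumed layout; must match the shader-side struct exactly.
    [StructLayout(LayoutKind.Sequential)]
    struct IntParams
    {
        public int x, y, z, w; // one "Vector4Int" worth of data
    }

    [SerializeField] Material material;
    ComputeBuffer buffer;

    void OnEnable()
    {
        buffer = new ComputeBuffer(1, Marshal.SizeOf<IntParams>());
        buffer.SetData(new[] { new IntParams { x = 1, y = 2, z = 3, w = 4 } });
        material.SetBuffer("_IntParams", buffer); // bytes go to the GPU as-is
    }

    void OnDisable()
    {
        buffer?.Release(); // ComputeBuffers must be released manually
        buffer = null;
    }
}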
Compute shaders, of course, came along well after integer formats were natively supported by GPUs, and there's no serialization layer the values pass through, so the various computeShader.Set*() functions already pass the unmodified byte values to the GPU as-is.
I understand why it’s annoying that they’re different, but they’re different for a reason.
I understand the reasons. I've been programming in GLSL since 2006, and I know how difficult it is to make an API future-proof and then keep it backwards compatible.
I do appreciate your explanations, though, so thank you for taking the time.
So I just have a single shader property, and I'm passing a ComputeBuffer with the structure from my C# script. I have all my actual properties in the structure itself.
There is a little downside to this - I now have to manage the ComputeBuffer in my code and remember to release it. For simple use cases, having ComputeShader.SetVectorInt would be nice, since we already have SetVector and other convenience methods there.
Hi!
We are aware of the API inconsistencies, but we have higher-priority things to work on.
There’s a new SetInteger API (2021.2+) that you can use - it’s backed by real 32-bit ints under the hood.
No integer vectors yet, though.
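If you're on 2021.2 or newer, usage looks like this (the "_Steps" property name is just a placeholder; it must be declared as an Integer property on the shader):

material.SetInteger("_Steps", 42);          // stored as a real 32-bit int
int steps = material.GetInteger("_Steps");  // round-trips exactly, no float conversion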