Is there a way to create a shader property in which we pass in an array of values?
uniform float4 valuableData[40];
Textures are NOT an option as I need the precision of a Single, not an 8-bit texture channel.
Not to my knowledge, and it's one of the more frustrating aspects of Unity shaders.
It's not clear to me what a 'Single' is, but I assume you mean a float? In which case, assuming you can't use any other inputs such as uv2 or color, you are stuck with textures, though there is no reason why you couldn't pack the float data into an RGBA image and retrieve it in the shader. Although packing a float is somewhat harder than, say, an int.
Maybe you should mention what kind of data this array contains? ^^
I want to influence mesh data in the vertex shader based on eigenvectors.
I did find this in the UnityCG.cginc file:
inline float4 EncodeFloatRGBA( float v )
{
    float4 kEncodeMul = float4(1.0, 255.0, 65025.0, 16581375.0);
    float kEncodeBit = 1.0/255.0;
    float4 enc = kEncodeMul * v;
    enc = frac (enc);
    enc -= enc.yzww * kEncodeBit;
    return enc;
}
inline float DecodeFloatRGBA( float4 enc )
{
    float4 kDecodeDot = float4(1.0, 1/255.0, 1/65025.0, 1/16581375.0);
    return dot( enc, kDecodeDot );
}
So I could conceivably pack a single-precision float into a single RGBA pixel… their EncodeFloatRGBA function doesn't have a special case for 1.0, probably to avoid branching on the GPU, but if I build the texture on the CPU this may be adequate:
(note: I use python for proofing out concepts, so forgive my weirdness)
from math import modf

def EncodeFloatRGBA( f ):
    # special-case 1.0, since frac() would map it to 0
    # (needs abs(), otherwise every f < 1.0 would trigger it)
    if abs(f - 1.0) < 0.000000001:
        return [1.0, 0.0, 0.0, 0.0]
    kEncodeMul = [1.0, 255.0, 65025.0, 16581375.0]
    kEncodeBit = 1.0/255.0
    enc = [modf(m * f)[0] for m in kEncodeMul]  # scale, then frac
    yzww = [enc[1], enc[2], enc[3], enc[3]]
    enc = [e - y * kEncodeBit for e, y in zip(enc, yzww)]
    return enc
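For a quick sanity check, here's a sketch of the decode side in the same Python proof-of-concept style, with the encode repeated so the snippet runs standalone. Note this round-trips full-precision floats only; writing into a real RGBA32 texture would additionally quantize each channel to 8 bits, which costs some of the recovered precision.

```python
from math import modf

def EncodeFloatRGBA(f):
    # mirrors the UnityCG.cginc encode, with a CPU-side special case for 1.0
    if abs(f - 1.0) < 0.000000001:
        return [1.0, 0.0, 0.0, 0.0]
    kEncodeMul = [1.0, 255.0, 65025.0, 16581375.0]
    kEncodeBit = 1.0 / 255.0
    enc = [modf(m * f)[0] for m in kEncodeMul]  # frac(kEncodeMul * v)
    yzww = [enc[1], enc[2], enc[3], enc[3]]
    return [e - y * kEncodeBit for e, y in zip(enc, yzww)]

def DecodeFloatRGBA(enc):
    # mirrors UnityCG.cginc's DecodeFloatRGBA: dot(enc, kDecodeDot)
    kDecodeDot = [1.0, 1.0 / 255.0, 1.0 / 65025.0, 1.0 / 16581375.0]
    return sum(e * k for e, k in zip(enc, kDecodeDot))

# round-trip check over a few values in [0, 1]
for v in [0.0, 0.25, 0.7654321, 1.0]:
    assert abs(DecodeFloatRGBA(EncodeFloatRGBA(v)) - v) < 1e-6
```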
Ah, good find, that.
I guess since all the data is in floats on the GPU, the maths is a lot easier. For some reason I was thinking of having to use fixed-point math, which can be a nightmare to work out.
Definitely worth a try, I would have thought, especially since there really isn't any real alternative as far as I'm aware.
But even if you build the texture, you won't be able to sample from it in the vertex program.
'Course you can; support for vertex texture fetch was added some time ago. Just make sure you set the shader to use '#pragma target 3.0'.
That's good to know. I knew it was possible in general, but it was not working in Unity. Thank you for correcting me. Maybe now I can finally do Perlin noise in the vertex shader by sampling from a noise texture…
Yep, it definitely works, though if you need it to work in OpenGL you also need to add #pragma glsl. Tested on Windows 7, but not on a Mac.