Difference between texture GPU update time and Time.time

Hi,

Below is a part of my first (ever) shader graph. It's supposed to calculate the alpha value based on a pixel's blue channel, which stores the time when that pixel was modified. The pixel is set in the script with `texture.SetPixel(x, y, new Color(r, g, Time.time / 1000))`, called only when I trigger it manually.
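
For context, the script side is roughly this (the method name is just for illustration); the Apply() call is what actually uploads the modified texel to the GPU:

```csharp
// Rough sketch of the script-side write (MarkPixelChanged is just an illustrative name).
void MarkPixelChanged(Texture2D texture, int x, int y, float r, float g)
{
    // Blue channel stores the change time, scaled down by 1000.
    texture.SetPixel(x, y, new Color(r, g, Time.time / 1000f));
    // SetPixel only modifies the CPU-side copy; Apply() uploads it to the GPU.
    texture.Apply(false);
}
```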

The alpha should be calculated as:

  • 0 when the time is smaller than the (intended) change time,
  • 1 when the time is greater than the change time + Appear After (an exposed property in the shader that sets how long it takes the alpha to go from 0 to 1),
  • between 0 and 1, somewhat smoothly, while the time is within the Appear After window (sketched below).
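
In script terms, the mapping I am trying to build in the graph is roughly this (changeTime is the decoded blue channel times 1000, appearAfter is the exposed property; SmoothStep stands in for the "somewhat smoothly" part):

```csharp
// Illustrative C# equivalent of the intended graph math.
float Alpha(float time, float changeTime, float appearAfter)
{
    // 0 at changeTime, 1 at changeTime + appearAfter, smooth ramp in between.
    float t = Mathf.Clamp01((time - changeTime) / appearAfter);
    return Mathf.SmoothStep(0f, 1f, t);
}
```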

This works at the beginning, when the change time is 0. But after that it sometimes works, sometimes not at all, and sometimes takes much longer than Appear After (the 0…1 alpha transition itself lasts Appear After, but the pixel stays at alpha 0 for a while before it starts appearing).

Is something wrong with the graph, or is there an issue with the time it takes for the texture update to reach the GPU? Another possibility is that I read the blue channel wrong because of some texture interpolation (though the texture has no mipmaps, no compression and no filtering, and is a power-of-two size; and the r and g channels are accurate, at least to 0.01 precision).

Btw, the texture is small, just 2K pixels, and it only changes once in a while.

Assuming the texture has no compression at all, is sampled with point filtering at mip 0, and the blue channel is retrieved correctly, you are probably facing float precision issues (the divide and multiply by 1000) - that is my guess.
However, there is one more thing - the texture format. If your texture uses the default one, each channel is encoded with 8 bits, which means only 256 possible values.
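
To put numbers on it: storing Time.time / 1000 in an 8-bit channel quantizes the change time to steps of 1000 / 255 ≈ 3.9 seconds, which would explain the "stays at alpha 0 for a while" behaviour. A rough sketch of the round trip (assuming the default RGBA32 format):

```csharp
// Round trip of the blue channel through an 8-bit texture channel.
float stored    = Time.time / 1000f;                  // e.g. 7.3 s  -> 0.0073
byte  quantized = (byte)Mathf.Round(stored * 255f);   // 0.0073 * 255 ≈ 2
float readBack  = quantized / 255f * 1000f;           // 2 / 255 * 1000 ≈ 7.8 s
// The shader only ever sees the change time in ~3.9 s steps.
```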

By the way, setting pixels this way is really not performant - why don't you simply pass a single value as a shader parameter?
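
Something along these lines, assuming the change time really is a single value for the whole material (the property name is made up):

```csharp
// Script side: set the change time directly as a material property
// (assumes a Renderer reference; "_ChangeTime" is a made-up reference name).
GetComponent<Renderer>().material.SetFloat("_ChangeTime", Time.time);
// In Shader Graph, expose a Float property whose reference name is "_ChangeTime".
```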

Thank you.

Yes, it was a bit-count issue - I was oblivious to the fact that the float channels in my shader are effectively (float)(byte)(float channel) of what the script writes :roll_eyes:. I now use all three channels to pass the change time with 100 ms precision.
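
Roughly like this, in case it helps someone else - the time in 0.1 s units is split across the three 8-bit channels:

```csharp
// Pack the change time (100 ms precision) into three 8-bit channels (24 bits total).
Color EncodeChangeTime(float time)
{
    int ticks = Mathf.RoundToInt(time * 10f);       // 100 ms units
    float r = ((ticks >> 16) & 0xFF) / 255f;
    float g = ((ticks >> 8)  & 0xFF) / 255f;
    float b = ( ticks        & 0xFF) / 255f;
    return new Color(r, g, b);
}
// The graph decodes it the other way round: (r*255*65536 + g*255*256 + b*255) * 0.1.
```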

I read somewhere that it's not possible to pass arrays into Shader Graph - is it? So I need to pass them (4K pixels of data, about 12 KB) as a texture to the shader. Of course each fragment only uses 4 pixels, depending on its UV.

It is possible, but Shader Graph does not support it out of the box; a Custom Function node is required.
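
For what it's worth, a rough sketch of how that could look on the script side (the name and array size are placeholders); the Custom Function node's HLSL would declare the matching `float _ChangeTimes[1024];` and index it based on the UV:

```csharp
// Script side: upload the per-cell change times as a global float array.
// "_ChangeTimes" and the size of 1024 are placeholders.
float[] changeTimes = new float[1024];
// ... fill changeTimes when cells change ...
Shader.SetGlobalFloatArray("_ChangeTimes", changeTimes);
```

Keep in mind that shader array uniforms live in constant buffers (each element usually padded to 16 bytes), so for a few thousand values the small data texture you already have is often the more robust option anyway.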