Hi,
Below is a part of my first (ever) shader graph. It's supposed to calculate the alpha value from a pixel's blue channel, which stores the time when that pixel was modified. That pixel is set in the script with `texture.SetPixel(x, y, new Color(r, g, Time.time / 1000f))`, called only when I trigger it manually.
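For context, here is a minimal sketch of what the write side looks like (the class, method, texture size, and format are my assumptions, not the actual code):

```csharp
using UnityEngine;

public class ChangeTimeWriter : MonoBehaviour
{
    Texture2D texture;

    void Awake()
    {
        // Matches the settings described below: uncompressed float format,
        // no mipmaps, point filtering, power-of-two size (64 x 32 = 2048 px).
        texture = new Texture2D(64, 32, TextureFormat.RGBAFloat, mipChain: false);
        texture.filterMode = FilterMode.Point;
    }

    // Triggered manually when a pixel changes.
    public void MarkChanged(int x, int y, float r, float g)
    {
        // Store the change time, scaled into 0..1, in the blue channel.
        texture.SetPixel(x, y, new Color(r, g, Time.time / 1000f));
        // SetPixel edits only the CPU-side copy; Apply() uploads it to the GPU.
        texture.Apply(updateMipmaps: false);
    }
}
```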
The alpha should be calculated as follows (see the C# sketch after this list):
- 0 when the current time is less than the (intended) change time,
- 1 when the current time is greater than the change time + Appear After (an exposed shader property that sets how long it takes alpha to go from 0 to 1),
- between 0 and 1, somewhat smoothly, while the current time is within the Appear After window.
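Here is a minimal C# mirror of the intended curve, assuming a smoothstep-style ramp (function and parameter names are mine, not from the graph):

```csharp
using UnityEngine;

static class AlphaCurve
{
    // changeTime = blue * 1000 (matching the script's Time.time / 1000
    // encoding); appearAfter = the exposed shader property.
    public static float AlphaFor(float currentTime, float changeTime, float appearAfter)
    {
        // 0 before changeTime, 1 after changeTime + appearAfter, and a
        // smooth 0..1 ramp in between (same shape as HLSL smoothstep).
        float t = Mathf.Clamp01((currentTime - changeTime) / appearAfter);
        return t * t * (3f - 2f * t);
    }
}
```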
This works in the beginning, when the change time is 0. After that it's inconsistent: sometimes it works, sometimes it doesn't at all, and sometimes the appearance takes much longer than Appear After (the 0…1 alpha transition itself lasts Appear After, but the pixel stays at alpha 0 for a while before it starts appearing).
Is something wrong with the graph, or is there an issue with the time it takes the texture update to reach the GPU? Another possibility is that I'm reading the blue channel wrong because of some texture interpolation (though the texture has no mipmaps, is uncompressed, uses point filtering, and its size is a power of two; the r and g channels read back accurately, at least to 0.01 precision).
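For what it's worth, one way I could sanity-check the blue channel is to read it back right after the write (a debugging sketch reusing the hypothetical MarkChanged above; GetPixel reads the CPU-side copy, so this checks the format's precision, not the GPU upload):

```csharp
// Inside MarkChanged, right after SetPixel: any quantization from the
// texture format (e.g. 8-bit channels) shows up as a mismatch here.
Color readBack = texture.GetPixel(x, y);
Debug.Log($"blue stores {readBack.b * 1000f:F3}s, Time.time is {Time.time:F3}s");
```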
Btw, the texture is small, just 2K pixels, and it changes only once in a while.