Fragment fade colour to colour over time

Hi

I am trying to use a render texture to allow painting of a coloured line based on mouse position.
The drawing part is working fine, however I would like the drawing to fade back to the base colour over time, much like a fading trail.

My approach has been to draw the line in one shader, then in a second shader continually push every pixel back towards a solid colour that matches the background.

I thought that in the frag function, returning col * _Color * 0.5, where _Color matches the background colour, would work, but in practice this only ever dims a white line to grey.

Does anyone have a better suggestion for how to achieve this trail effect behind a drawn line? Ultimately I want the trail to fade very slowly, so the rate of fade needs to be configurable.

Thanks in advance.

OK, forget the complexities of the original question. How about this: how do you change a colour value at a rate independent of the framerate?

E.g. if I just modify the main texture colour's r component with += 0.1, it is maxed out within a second.

Let’s say I want to increment the r component so that it takes 1 minute to go from 0 to 1…

As I understand it, the channels do not have the precision to allow very small changes, so e.g. += 0.001 does nothing as it is rounded back to 0 (an 8-bit channel's smallest step is 1/255, roughly 0.004).

Any pointers please?

See Time under Built-in shader variables.

Using _Time just gets me an ever increasing number. I still can’t apply a small enough value to the color component for it to change gradually enough.

I tried something along the lines of

if _Time.y % 10 == 0

to try and get a tick every 10 seconds, but no joy.

If you’re storing a high precision value in a low bit data type, then sure it’ll quantize. Without seeing your code, couldn’t you do something like this;

color = lerp(foregroundColor, backgroundColor, time);

Where *time* is a high precision value stored instead of pixel color?
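As a rough sketch (the names here are illustrative, not from your project):

```
// _FadeT: a high precision elapsed-time value in 0..1, driven from script
// or read from a separate float texture, rather than stored in the colour.
float _FadeT;
fixed4 _BackgroundColor;

fixed4 frag (v2f i) : SV_Target
{
    fixed4 drawn = tex2D(_MainTex, i.uv);
    // Blend from the drawn colour back to the background as _FadeT goes 0 -> 1
    return lerp(drawn, _BackgroundColor, saturate(_FadeT));
}
```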

I can’t Lerp because I need to apply a uniform Color delta to every pixel regardless of its current Color.

I need to always be trying to bring every pixel back to black by reducing each component of each pixel by a steady fixed amount - the effect is exactly what I need, it's just too fast.

If fixed4 Color was high enough precision I could just do col.r -= 0.0001, for example, and that would probably be slow enough. Since it is not high enough precision, I have to instead decrease the value by a larger amount, less frequently, to achieve the same effect.

Seems like a simple thing, apply changes to Color at a slow rate…

What I have at the moment

My thinking is it should decrement by 0.05 every second…

_ChangeTime ("ChangeTime", float) = 0
.
.
fixed4 frag (v2f i) : SV_Target
{
    fixed4 col = tex2D(_MainTex, i.uv);

    bool isTimeToTick = _Time.y - _ChangeTime > 1;
    _ChangeTime = isTimeToTick ? _Time.y : _ChangeTime;

    float decrement = isTimeToTick ? 0.05 : 0;

    col.r -= decrement;

    return col;
}

My thinking was to store an elapsed time in a separate but higher precision texture which you then use to lerp between your start and end colors.

Similarly, try changing fixed4 to half4* and use RGBAHalf* to work directly with higher precision color.

* half should be sufficient, but you could use float for more granularity if required

So along those lines:

float colFloat = EncodeFloatRGBA(col);
colFloat -= 0.001;
col = DecodeFloatRGBA(colFloat);

Doesn’t seem to change ever, whereas

float colFloat = EncodeFloatRGBA(col);
colFloat -= 0.01;
col = DecodeFloatRGBA(colFloat);

Changes way too fast, so I think I still have a precision issue; not sure I am using the Encode/Decode correctly though…

RGBAHalf is a texture format. Change your texture format to something higher precision: RGBAHalf, or if you’re only using the single red channel you could use RHalf to save memory.

You’ll also need to set half types in the shader, including the sampler;
sampler2D_half _MainTex;
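On the script side that might look like this (a sketch; the sizes are arbitrary):

```
// RFloat = single-channel 32-bit float; RHalf (16-bit) is usually enough
// and halves the memory cost.
var rt = new RenderTexture(512, 512, 0, RenderTextureFormat.RFloat);
rt.Create();
```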

So this got me thinking - I just changed the rendertexture format to R32_SFLOAT, and instantly I got higher precision!

Now col.r -= 0.0001 or similar gives me the slow change I need.

Thanks @grizzly for sticking with this, is very much appreciated!

Not a problem!

For completeness, if you’ve not updated the precision in the shader, that’s because you’re targeting desktop, where all types are high precision (32 bit). Be aware if you deploy to mobile, where low precision is used.

Also, if you’re rendering in sync with frame rate then the speed of the animation will vary. You might want to replace the constant with something like this;

color.r -= unity_DeltaTime.x * _Seconds;

This will be desktop only.
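Putting the pieces together, the fade pass might look something like this (a sketch; _FadePerSecond is an illustrative property name, and the render texture is assumed to be a high precision single-channel format):

```
float _FadePerSecond;

fixed4 frag (v2f i) : SV_Target
{
    float col = tex2D(_MainTex, i.uv).r;

    // unity_DeltaTime.x is the frame delta time in seconds, so the fade
    // rate is independent of framerate.
    col = max(0, col - _FadePerSecond * unity_DeltaTime.x);

    return fixed4(col, 0, 0, 1);
}
```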

Fluctuations in animation won’t be a concern, but I appreciate the additional info.