Using Time.deltaTime with Lerp not working as expected

I’ve been trying to debug this lerp code and the only possible cause that I can see is that Time.deltaTime is not returning a correct value.

I have a rain particle system that has an intensity value ranging from 0 (no rain) to 1 (max rain).

LerpRainIntensity is the method I call when I want to transition linearly to a new rain intensity.

But I want the transition rate to be consistent, not affected by the size of the transition: if it takes 4 seconds to go from intensity 0 to 1, it should take 2 seconds to go from intensity 0 to 0.5.

So

    (Mathf.Abs(targetIntensity - startIntensity) * rainTransitionTime)

should be the total time this transition takes (because targetIntensity and startIntensity are clamped between 0 and 1), and my debug statements show that this value is calculated correctly.

Then if Time.deltaTime is the time between frames, dividing it by the total time for the lerp should get me the amount to increment rainLerpT by, right?

But when I run this code with a transition time of 100 seconds, the lerp somehow completes in less than 4 seconds. All of the values that I print out look normal, although Time.deltaTime does seem a tiny bit fast for the frame rate that the Stats in the scene view reports.

Do you see the problem with this? Any help would be greatly appreciated.

    [Range(1.5f, 5f)]
    [Tooltip("Number of seconds it takes to transition from no rain to max rain")]
    [SerializeField]
    private float rainTransitionTime;

    private float targetIntensity;
    private float startIntensity;
    private float rainLerpT = -1;

    public void LerpRainIntensity(float newIntensity)
    {
        newIntensity = Mathf.Clamp(newIntensity, 0f, 1f);

        rainLerpT = 0;
        startIntensity = RainIntensity;
        targetIntensity = newIntensity;
    }

    // Called every Update
    private void LerpRainIntensityUpdate()
    {
        if (rainLerpT < 1f && rainLerpT >= 0)
        {
            rainLerpT += Time.deltaTime / (Mathf.Abs(targetIntensity - startIntensity) * rainTransitionTime);
            RainIntensity = Mathf.Lerp(RainIntensity, targetIntensity, rainLerpT);

        }
    }

From the looks of it, your problem lies in the combination of these two lines:

    rainLerpT += Time.deltaTime / (Mathf.Abs(targetIntensity - startIntensity) * rainTransitionTime);
    RainIntensity = Mathf.Lerp(RainIntensity, targetIntensity, rainLerpT);

Not only are you increasing the t value manually each frame, you are also feeding it into a recursive lerp: the first argument is RainIntensity, the value you already moved last frame, so every frame interpolates from a point that has crept closer to the target. The two effects compound, the intensity races ahead of rainLerpT, and the transition finishes far sooner than rainTransitionTime suggests, so there is no consistent interpolation. Because you want the lerp to take a set amount of time, you want a regular straight interpolation from a fixed start value rather than a recursive one.
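
To see how quickly that compounds, here is a minimal sketch (not your component, just those two lines replayed in a loop) assuming a steady 60 fps, a 0 to 1 transition, and rainTransitionTime = 100:

    float rainLerpT = 0f, rainIntensity = 0f;
    const float dt = 1f / 60f;          // assumed fixed frame time for a steady 60 fps
    const float transitionTime = 100f;  // the value from your test
    int frames = 0;

    // Stop once the intensity is essentially at the target.
    while (rainIntensity < 0.99f)
    {
        rainLerpT += dt / transitionTime;                          // grows linearly, as intended
        rainIntensity = Mathf.Lerp(rainIntensity, 1f, rainLerpT);  // but lerps from the already-moved value
        frames++;
    }

    Debug.Log($"Done after {frames / 60f:F1} s with rainLerpT at only {rainLerpT:F3}");
    // Logs roughly 4 s and rainLerpT ≈ 0.04: the intensity reaches the target long
    // before rainLerpT ever gets close to 1, which matches what you are seeing.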


To do that, all you need to do is pass a constant starting value as the first argument of the lerp (like the startIntensity you already record in LerpRainIntensity):

    rainLerpT += Time.deltaTime / (Mathf.Abs(targetIntensity - startIntensity) * rainTransitionTime);
    RainIntensity = Mathf.Lerp(startIntensity, targetIntensity, rainLerpT);
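
For completeness, here is roughly how the whole update method could look with that change, plus a couple of defensive touches that aren't in your original: clamping rainLerpT before the lerp, and guarding against a zero-length transition (calling LerpRainIntensity with the current intensity would otherwise divide by zero):

    private void LerpRainIntensityUpdate()
    {
        if (rainLerpT < 1f && rainLerpT >= 0f)
        {
            float duration = Mathf.Abs(targetIntensity - startIntensity) * rainTransitionTime;

            // Zero-length transition: snap straight to the target instead of dividing by zero.
            if (duration <= Mathf.Epsilon)
            {
                rainLerpT = 1f;
                RainIntensity = targetIntensity;
                return;
            }

            rainLerpT += Time.deltaTime / duration;

            // Interpolate from the fixed start value so rainLerpT maps linearly onto intensity.
            RainIntensity = Mathf.Lerp(startIntensity, targetIntensity, Mathf.Clamp01(rainLerpT));
        }
    }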