OK, can someone please explain this? I'm having trouble understanding it, and I can't see why my own calculations don't match the result I get in Unity when I run the project. As I read it, linear interpolation between two points A(x0, y0) and B(x1, y1) is based on the formula (y - y0)/(x - x0) = (y1 - y0)/(x1 - x0). I've also found that for single values the Lerp formula is: x0 = from, x1 = to, t = fraction, so the result is x0 + (x1 - x0) * t.
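Just to check that I'm reading the formula right, this is roughly what I assume Mathf.Lerp does internally (ManualLerp is just my own name for it; the clamp is based on the docs saying t is limited to the 0-1 range):

// My re-implementation of what I assume Mathf.Lerp does.
float ManualLerp(float from, float to, float t)
{
    t = Mathf.Clamp01(t);           // keep the fraction between 0 and 1
    return from + (to - from) * t;  // same as x0 + (x1 - x0) * t
}

// e.g. ManualLerp(0f, 90f, 0.5f) should give 45, halfway between 0 and 90.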
So basically I want to Lerp between the values 0 and 90. Based on the Unity documentation on Lerp, and the tutorial examples I've seen, Time.deltaTime is used as t, usually multiplied by some smoothing factor. My question is this:
float resultValue;  // the value being animated
float toValue;      // target value (this was missing a declaration in my first paste)
float smooth = 4f;  // smoothing factor multiplied by Time.deltaTime

void Update()
{
    if (Input.GetKeyDown(KeyCode.A))
    {
        resultValue = 0f;  // restart from 0
        toValue = 90f;     // head toward 90
    }

    // resultValue is fed back in as the "from" value every frame
    resultValue = Mathf.Lerp(resultValue, toValue, smooth * Time.deltaTime);
}
Time.deltaTime, as I understand it, is around ~0.02 sec; multiplied by 4 that's roughly 0.08. Before trying to apply the formula: based on the documentation and some posts I read, the value of t is between 0 and 1. If it's 0.5 it returns the halfway point between the two values, so between 0 and 90 it should return 45. If t = 0.08, is that 8% of the 90? If so, it should be 7.2 for the first iteration of Update(). Instead I get a value of 5.96?!? Next I tried applying the formula by hand and it gives the same answer as my first reasoning.
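Just to show my working, this is the hand calculation I'm doing for the first Update() after pressing A. The 0.02 deltaTime is my assumption; I haven't actually logged the real value on that frame:

// Hand calculation, assuming Time.deltaTime is exactly 0.02 on that frame.
float from = 0f;
float to = 90f;
float t = 4f * 0.02f;                    // smooth * assumed deltaTime = 0.08
float expected = from + (to - from) * t; // 0 + 90 * 0.08 = 7.2

Debug.Log(expected); // I expect 7.2 here, but in the editor I see 5.96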
So how does it actually work? Also, I've noticed there are a few ways of using Lerp(): the one I'm using above; the one where you record a start time and then calculate t from currentValue/maxValue (something like the snippet below); and I think there was another one, but I can't remember it now.
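For reference, this second pattern is what I mean. The names duration and startTime are just my placeholders, and I'm assuming t is meant to run from 0 to 1 over the whole transition:

// Second pattern: t goes from 0 to 1 over a fixed duration.
float startTime;
float duration = 2f; // seconds the whole 0 -> 90 transition should take

void Update()
{
    if (Input.GetKeyDown(KeyCode.A))
    {
        startTime = Time.time; // remember when the transition started
    }

    float t = (Time.time - startTime) / duration; // elapsed / total, so 0..1
    float value = Mathf.Lerp(0f, 90f, t);         // Lerp clamps t, so value stops at 90
}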
Thank you in advance, cheers!!!