I'm trying to zoom in a camera by decreasing its field of view, but to make it smooth I'm decreasing it a little at a time over many steps. Here is my current code:
public static IEnumerator ZoomCamera(Camera cam, float goal, float time)
{
    // Measure how long the whole zoom actually takes.
    var watch = System.Diagnostics.Stopwatch.StartNew();

    float spliter = time / 0.1f; // camera zooms in a bit every 0.1 seconds
    float difference = goal - cam.fieldOfView;

    for (int i = 0; i < spliter; i++)
    {
        yield return new WaitForSeconds(time / spliter);
        cam.fieldOfView += difference / spliter; // step the FOV towards the goal
    }

    watch.Stop();
    float elapsedMs = watch.ElapsedMilliseconds;
    Debug.Log("Seconds Wanted: " + time);
    Debug.Log("Seconds Took: " + elapsedMs / 1000);
}
The Debug.Log statements print out that the seconds taken are more than the time wanted (probably because the code itself takes time to execute). I saw a few code examples that used Mathf.Lerp, but I didn't understand how to implement it.
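For reference, this is roughly the pattern I saw in those examples, reconstructed from memory, so the method name and details are just my guess at how it's meant to work (it updates every frame based on elapsed time instead of waiting in fixed 0.1 second chunks):

using UnityEngine;
using System.Collections;

// My rough understanding of the Lerp approach (not sure this is right):
public static IEnumerator ZoomCameraLerp(Camera cam, float goal, float time)
{
    float start = cam.fieldOfView;
    float elapsed = 0f;

    while (elapsed < time)
    {
        elapsed += Time.deltaTime;               // add the time the last frame took
        float t = Mathf.Clamp01(elapsed / time); // 0 at the start, 1 once 'time' has passed
        cam.fieldOfView = Mathf.Lerp(start, goal, t);
        yield return null;                       // wait one frame
    }

    cam.fieldOfView = goal; // make sure it ends exactly at the goal
}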
Is there a way in which this process could take exactly the amount of time I want it to?
Thanks in advance,