How to zoom a camera in x amount of seconds

I’m trying to zoom in a camera by decreasing its field of view, and to make the zoom smooth I’m decreasing it in many small steps. Here is my current code:

public static IEnumerator ZoomCamera(Camera cam, float goal, float time)
{
    var watch = System.Diagnostics.Stopwatch.StartNew();
    float steps = time / 0.1f;  // the camera zooms a bit every 0.1 seconds
    float difference = goal - cam.fieldOfView;
    for (int i = 0; i < steps; i++)
    {
        yield return new WaitForSeconds(time / steps);  // wait 0.1 seconds
        cam.fieldOfView += difference / steps;          // apply one step of the zoom
    }
    watch.Stop();
    float elapsedMs = watch.ElapsedMilliseconds;
    Debug.Log("Seconds Wanted: " + time);
    Debug.Log("Seconds Took: " + elapsedMs / 1000);
}

The Debug.Log statements show that the zoom takes more seconds than the time I wanted (probably because the code itself takes time to execute). I saw a few code examples that used Mathf.Lerp, but I didn’t understand how to implement it.

Is there a way in which this process could take exactly the amount of time I want it to?

Thanks in advance,

There is no way to get an EXACT timer; the timing depends on the machine the code is running on, among other things. But you can get something very precise:

public IEnumerator ZoomCamera(Camera cam, float goal, float time)
{
    var watch = System.Diagnostics.Stopwatch.StartNew();
    float start = cam.fieldOfView;
    float timer = 0f;
    while (timer < time)
    {
        timer += Time.deltaTime;
        // Interpolate from the starting FOV towards the goal; timer / time
        // goes from 0 to 1 over the requested duration.
        cam.fieldOfView = Mathf.Lerp(start, goal, timer / time);
        yield return null;  // resume on the next frame
    }
    cam.fieldOfView = goal;  // snap to the exact target at the end
    watch.Stop();
    float elapsedMs = watch.ElapsedMilliseconds;
    Debug.Log("Seconds Wanted: " + time);
    Debug.Log("Seconds Took: " + elapsedMs / 1000);
}
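
In case the calling side is unclear: a coroutine has to be started with StartCoroutine from a MonoBehaviour. Here’s a minimal sketch of that, assuming the coroutine lives on the component itself (the CameraZoomer name and the 30/2 values are just placeholders):

using System.Collections;
using UnityEngine;

public class CameraZoomer : MonoBehaviour
{
    void Start()
    {
        // Zoom the main camera down to a 30-degree field of view over 2 seconds.
        StartCoroutine(ZoomCamera(Camera.main, 30f, 2f));
    }

    // Same coroutine as above, minus the Stopwatch instrumentation.
    IEnumerator ZoomCamera(Camera cam, float goal, float time)
    {
        float start = cam.fieldOfView;
        float timer = 0f;
        while (timer < time)
        {
            timer += Time.deltaTime;
            cam.fieldOfView = Mathf.Lerp(start, goal, timer / time);
            yield return null;
        }
        cam.fieldOfView = goal;
    }
}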

So that’s how you can do it using Mathf.Lerp: Mathf.Lerp(start, goal, t) returns the value a fraction t of the way from start to goal (with t clamped between 0 and 1), so feeding it timer / time moves the field of view smoothly over the requested duration. I’ve compared this to your method and measured the time of each one on various occasions (let’s call this one mine and the one you posted yours):

  • FieldOfView from 60 to 30 in 2 seconds: mine got 2.006 average and yours got 2.03 average.
  • FieldOfView from 60 to 30 in 10 seconds: mine got 10.005 average and yours got 10.15 average.
  • FieldOfView from 90 to 30 in 60 seconds: mine got 60.002 average and yours got 60.67 average.

I suspect the problem in your code is the WaitForSeconds instruction: a coroutine only resumes on the first frame after the requested wait has elapsed, so each wait lasts slightly longer than asked, and that overshoot accumulates over all the iterations.
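
If you want to check that suspicion yourself, here’s a small hypothetical test (the WaitTest name is just for illustration) that measures how long a single WaitForSeconds actually takes:

using System.Collections;
using UnityEngine;

public class WaitTest : MonoBehaviour
{
    // Start can itself be a coroutine in Unity.
    IEnumerator Start()
    {
        var watch = System.Diagnostics.Stopwatch.StartNew();
        yield return new WaitForSeconds(0.1f);
        watch.Stop();
        // Usually prints slightly more than 0.1, because the coroutine only
        // resumes on the first frame after the requested time has passed.
        Debug.Log("Asked for 0.1s, got " + watch.ElapsedMilliseconds / 1000f + "s");
    }
}

Your loop repeats that wait time / 0.1 times, so the small overshoots add up, which matches the numbers above.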

I hope this helps!