Hello all,
I made a simple script that adjusts the camera's FOV based on where the lowest object in my scene sits along the Z axis. Here is my code:
private Camera _camera;
private float _fov, _currentVelocity = 0.0f, _smoothTime = 0.3f;
private Vector3 _objPos;
public float padding = 1.0f;

private void Start()
{
    // Cache the Camera component on this GameObject.
    _camera = GetComponent<Camera>();
}

private void Update()
{
    // Smoothly move the current FOV toward the target value each frame.
    _camera.fieldOfView = Mathf.SmoothDamp(_camera.fieldOfView, _fov, ref _currentVelocity, _smoothTime);
}

public void SetFov()
{
    // Angle (in degrees) from the camera to the object's Z position, with padding.
    var theta = Mathf.Atan2(_camera.transform.position.z - _objPos.z - padding, _camera.transform.position.y) * Mathf.Rad2Deg;
    var camRot = _camera.transform.localEulerAngles.x;
    _fov = -2 * (theta - camRot);
}

public void SetObjectPosition(Vector3 pos)
{
    _objPos = pos;
}
So I am trying to calculate the FOV angle from the distance between the camera and the bottom edge of the camera viewport. Here is a diagram to illustrate this:
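For reference, here is the basic relationship I believe I am trying to reproduce, checked outside Unity with plain trigonometry. This is a simplified sketch that ignores the camera's tilt, and `required_fov`, `height`, and `offset` are just illustrative names I made up for this check:

```python
import math

# Simplified model: a camera at `height` units above a flat plane, looking
# straight down. For a point whose horizontal offset from the camera's view
# axis is `offset`, the full field of view needed to keep it just inside
# the frame is:
#   fov = 2 * atan(offset / height)
def required_fov(height, offset):
    return math.degrees(2 * math.atan2(offset, height))

# Camera 10 units up, point 10 units off-axis: half-angle is 45 degrees,
# so the full FOV is 90 degrees.
print(required_fov(10.0, 10.0))  # 90.0
```

My script is essentially this formula plus an extra term for the camera's X rotation, which is where I suspect the mistake is.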
I am sure I made a silly mistake somewhere; I would really appreciate it if someone could point it out for me.
Thanks.
