# Why is my camera shaky in editor but not in compiled project?

I wrote a fairly simple camera script that runs silky smooth in my compiled project but jerks around a lot in the editor, and I can't figure out why. It's a real problem because some computers show it and some don't, and it seems to have nothing to do with how fast the computer is.

It seems to jerk at completely random times, and I can't even see in the profiler what might be causing it.

The thing that seems to be failing is the Vector3.Lerp function. I tried using a smoothed delta time, but that didn't make any difference.

Any help on this will be very appreciated as I am practically tearing my hair out on this one.

The problem is this line:

``````
qCam = q1 * Quaternion.Euler(1, 0, 0);
``````

You're offsetting the camera rotation by 1 degree on x every update, so the camera rotation glitches and then smooths back out.

I'm not really sure why you're doing that there at all. Maybe you meant to do something like:

``````
qCam = Quaternion.Slerp(qCam, target.rotation * Quaternion.Euler(1, 0, 0), Time.deltaTime * 6.0f);
``````

I think you shouldn't use Time.deltaTime for Slerp; use `Time.time * modification` instead.

Time.deltaTime is just the duration of the last frame and will vary. But for Lerp you need a "fraction" that runs continually in one direction.

For example, say you interpolate between two values, let's say 1 and 10, and you have

``````
Quaternion.Slerp(qCam, target.rotation, 0.5f)
``````

Then it would interpolate halfway between 1 and 10, i.e. to 5.5 (1 + 0.5 × 9). With 0.25f it would go a quarter of the way, to 3.25, and so on.

But if you use `Time.deltaTime*6` you will get seemingly random numbers, because deltaTime does not increase continuously but goes up and down from frame to frame. Time.time increases with every frame; it never goes down.

Those "down" jumps should be what causes the camera to act jumpy.
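To see the difference outside Unity, here is the same arithmetic as a small Python sketch (lerp written out by hand; the frame durations in the second loop are invented for illustration):

```python
def lerp(a, b, t):
    """Linear interpolation: returns a when t = 0, b when t = 1."""
    return a + (b - a) * t

# With a monotonically increasing fraction, the result moves
# steadily from 1 toward 10:
for t in [0.0, 0.25, 0.5, 1.0]:
    print(t, lerp(1, 10, t))
# 0.0 -> 1.0, 0.25 -> 3.25, 0.5 -> 5.5, 1.0 -> 10.0

# With something like Time.deltaTime * 6 as the fraction, t jumps
# up and down with the frame rate, so each step jumps around too:
frame_durations = [0.016, 0.033, 0.010, 0.040]  # invented frame times
for dt in frame_durations:
    print(dt, lerp(1, 10, dt * 6))
```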

Edit:
Oh, by the way: use Update() for normal camera movement and LateUpdate() for a following camera, because the camera must be moved after the character (the scene) has been moved/updated.

Edit2:
Of course you can also do the following for the interpolation, if you want it to take 2 seconds for the camera to interpolate from start to end:

``````
float timePassed = 0;
...

void LateUpdate() {
    timePassed += Time.deltaTime;
    qCam = Quaternion.Slerp(qCam, target.rotation, timePassed / 2.0f);
}
``````

So after 0.5 seconds the value will be 0.25, after 1 second 0.5 and after 2 seconds 1.0.
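The same accumulation can be simulated outside Unity. A plain-Python sketch with a made-up fixed frame time of 0.25 s (chosen so the floating-point sums stay exact):

```python
def lerp(a, b, t):
    # Clamp so we stop exactly at the target once t reaches 1.0
    return a + (b - a) * min(t, 1.0)

duration = 2.0      # total interpolation time in seconds
dt = 0.25           # pretend every frame takes 0.25 s
time_passed = 0.0
value = 1.0
while time_passed < duration:
    time_passed += dt
    value = lerp(1, 10, time_passed / duration)

# After 2 simulated seconds the fraction reaches 1.0,
# so value equals the target:
print(time_passed, value)  # 2.0 10.0
```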

Edit3:
Since Rod from the comments doesn't seem to understand the correct usage of the 3rd parameter of the Lerp/Slerp functions, here is an explanation for him:

``````
using UnityEngine;
using System.Collections;

public class LerpTest : MonoBehaviour {
    public Color color1 = Color.red;
    public Color color2 = Color.green;

    // Use this for initialization
    void Start () {
        renderer.material.color = color1;
    }

    // Update is called once per frame (try one of these lines at a time)
    void Update () {
        // This works: it goes from red to green within one second and then stays green
        renderer.material.color = Color.Lerp(color1, color2, Time.time);

        // This won't do anything! It will stay (almost) red
        renderer.material.color = Color.Lerp(color1, color2, Time.deltaTime);

        // This causes flickering, like the jerking Rush3Fan described. In my case it's
        // a color, in his a movement; the effect is the same! That's because deltaTime
        // is usually tiny (a few milliseconds) and varies from frame to frame. On a
        // fast enough machine it stays almost zero, so the value is barely interpolated
        // at all. Multiplying by 20 just makes the jumping more visible.
        renderer.material.color = Color.Lerp(color1, color2, Time.deltaTime * 20);
    }
}
``````

In the end, the correct way of using interpolation is still to set a variable to zero when the interpolation starts and then do:

``````
// If you want the interpolation to take 5 seconds, you DIVIDE by 5, NOT multiply.
// After 5 seconds, myInterpolationValue becomes >= 1.0.
myInterpolationValue += Time.deltaTime / duration;
``````

But most people do this kind of interpolation:

``````transform.position = Vector3.Lerp(transform.position, target.position, 0.5f);
``````

This LOOKS like it works as intended, but it doesn't! It only looks right because the distance between transform.position and target.position shrinks with every update, which is why the motion appears to "smooth" out. You can use it like this, but then the 3rd parameter has to be a constant value between 0.0f (zero) and 1.0f (one!); the lower the value, the longer the object (in this case the camera) takes to reach its destination.
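What a constant third parameter actually does can be shown in a few lines of plain Python (not Unity code): the remaining distance to the target shrinks by the same fraction every frame, so the motion eases out. The values below are powers of two, so they are exact in floating point:

```python
def lerp(a, b, t):
    return a + (b - a) * t

position, target = 0.0, 100.0
for frame in range(5):
    # With a constant fraction of 0.5, each frame covers
    # half of the remaining distance to the target.
    position = lerp(position, target, 0.5)
    print(frame, position)
# 0 50.0
# 1 75.0
# 2 87.5
# 3 93.75
# 4 96.875
```

The position gets ever closer to 100 but only converges asymptotically, which is exactly the ease-out look the constant-fraction call produces.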

That being said, it's not appropriate to downvote other people's answers just because you don't know how Lerp/Slerp works and how it is meant to be used.

I think I might have it figured out. I thought I was supposed to be using LateUpdate(), but putting it in Update() reduced the shaking quite a bit. Not completely, but close enough that people won't get headaches playing my game.

Looks like my camera system is turning out pretty nice. Thanks for all your help, Rod Green!