Experiencing strange problems with Mathf.SmoothDamp function

So, here’s the code I’m using with Mathf.SmoothDamp:

currentAccuracy = Mathf.SmoothDamp(currentAccuracy, Accuracy, vel, RecoveryTime);

“vel” is a float that’s initialized to 0.0.

Basically, it’s supposed to ease currentAccuracy toward Accuracy over RecoveryTime. Or at least that’s what it’s supposed to do.

RecoveryTime grows larger every time you fire, up to a maximum of 1 (i.e. 1 second). I know that part works because it stays at or below 1 in the inspector.

For some reason, that line of code only affects currentAccuracy when currentAccuracy is below 0. It then brings currentAccuracy up to 0 over RecoveryTime (I think it’s over RecoveryTime, but either way it’s not behaving as I expect).
Why isn’t it moving currentAccuracy toward Accuracy, and why does it only act when currentAccuracy is below 0?

When I pass 1 as the smoothTime instead of RecoveryTime, it seems to work without requiring currentAccuracy to be below 0, but it never quite reaches Accuracy…it settles slightly below it for some reason.

Also, while I was fiddling with it, my framerate started spiking: down to 10 FPS on every second frame, then back up to 80 on the next…

Can anyone tell me what I’m doing wrong?


Are you setting vel to 0 just once (or each time you fire), then leaving it alone?

SmoothDamp uses vel as its memory of the previous speed. On each call, SmoothDamp adjusts vel, then, on the next call, uses vel as the “old speed.” If vel is reset to 0 before every call, SmoothDamp never builds up any speed. For fun, make vel public and you should see SmoothDamp changing it.
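A minimal sketch of that fix in Unity C# (the class name and default values here are hypothetical; Accuracy, RecoveryTime, currentAccuracy, and vel are the names from the question — note that in C# the velocity must be passed with ref):

```csharp
using UnityEngine;

public class GunAccuracy : MonoBehaviour
{
    public float Accuracy = 1f;
    public float RecoveryTime = 0.5f;
    public float currentAccuracy;

    // Keep vel as a field so SmoothDamp's velocity survives between frames.
    // Declaring "float vel = 0f;" inside Update would reset it every call,
    // so SmoothDamp could never build up any speed.
    private float vel = 0f;

    void Update()
    {
        currentAccuracy = Mathf.SmoothDamp(currentAccuracy, Accuracy, ref vel, RecoveryTime);
    }
}
```

The only thing you should ever write into vel yourself is an occasional reset to 0 when you deliberately restart the motion (e.g. on firing); between calls, leave it to SmoothDamp.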

So simple, yet so hard to discover…
Thank you very much! The above works.

Best not to reset vel to zero every frame if you are using SmoothDamp; let it keep the value SmoothDamp writes into it.