Changing the Play Rate (Frequency) of a sine wave in real-time

I am trying to create a shader similar to the one used in ABZU to animate fish. So far I have this:

And here is my code:

v.vertex.x += sin((v.vertex.z + _Time.y * _PlayRate) * _Frequency) * _RollAmplitude;
if (v.vertex.z > _MaskOffset)
    v.vertex.x += sin((.05 + _Time.y * _PlayRate) * _Frequency) * _TranslateAmplitude * _MaskOffset;
else
    v.vertex.x += sin((v.vertex.z + _Time.y * _PlayRate) * _Frequency) * _TranslateAmplitude * v.vertex.z;

I have the movement down fine, but I want to be able to change the “Play Rate” of the “animation” the shader is producing in real time, based on the fish’s actual speed. Doing this now creates a jitter, seen here:

This gif doesn’t do it justice. There is much more jerkiness when the play rate variable changes.
I have tried a bunch of different solutions, including using a different variable for time, but I can’t seem to get it right. Many solutions I find are for C# scripts and don’t translate well to shaders, if at all. Is there any way to get a smooth transition in the shader?

this is a common issue.

sin(t * rate) will naturally give you discontinuities when rate changes while t is non-zero, because the phase t * rate jumps instantly instead of changing continuously.
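a quick illustration of the size of that jump (the numbers are made up, but the effect is the same for any non-zero t):

```python
import math

t = 10.0          # hypothetical elapsed time when the rate changes
old_rate = 1.0
new_rate = 1.5    # play rate bumped mid-animation

# the phase fed to sin() jumps by t * (new_rate - old_rate) radians in a single frame
old_phase = t * old_rate   # 10.0
new_phase = t * new_rate   # 15.0
print(new_phase - old_phase)                      # 5.0 radians, instantly
print(math.sin(old_phase), math.sin(new_phase))   # visibly different displacements
```

the bigger t is, the bigger the jump, which is why the jerkiness gets worse the longer the scene runs.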

i’m not sure about doing this in a shader context, but in general code you should instead keep a separate phase variable that increments by dt * rate each frame.

so eg

private float _f = 0f;
....
void UpdateStuff(float dt)
{
    _f += dt * rate;              // advance the phase by rate-scaled delta time
    displacement = Mathf.Sin(_f); // phase is continuous even when rate changes
}
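here’s a minimal sketch of why the accumulator stays smooth — a fake frame loop where the rate suddenly triples after one second (the frame time and rate values are made up):

```python
import math

phase = 0.0
dt = 1.0 / 60.0               # hypothetical fixed frame time
rates = [1.0] * 60 + [3.0] * 60   # play rate tripled after one second

prev_disp = 0.0
max_step = 0.0
for rate in rates:
    phase += dt * rate        # accumulate phase instead of computing t * rate
    disp = math.sin(phase)
    max_step = max(max_step, abs(disp - prev_disp))
    prev_disp = disp

# the per-frame displacement change stays bounded by dt * max_rate,
# even across the rate jump -- no visible pop
print(max_step <= dt * 3.0 + 1e-9)   # True
```

the per-frame change in sin(phase) can never exceed the per-frame change in phase itself, which is at most dt * rate, so the rate can change arbitrarily without any jump in the output.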

apologies for the brevity & formatting, i’m on mobile