I have a bit of a math problem that I’m having a hard time wrapping my head around. I’m using the OnDrag event trigger to move a piece of UI around the screen while the user is dragging a finger. I calculate the pointer position in world space (using cam.ScreenToWorldPoint), subtract the previous pointer position in world space, and then divide that difference by Time.deltaTime to get the object’s velocity on a per-frame basis. I then add this value to the UI element’s existing position to find its new position. (For what it’s worth, I also multiply the velocity by cam.ScreenToWorldPoint(Vector3.right) - cam.ScreenToWorldPoint(Vector3.zero).)
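In case my setup matters, here’s a rough, language-agnostic sketch of the per-frame velocity calculation I described (names like `prev_world_pos` and `dt` are just placeholders, not my actual Unity code):

```python
def drag_velocity(prev_world_pos, curr_world_pos, dt):
    """Finite-difference estimate of drag velocity, in world units per second."""
    return tuple((c - p) / dt for p, c in zip(prev_world_pos, curr_world_pos))

# Example: pointer moved 0.5 world units to the right over a 0.25 s frame,
# so the estimated velocity is 2 units/s along x.
v = drag_velocity((0.0, 0.0), (0.5, 0.0), 0.25)
# v == (2.0, 0.0)
```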
When the OnEndDrag event trigger fires, I know the velocity of the UI element at the moment the user lifted their finger. Now, I’d like to keep the UI element temporarily moving with that same velocity, but slow it down as it moves toward a known position over a known number of seconds. I’m familiar with Lerp, but I don’t think that’s what I want to use, as the change in velocity will not be linear.
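To make the constraints concrete, I think what I’m after is a curve that is pinned at both ends: start position, start velocity, end position, end velocity (zero), and total duration are all known. From what I can tell that looks like a cubic Hermite curve, so here’s a sketch of the math in Python (placeholder names, and I’m genuinely not sure this is the right approach):

```python
def ease_out_with_velocity(p0, p1, v0, T, t):
    """Cubic Hermite curve: starts at p0 moving at v0, arrives at p1 at rest
    after T seconds. Evaluate per axis for 2D/3D positions."""
    s = max(0.0, min(1.0, t / T))   # normalized time in [0, 1]
    h00 = 2*s**3 - 3*s**2 + 1       # Hermite basis: start position
    h10 = s**3 - 2*s**2 + s         # Hermite basis: start tangent
    h01 = -2*s**3 + 3*s**2          # Hermite basis: end position
    # The end-tangent basis (s**3 - s**2) is omitted because the end
    # velocity is zero. The start tangent is v0 scaled by the duration T,
    # since the bases are defined over normalized time.
    return h00*p0 + h10*(v0*T) + h01*p1

# Starts at 0 moving at 5 units/s, lands exactly on 10 when t reaches T = 2 s.
start = ease_out_with_velocity(0.0, 10.0, 5.0, 2.0, 0.0)
end   = ease_out_with_velocity(0.0, 10.0, 5.0, 2.0, 2.0)
# start == 0.0, end == 10.0
```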
Any thoughts on how to get this object to the right place, in the right amount of time, while starting with an initial velocity? Thanks in advance!