I’m trying to slow down a touch drag as the finger gets farther from its start position. Here’s what I currently have, which moves at the same speed no matter what:
using UnityEngine;
using System.Collections;

public class Swipe : MonoBehaviour {

    public float speed = 2f;
    public Vector2 moveAmount;
    public Vector2 startPos;
    public float minSwipeDist = 30f;
    public float comfortZone = 70f;
    public float swipeDirection;

    void Update()
    {
        if( Input.touchCount > 0 )
        {
            // Store GetTouch(0) into "touch" variable...
            Touch touch = Input.touches[0];

            // Store the swipe distance
            float swipeDist = (touch.position - startPos).magnitude;

            switch( touch.phase )
            {
                case TouchPhase.Began:
                    startPos = touch.position;
                    break;

                case TouchPhase.Moved:
                    moveAmount += touch.deltaPosition * speed * Time.deltaTime;
                    **moveAmount *= (minSwipeDist - swipeDist) * 0.01f;**
                    print("Move Amount: " + moveAmount );
                    break;

                case TouchPhase.Stationary:
                    moveAmount = Vector2.zero;
                    break;

                case TouchPhase.Ended:
                    //print("Swipe Distance: " + swipeDist );
                    if( swipeDist > minSwipeDist )
                    {
                        swipeDirection = Mathf.Sign( touch.position.x - startPos.x );
                    }
                    break;
            }
        }
    }
}
The BOLD line in the code above is where I’m stuck. I’m trying to scale the moveAmount Vector2 by the distance from where the swipe started, so that moveAmount shrinks the farther I slide my finger across the screen, slowing down the swipe.
I know I’m missing something silly, but I don’t see it yet…
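The closest I can picture is computing a per-frame falloff factor from the swipe distance instead of multiplying moveAmount by itself every frame, roughly like the Moved case below (slowFactor is just a name I made up, and I’m only guessing that comfortZone is the right cutoff distance), but I’m not sure if that’s the right way to go about it:

                case TouchPhase.Moved:
                    // Guess: falloff is 1 right at startPos and fades to 0 once the
                    // finger is comfortZone pixels away from where the swipe began.
                    float slowFactor = Mathf.Clamp01( 1f - swipeDist / comfortZone );
                    moveAmount += touch.deltaPosition * speed * slowFactor * Time.deltaTime;
                    print("Move Amount: " + moveAmount );
                    break;

The idea would be that movement is at full speed at the start position and tapers off to nothing once the finger is comfortZone pixels away, but maybe there’s a cleaner way?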
Thanks in advance for any help, greatly appreciated as always!