Hi everyone,
So, I’ve got a weird touch detection issue. It’s a bit hard to explain, but I’ll give it a go. I’m trying to detect when a swipe gesture has been made: when the thumb, for example, touches the screen, the touch position is registered, and when it moves, I measure the distance covered within that frame. If the distance is greater than a threshold, it counts as a swipe.
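Here’s a simplified sketch of what my current logic does. I haven’t mentioned my platform because I don’t think it matters much, so this uses the browser TouchEvent API in TypeScript purely for illustration (my real code differs, and `SWIPE_THRESHOLD` is just a made-up name):

```ts
// Sketch of my current per-frame detection (illustrative only).
const SWIPE_THRESHOLD = 20; // px moved between move events (made-up value)

let lastX = 0;
let lastY = 0;

window.addEventListener("touchstart", (e: TouchEvent) => {
  // Register where the thumb first touches down.
  const t = e.touches[0];
  lastX = t.clientX;
  lastY = t.clientY;
});

window.addEventListener("touchmove", (e: TouchEvent) => {
  // Distance moved since the last event; if it exceeds the
  // threshold, I currently treat it as a swipe.
  const t = e.touches[0];
  const dx = t.clientX - lastX;
  const dy = t.clientY - lastY;
  if (Math.hypot(dx, dy) > SWIPE_THRESHOLD) {
    console.log("swipe detected");
  }
  lastX = t.clientX;
  lastY = t.clientY;
});
```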
The weird issue is this: when the thumb touches the screen, it’s normally the tip that makes contact first. If you then lower the rest of the thumb onto the screen, to the point where the middle joint is also touching, the reported touch position shifts, and my detection tells me the thumb has moved (see the "illustration" below).
O——O>
Each O is a touch point: the reported position kind of shifts as your thumb fumbles around on the spot.
What I want is to implement it in a way that distinguishes a deliberate swipe from a thumb shifting around on the spot.
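To make it concrete, this is roughly the behaviour I’m after, though I’m not sure it’s the right approach: measure total displacement from the original touch-down point rather than per-frame deltas, and only call it a swipe if the thumb covers enough distance quickly (hypothetical numbers, same illustrative browser API as above):

```ts
// Sketch of the behaviour I want (thresholds are made up):
// slow on-the-spot drift should never qualify as a swipe.
const MIN_SWIPE_DISTANCE = 80; // px from the touch-down point
const MAX_SWIPE_TIME = 300;    // ms; slower movement shouldn't count

let startX = 0;
let startY = 0;
let startTime = 0;

window.addEventListener("touchstart", (e: TouchEvent) => {
  const t = e.touches[0];
  startX = t.clientX;
  startY = t.clientY;
  startTime = performance.now();
});

window.addEventListener("touchend", (e: TouchEvent) => {
  // The lifted finger is in changedTouches, not touches.
  const t = e.changedTouches[0];
  const dist = Math.hypot(t.clientX - startX, t.clientY - startY);
  const elapsed = performance.now() - startTime;
  if (dist > MIN_SWIPE_DISTANCE && elapsed < MAX_SWIPE_TIME) {
    console.log("swipe");       // fast, long movement: a real swipe
  } else {
    console.log("not a swipe"); // small or slow drift: thumb settling
  }
});
```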
Any suggestions? Thanks!!!