Comparing Saved Data and Live Data

Hello,

I have some coordinates saved in an XML file (x, y, z), and I want to be able to compare them against live coordinates within Unity, with some give-or-take (threshold) slack.

Basically, I’m teaching Unity to recognize hand gestures from an accelerometer. The gesture data is pre-saved in an XML file. Now I want to continuously compare these XML coordinates against the live accelerometer data.

Now, obviously a live hand gesture will never exactly match the pre-saved gestures, hence the ‘give or take’ / threshold slack.

When the XML data and the accelerometer data match up, a simple print(“Gestures Match”) would be fine.

//The arrays of loaded XML coordinates for the predefined gesture
private var recordedX : float[];
private var recordedY : float[];
private var recordedZ : float[];

//Live accelerometer data being fed into Unity (floats, since
//accelerometer readings are fractional)
var X : float = 0.0;
var Y : float = 0.0;
var Z : float = 0.0;
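
Something along these lines is what I have in mind (a sketch only; the threshold value and the idea of buffering the live readings into arrays the same length as the recorded data are just my assumptions):

var threshold : float = 0.2; //example slack value, to be tuned

//Returns true when every buffered live sample sits within the
//threshold of the corresponding recorded sample
function GestureMatches(liveX : float[], liveY : float[], liveZ : float[]) : boolean {
    for (var i = 0; i < recordedX.Length; i++) {
        //Fail as soon as any sample drifts outside the allowed slack
        if (Mathf.Abs(liveX[i] - recordedX[i]) > threshold ||
            Mathf.Abs(liveY[i] - recordedY[i]) > threshold ||
            Mathf.Abs(liveZ[i] - recordedZ[i]) > threshold)
            return false;
    }
    return true;
}

//Once a full window of live samples has been buffered:
//if (GestureMatches(bufferedX, bufferedY, bufferedZ)) print("Gestures Match");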

---- Just To Be Clear ----

[Attached diagram 9533-data_xml.png: live accelerometer values plotted against the predefined XML gesture values]

So in this diagram I’m only using X and Y as an example, but as you can see, the blue and red lines are live values from the accelerometer. The red values are inconsistent, so they are simply ignored, whereas the blue values, although not exactly spot on, fall within the threshold of the predefined XML gesture values and so trigger the print.

How would one achieve this? What is this structure called? Is it just a bunch of for and if statements? Or does it require some seriously complex algorithms?

Thanks!

If these were like handwritten gestures, you’d do a lot of normalization of the sizes of each sampled movement, storing the angle and proportional length between each of the sample points. You’d then score the match against the incoming data, sort the gestures in order, pick the best, and reject even that if it scored below some minimum good level (a scoring sketch follows the list below).

By normalizing, you make the movement the important thing, not the size of the movement. Though clearly it’s easier for the player to reproduce gestures slower and bigger.

To normalize (a sketch follows the list):

  • Take a number of points sampled throughout the gesture, at constant time intervals, to avoid bunching.
  • Find the longest movement in the sample set and divide every other movement by it.
  • For angles, take the angle between the previous sample’s movement vector and the new sample’s movement vector, and again scale it to 0…1.
  • Then you can choose which matters more by weighting the length or the angle by some factor in your scoring, or leave them the same for equal weighting.
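
Here’s a rough sketch of those normalization steps in UnityScript. The function name, the use of Vector3 for the samples, and the assumption that the caller pre-allocates the output arrays are all illustrative choices, not a fixed recipe:

//Sketch: turn a resampled gesture (points captured at constant time
//intervals) into normalized segment lengths (0…1) and turn angles (0…1).
//Assumes the caller allocates lengths with points.Length - 1 entries
//and angles with points.Length - 2 entries.
function Normalize(points : Vector3[], lengths : float[], angles : float[]) {
    var i : int;
    var longest : float = 0.0;

    //Proportional lengths: measure each movement, find the longest,
    //then divide everything by it so only the shape matters
    for (i = 1; i < points.Length; i++) {
        lengths[i - 1] = Vector3.Distance(points[i - 1], points[i]);
        if (lengths[i - 1] > longest) longest = lengths[i - 1];
    }
    if (longest > 0.0)
        for (i = 0; i < lengths.Length; i++) lengths[i] /= longest;

    //Angles: the turn between consecutive movement vectors, scaled
    //from degrees (0…180) down to 0…1
    for (i = 1; i < points.Length - 1; i++) {
        var prev : Vector3 = points[i] - points[i - 1];
        var next : Vector3 = points[i + 1] - points[i];
        angles[i - 1] = Vector3.Angle(prev, next) / 180.0;
    }
}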
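
And a possible scoring function over those normalized arrays, where the weights (placeholder values here) decide whether proportional length or angle counts more:

var lengthWeight : float = 1.0; //raise to make proportional lengths count more
var angleWeight : float = 1.0;  //raise to make the angles count more

//Lower score = closer match between a recorded gesture and the live one
function Score(recLengths : float[], recAngles : float[],
               liveLengths : float[], liveAngles : float[]) : float {
    var i : int;
    var total : float = 0.0;
    for (i = 0; i < recLengths.Length; i++)
        total += lengthWeight * Mathf.Abs(recLengths[i] - liveLengths[i]);
    for (i = 0; i < recAngles.Length; i++)
        total += angleWeight * Mathf.Abs(recAngles[i] - liveAngles[i]);
    return total;
}

You’d run Score against every recorded gesture, sort the results, pick the lowest, and still reject it if it sits above some minimum-quality cutoff.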