
Our 8th-semester project: the Sorcerer Simulator.
We used the Microsoft Kinect in tandem with the SoftKinetic Iisu plugin for quick integration with the Unity game engine. We then wrote our own custom Naive Bayes classifier (a C++ plugin that handles gesture recognition) and compared it with the parametric Iisu Interaction Designer package. As with most of our projects, this is a prototype that only runs with a licensed SoftKinetic Iisu plugin (until Microsoft releases their SDK; things might change at that point).
Authors are students of Medialogy at Aalborg University, Denmark.
Hope you enjoy =)
Greatz.
Nice job. =)
How do you start with gesture recognition? Neural networks? Or just a plugin?
Do you use OpenNI?
Can you give me a tip?
Well yes, “just a plugin”, but it's a custom C++ plugin we wrote ourselves. We obtain the difference in position of both hands over a set of frames; that is our training data for the plugin. So basically what we feed the gesture recognizer (or classifier) is a set of small displacement vectors over a period of about 20 frames, and this is fed to the classifier continuously. Neural networks seemed to be overkill for a set of gestures this small, so we used the Naive Bayes classifier and wrote a trainer for it as well. We do not use OpenNI. We found something called SoftKinetic Iisu, which could deliver tracking data to us very fast. This was before Microsoft released the SDK, so if I were you I would try using that instead. Good luck =)
hi,
I would just like to know how you get the video stream from the Kinect and place it in your scene. Thanks for your answer.
Hi Teddosh49
We don't do that in this application. If you're referring to the player on the left side of the video, that is only a video editing trick. I don't have any experience with displaying the video stream in-game, sorry. I'm sure others have tried this, search around a bit =)