We are experimenting a bit more with FaceAPI and Unity3D. This time we are attempting to drive position, rotation, mouth and eyebrow gestures from a webcam. It was a quick mock-up, but it looks promising. Now we need to do a lot of work defining bones to match the tracking points that FaceAPI provides, writing a smoothing script (see the sketch below), and so on.
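To give a rough idea of the kind of smoothing script we mean, here is a minimal sketch using an exponential moving average over the raw head pose, assuming the tracking bridge hands us a position and rotation each frame. The class and field names are just illustrative, not part of FaceAPI:

using UnityEngine;

// Minimal smoothing sketch: exponential moving average over the raw
// head pose. Push() would be fed each frame with whatever values the
// FaceAPI bridge delivers (names here are assumptions, not the real API).
public class HeadPoseSmoother : MonoBehaviour
{
    [Range(0f, 1f)]
    public float smoothing = 0.15f; // lower = smoother but laggier

    private Vector3 smoothedPosition;
    private Quaternion smoothedRotation = Quaternion.identity;

    // Call once per frame with the latest raw tracking sample.
    public void Push(Vector3 rawPosition, Quaternion rawRotation)
    {
        smoothedPosition = Vector3.Lerp(smoothedPosition, rawPosition, smoothing);
        smoothedRotation = Quaternion.Slerp(smoothedRotation, rawRotation, smoothing);
        transform.localPosition = smoothedPosition;
        transform.localRotation = smoothedRotation;
    }
}

The same idea extends to the mouth and eyebrow values: keep one smoothed scalar per tracking point and lerp toward the raw sample each frame.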
Neat. I’ve been messing about with it and it’s surprisingly accurate. Not as good as an IR-light solution, but, all things considered, it’s still very usable.
I read on their site that they are working on a C# interface, but I thought it wasn’t released yet? How are you interfacing with it? I’m just going through a fake joystick to do some tests.
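In case it helps anyone while the official wrapper is unreleased, a hand-rolled bridge could P/Invoke into the native FaceAPI DLL from C#. The DLL name, entry point and struct layout below are pure guesses for illustration, not the real FaceAPI exports; check the SDK headers for the actual ones:

using System.Runtime.InteropServices;

// Hedged sketch of a hand-rolled C#-to-native bridge via P/Invoke.
// Everything named here is a placeholder, not the real FaceAPI API.
static class FaceApiBridge
{
    [StructLayout(LayoutKind.Sequential)]
    public struct HeadPose
    {
        public float x, y, z;    // position (assumed layout)
        public float rx, ry, rz; // rotation (assumed layout)
    }

    // Hypothetical DLL and export name; substitute the real ones
    // from the FaceAPI SDK documentation.
    [DllImport("faceapi.dll", EntryPoint = "GetHeadPose")]
    public static extern int GetHeadPose(out HeadPose pose);
}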