After the first demo showing webcam-based motion capture applied to Lerpz (http://www.youtube.com/watch?v=sK6DYjRERtU), a new version has been created that now works with Unity 2.6.
This time head translation and rotation are streamed into Unity in order to create new ways of interaction. These videos show how it works:
http://www.youtube.com/watch?v=98586S9hcDc and http://www.youtube.com/watch?v=N87c5wxD9EQ.
I hope you enjoy.
More information is available at www.visionblaster.com.
Looking good and smooth!
I’d like to give this a try.
There’s always the question of how useful this way of interaction is in the end. Since you’re moving your head away from the screen, the whole idea of turning becomes a bit awkward in my opinion.
I’ve thought about ways of using this since I first saw the Johnny Lee videos about Wii Remote and webcam interaction.
So far I think the only way that’s really appealing to me is creating an illusion of looking at a window rather than a computer screen.
So in 3rd person view for example, when you move your head (in the real world) in front of the screen, the camera angle towards your character changes accordingly, giving the impression of looking through a window into the game world.
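The head-coupled “window” idea can be sketched in a few lines. This is a minimal illustration, not code from the poster’s Gamestudio project: it assumes the tracker gives a head position normalized to [-1, 1] on both axes, and the `strength` constant is an arbitrary placeholder.

```python
# Sketch of the "window" effect: map the viewer's head position (from a
# webcam tracker, normalized to [-1, 1] per axis) to a camera offset so
# the screen behaves like a window into the scene.

def window_camera_offset(head_x, head_y, strength=0.5):
    """Return (cam_x, cam_y): the camera slides opposite the head,
    so the scene appears fixed behind the 'window' (the screen)."""
    return (-head_x * strength, -head_y * strength)

# Head moves right -> camera slides left, revealing more of the
# scene's right-hand side, just like looking through a real window.
offset = window_camera_offset(1.0, 0.0)
```

In a real implementation the offset would drive the camera transform each frame, ideally combined with an off-axis (asymmetric) projection matrix so the window edges stay geometrically correct.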
I played with this about a year ago in Gamestudio and was quite pleased with the result. But so far I haven’t tried to use it in a game yet. I’d like to give it a try in Unity one day.
Here’s what I came up with. (I’m only detecting two white spots in front of the camera and calculating the position of the source that way)
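The two-white-spot approach described above can be approximated like this. The calibration constants are assumptions for illustration (the poster doesn’t give numbers): lateral position comes from the midpoint of the two spots, and depth from their apparent separation, which shrinks as the head moves away.

```python
import math

# Assumed calibration: at 60 cm from the camera, the two markers
# appear 200 px apart. These values are illustrative only.
REF_SEPARATION_PX = 200.0
REF_DISTANCE_CM = 60.0

def head_from_spots(p1, p2):
    """p1, p2: (x, y) pixel positions of the two tracked white spots.
    Returns (mid_x, mid_y, distance_cm) as a rough head-pose estimate."""
    mid_x = (p1[0] + p2[0]) / 2.0
    mid_y = (p1[1] + p2[1]) / 2.0
    separation = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    # Separation is inversely proportional to distance (pinhole model).
    distance = REF_DISTANCE_CM * REF_SEPARATION_PX / separation
    return mid_x, mid_y, distance

# Spots half as far apart as at calibration -> head twice as far away.
x, y, d = head_from_spots((270, 240), (370, 240))
```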
Pretty impressive how smooth the face tracking is.
+1 on this. I have no idea why this isn’t more popular yet. I’ve owned the shutter glasses, and the effect just sucks and makes me dizzy. I keep thinking there must be some similar negative that prevents this effect from catching on.
How to walk?
I owned those glasses too and was totally amazed by the effect, although it was useless at the time: not good enough for games, and no movies to watch in 3D.
From what I’ve read the new glasses are supposed to be much better but I’m going to wait for polarized screens, that seems to be the way of the future (and hopefully is going to be supported properly).
As for the ‘window effect’: I don’t really see why it wouldn’t work, other than that supporting every possible webcam and PC hardware combination is probably difficult. (Xbox and Natal are going to change that, I guess.)
The effect is also a bit awkward because it’s still a flat screen. I tried red/blue 3d-glasses on it though and that really makes it believable.
So combining a webcam based window effect and 3D glasses could make this a lot more popular!
[quote=“”]
How to walk?
[/quote] Probably by using the arrow keys.
Surprised you didn’t go for a Johnny Lee approach and use it for a more realistic 3D perspective… are you wearing glasses that aid the tracking, or is it just tracking your face by contrast?
Wow, that is exactly what I was thinking!
I had an idea that kinda seems a little random. But what if you used that system in an FPS: have the mouse control the rotation of the character, make the camera a child object of the character, and attach that system’s script to the camera. That way the camera rotates in its place like you’re moving your head, while the mouse moves your torso or arms! It would need extensive testing, and I would love a copy of that script so I could try it out. It looks great!
Moving a character’s head (with real player head movement) independently from the body (which is using Mouse/keyboard movement) can work really well.
See this video for a great example:
I’d love to see TrackIR support in Unity.
That is really quite impressive.
The head movement is very smooth which I think is the most important factor.
If you can’t see enemies or use in-game displays because it’s all shaky, then the whole thing would be pointless.
I still don’t like the idea that I’m turning my head (to look to the side) but my eyes are still focused to the front.
But it sure is the best solution for what’s possible now.
I’d prefer the faceAPI solution for Unity, though, because all it takes is a webcam and no other device or thing to put on your head.
You should post the code for free since you are not allowed to charge for it according to the API documentation.
Actually, the code for streaming the data is open source:
http://code.google.com/p/6dofstreamer/
You can download and change the code as you want.
Visionblaster only charges for the UDP-sockets Unity3D integration. The streaming itself can be done with different software, such as EHCI, for instance.
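A receiver for that kind of UDP stream can be sketched in a few lines. The packet layout below (six little-endian 32-bit floats: x, y, z, yaw, pitch, roll) is an assumption for illustration only; check the actual 6dofstreamer source for the real format and port.

```python
import socket
import struct

def parse_pose(packet):
    """Unpack a 24-byte packet into a 6-DOF tuple
    (x, y, z, yaw, pitch, roll). Format is assumed, not confirmed."""
    return struct.unpack("<6f", packet)

def receive_pose(port=5555, timeout=1.0):
    """Block until one pose packet arrives on localhost, then return it."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    sock.settimeout(timeout)
    try:
        packet, _addr = sock.recvfrom(1024)
        return parse_pose(packet)
    finally:
        sock.close()

# A locally crafted packet round-trips through parse_pose.
sample = struct.pack("<6f", 0.0, 1.5, -2.0, 10.0, 5.0, 0.0)
pose = parse_pose(sample)
```

On the Unity side, the same unpacking would be done in a script each frame and the resulting values applied to a camera or bone transform.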
Kind regards,
Daniel
Hmmm… you implemented faceAPI for head tracking and translate that to movement in 3D space… could it be used for a kind of AR app where you stick 3D objects to the face, like a 3D hat or something? I suppose that if you get 3D info from this, it could be done?
Would this be hard to implement??
How do I do this? I don’t have any idea about it, but I want to do it with your help… could you tell me the step-by-step process and what requirements are needed?
Please help me…
Hey Danny,
do you have this working with the front-facing camera on iOS? I’m looking at creating something like HoloToy and could do with a little help.
Please get in touch
warren@mocapone.com
Did you get this working on iOS with the front-facing camera?