Hi all,
A Japanese dev team created an add-on for the Oculus Rift called OVRvision, which mounts two webcams on the front of the Rift. This opens up a cool opportunity to combine virtual reality with live video. http://ovrvision.com/
How could one overlay a live video feed from the OVRvision onto a game environment? I'm hoping to use the Oculus's new positional tracking system in tandem with the Sixense and Dexmo body- and hand-motion-tracking devices.
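At its simplest, the core idea is: grab a frame from each of the two cameras every tick and compose them side by side, one half per eye, which is the layout the Rift's single display expects; a game engine would then draw this stereo image as a background layer behind (or blended with) the rendered scene. Here's a minimal sketch of just the compositing step, using NumPy arrays as stand-ins for camera frames (in a real pipeline the frames would come from something like OpenCV's `cv2.VideoCapture`, and the frame sizes below are assumptions, not OVRvision's actual resolution):

```python
import numpy as np

def compose_stereo(left, right):
    """Place the left and right camera frames side by side,
    producing the one-half-per-eye layout the Rift display uses."""
    assert left.shape == right.shape, "both cameras must deliver same-size frames"
    return np.hstack([left, right])

# Stand-in frames; in practice these would be read each tick from
# two capture devices, e.g. cv2.VideoCapture(0) and cv2.VideoCapture(1).
h, w = 480, 640
left = np.zeros((h, w, 3), dtype=np.uint8)    # black test frame
right = np.full((h, w, 3), 255, dtype=np.uint8)  # white test frame

frame = compose_stereo(left, right)
print(frame.shape)  # (480, 1280, 3): one wide stereo image
```

The hard parts this sketch skips are exactly the difficulties you'd face in practice: doing this 60+ times per second with low latency, correcting for camera lens distortion, and compositing the result with the 3D scene inside the engine rather than in a standalone script.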
Besides the basics of a web stack, I don't have any coding knowledge, but I want to fundamentally understand the basic technology behind (and the difficulties of) an experiment like this. Any help is appreciated!