How to integrate Unity with live video using OVRvision

Hi all,

A Japanese dev team created an add-on for the Oculus Rift called OVRvision, which mounts two webcams on the front of the headset. This creates a cool opportunity to combine virtual reality with live video.

How could one overlay the live video feed from the OVRvision onto a game environment? I'm hoping to use the Oculus's new positional tracking system in tandem with Sixense and Dexmo body- and hand-motion tracking devices.

Beyond the basics of a web stack, I don't have any coding experience, but I want to fundamentally understand the technology behind (and the difficulties of) an experiment like this. Any help is appreciated!

Use WebCamTexture and assign each camera's WebCamTexture to a quad's renderer.material. Child each quad to its respective game camera, and that will do what you need.
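
For reference, here's a rough sketch of that setup as a Unity C# script. It assumes the two OVRvision cameras show up as the first two entries in `WebCamTexture.devices` (the actual device names may differ on your machine), and that you've already created two quads in the editor and childed each one to its eye camera:

```csharp
using UnityEngine;

public class StereoWebcamFeed : MonoBehaviour
{
    // Assign these in the Inspector: each quad should be a child
    // of its respective eye camera so it stays fixed in view.
    public Renderer leftQuad;
    public Renderer rightQuad;

    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        if (devices.Length < 2)
        {
            Debug.LogWarning("Expected two cameras for a stereo feed.");
            return;
        }

        // One WebCamTexture per physical camera.
        // (Assumption: device order matches left/right.)
        var leftTex = new WebCamTexture(devices[0].name);
        var rightTex = new WebCamTexture(devices[1].name);

        // Put each live feed on its quad's material.
        leftQuad.material.mainTexture = leftTex;
        rightQuad.material.mainTexture = rightTex;

        // Start streaming from both cameras.
        leftTex.Play();
        rightTex.Play();
    }
}
```

Because each quad is parented to its eye camera, head rotation and positional tracking come for free: the video plane moves with the camera, so the feed always fills the same portion of each eye's view.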