Making Virtual Reality in Unity with the Kinect.

I’m trying to recreate this project: http://www.youtube.com/watch?v=Jd3-eiid-Uw
Instead of Wii controls I’m using the Kinect.
The location of my head has the same coordinates as the camera in Unity, but I don’t get how he keeps the corners of his box aligned with the corners of the TV.

This is also a good example: http://www.youtube.com/watch?v=2MX1RinEXUM
So my question is: what does he do with the camera? I know he moves it according to the movement of the user’s head, but I’m sure he also does something else. What is it? Something with the field of view, maybe?

I think they’re just moving the camera around. The Wii remote one at least is rather limited; if you look closely you’ll notice that all the objects are perpendicular to the screen. Getting a perfect 1-to-1 correlation would require more complex rendering, I think. For example, say you’re looking at the monitor at a 45-degree angle. You want to create the illusion that the monitor is a window into your game world, so the camera would need to move to your position in the game (45 degrees off to the side). The only problem is that it’s still rendering to a plane perpendicular to itself instead of to where the screen would be if it were actually a window. I wouldn’t think that Unity can do this. If you’re mainly going to be looking at the screen straight on then this may not be a problem, but from steeper angles I think it would look… off. Does this make sense to anyone? It’s kind of hard to explain.
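Roughly what I mean by “just moving the camera around”, as a sketch (screenCenter and headPosition are just placeholder names; the head position would come from whatever tracker you’re using):

using UnityEngine;

// Sketch of the simple approach: move the camera to match the head and keep it
// pointed at the screen. "headPosition" is assumed to be filled in each frame
// by your tracking code, relative to the centre of the screen.
public class HeadCoupledCamera : MonoBehaviour
{
    public Transform screenCenter;   // an empty object placed where the monitor sits in the scene
    public Vector3 headPosition;     // head offset from the screen centre, in world units

    void LateUpdate()
    {
        // Put the camera where the head is...
        transform.position = screenCenter.position + headPosition;
        // ...and keep it aimed at the screen. The view plane stays perpendicular
        // to the camera rather than glued to the screen, which is the limitation above.
        transform.LookAt(screenCenter);
    }
}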

Anyway, you can also look into TrackIR and FreeTrack; they do the same thing as in those videos, but are designed with games in mind.

Yeah, I fully understand, and I copied his creation almost exactly. Instead of using a Wii remote I used the Kinect, which tracks my head in 3D space. So I rebuilt the real-life space in Unity, and in my understanding it should be exactly the same, but it isn’t. My field of view is off or something, I don’t know. I could film it if you’re interested in seeing my problem?

I believe the source code for the Wii remote example is available, isn’t it? I think I remember seeing it on the guy’s website. I didn’t look at it, though, and this was over a year ago. But again, his example is simple… though effective.

I think I understand your problem, and it goes back to what I was saying before. If you imagine the monitor as a virtual window that the camera is rendering to, then it makes sense that as you move around the camera moves to match… BUT… the “window” that the camera renders to should not move. In Unity, though, that window moves with the camera: it sits a fixed distance out, perpendicular to the direction the camera is facing. You might be able to fake it by adjusting the FOV based on your distance from the screen. The equation shouldn’t be too difficult.
Off the top of my head… fov = 2 * atan((H / 2) / D)
where D is the distance to the screen and H is the height of your monitor (Unity’s field of view is vertical, so use height rather than width).
I’d start playing around with variations of that and see if it looks any better. The units, I would think, should be whatever matches your game world.
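In Unity that could look something like this, as a rough sketch (I’m assuming the head-to-screen distance comes from your Kinect tracking, and that the screen is modelled as 1 unit high in your scene like in his demo; HeadCoupledFov and headDistance are just names I made up):

using UnityEngine;

// Rough sketch: drive the camera's vertical FOV from the head-to-screen distance.
// "headDistance" is assumed to be updated each frame by your Kinect tracking code,
// in the same units as "screenHeight" (e.g. screen height = 1 world unit).
public class HeadCoupledFov : MonoBehaviour
{
    public float screenHeight = 1f;  // height of the virtual "window" in world units
    public float headDistance = 2f;  // head-to-screen distance in the same units

    void LateUpdate()
    {
        // Unity's Camera.fieldOfView is the vertical FOV in degrees.
        float fovRadians = 2f * Mathf.Atan((screenHeight * 0.5f) / headDistance);
        GetComponent<Camera>().fieldOfView = fovRadians * Mathf.Rad2Deg;
    }
}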

It still doesn’t address my previous comments but it’s something to try at least.

Thanks for thinking with me on this :slight_smile:
I’ll try the given formula tomorrow, and let you know if it works.

DEVELOPER NOTES

The interesting part of this code is the calculation of the off-center projection.

// When space is pressed, the camera angle is calculated in OnKeyPress ------------------------
if ((int)(byte)e.KeyCode == (int)Keys.Space)
{
    // zeros the head position and computes the camera tilt
    double angle = Math.Acos(.5 / headDist) - Math.PI / 2; // angle of head to screen
    if (!cameraIsAboveScreen)
        angle = -angle;
    cameraVerticaleAngle = (float)(angle - relativeVerticalAngle); // absolute camera angle
}

// Here all the head parameters are calculated in ParseWiimoteData() ------------------------------
float dx = firstPoint.x - secondPoint.x;
float dy = firstPoint.y - secondPoint.y;
float pointDist = (float)Math.Sqrt(dx * dx + dy * dy);

float angle = radiansPerPixel * pointDist / 2;
// in units of screen height, since the box is a unit cube and the box height is 1
headDist = movementScaling * (float)((dotDistanceInMM / 2) / Math.Tan(angle)) / screenHeightinMM;

float avgX = (firstPoint.x + secondPoint.x) / 2.0f;
float avgY = (firstPoint.y + secondPoint.y) / 2.0f;

headX = (float)(movementScaling * Math.Sin(radiansPerPixel * (avgX - 512)) * headDist);

relativeVerticalAngle = (avgY - 384) * radiansPerPixel; // relative angle to camera axis

if (cameraIsAboveScreen)
    headY = .5f + (float)(movementScaling * Math.Sin(relativeVerticalAngle + cameraVerticaleAngle) * headDist);
else
    headY = -.5f + (float)(movementScaling * Math.Sin(relativeVerticalAngle + cameraVerticaleAngle) * headDist);

// Here is the projection in SetupMatrices() ------------------------------------------
float nearPlane = .05f;
device.Transform.Projection = Matrix.PerspectiveOffCenterLH(
    nearPlane * (-.5f * screenAspect + headX) / headDist,
    nearPlane * ( .5f * screenAspect + headX) / headDist,
    nearPlane * (-.5f - headY) / headDist,
    nearPlane * ( .5f - headY) / headDist,
    nearPlane, 100);

I found this on Johnny Chung Lee’s website, but it doesn’t make much sense to me yet :confused:
Gonna look at it tomorrow, but if someone could elaborate already, be my guest :slight_smile:

I started searching for off-center projection and I found this video: http://www.youtube.com/watch?v=gz84_A-TpLg
I also found the Camera.projectionMatrix function.

I don’t know if this is what I’m looking for, though.

Does someone know how to create something like the video above? Do I need the Camera.projectionMatrix function?
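Camera.projectionMatrix looks like the right track to me. Here’s a rough, untested sketch of how the frustum values from Johnny Lee’s snippet above could be plugged into Unity. headX, headY and headDist are assumed to come from your Kinect tracking, in the same screen-relative units his code uses (screen height = 1, origin at the centre of the screen), and you may need to flip a sign or two since his code targets left-handed DirectX:

using UnityEngine;

// Sketch of an off-centre projection in Unity, combining the PerspectiveOffCenter
// helper from the Camera.projectionMatrix scripting reference with the frustum
// extents from Johnny Lee's code above. Head values are assumed to be supplied
// each frame by your own Kinect tracking.
public class OffCenterProjection : MonoBehaviour
{
    public float headX;              // head position relative to screen centre (in screen heights)
    public float headY;
    public float headDist = 2f;      // head-to-screen distance (in screen heights)
    public float nearPlane = 0.05f;
    public float farPlane = 100f;

    void LateUpdate()
    {
        Camera cam = GetComponent<Camera>();
        float aspect = cam.aspect;

        // Same frustum extents as the PerspectiveOffCenterLH call above.
        float left   = nearPlane * (-0.5f * aspect + headX) / headDist;
        float right  = nearPlane * ( 0.5f * aspect + headX) / headDist;
        float bottom = nearPlane * (-0.5f - headY) / headDist;
        float top    = nearPlane * ( 0.5f - headY) / headDist;

        cam.projectionMatrix = PerspectiveOffCenter(left, right, bottom, top, nearPlane, farPlane);
    }

    // Off-centre perspective matrix, as given in the Unity docs for Camera.projectionMatrix.
    static Matrix4x4 PerspectiveOffCenter(float left, float right, float bottom, float top, float near, float far)
    {
        float x = 2.0f * near / (right - left);
        float y = 2.0f * near / (top - bottom);
        float a = (right + left) / (right - left);
        float b = (top + bottom) / (top - bottom);
        float c = -(far + near) / (far - near);
        float d = -(2.0f * far * near) / (far - near);

        Matrix4x4 m = new Matrix4x4();
        m[0, 0] = x;  m[0, 1] = 0f; m[0, 2] = a;   m[0, 3] = 0f;
        m[1, 0] = 0f; m[1, 1] = y;  m[1, 2] = b;   m[1, 3] = 0f;
        m[2, 0] = 0f; m[2, 1] = 0f; m[2, 2] = c;   m[2, 3] = d;
        m[3, 0] = 0f; m[3, 1] = 0f; m[3, 2] = -1f; m[3, 3] = 0f;
        return m;
    }
}

You’d still move the camera itself to the head position as before; the off-centre matrix only changes the shape of the frustum so the “window” stays on the screen when you move off axis.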