I’m implementing a custom camera class, and this of course involves the calculation of a view matrix, to pass to my shaders.

I am using the following code to calculate this matrix:

```
viewMatrix = Matrix4x4.LookAt(
    camera.transform.position,
    camera.transform.position + camera.transform.forward,
    camera.transform.up
);
```

I’ve set up an instance of my custom camera and a regular Unity camera with identical transforms for testing: position = (-8, 5, -12), rotation = (0, 0, 0), scale = (1, 1, 1).

However, when I compare the view matrices in the frame debugger, mine differs slightly from Unity’s:

The values at (3, 0), (3, 1) and (3, 2) have the wrong sign for some reason, probably because I’m using the LookAt function incorrectly. Does anyone know what I’m doing wrong?

The view matrix is supposed to transform **from** world space **to** camera space, so you need the inverse of the camera’s transform: the normal transform maps from local / camera space to world space, not the other way round. Note that different graphics APIs (DirectX, OpenGL, WebGL, OpenGL ES) may expect slightly different formats, especially for the z axis, which is inverted. This comes up whenever we need to transform from Unity’s left-handed system into the usual right-handed one.

ps: To get the view / camera matrix from a Unity Camera, use the worldToCameraMatrix property, which takes care of all the necessary tweaks. Of course, when you roll your own camera logic, you’re on your own.
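To see the “inverse plus z flip” relationship concretely, here is a plain-Python sketch (no Unity API; the helper names are made up for illustration). It builds a camera-to-world matrix from a y-rotation plus a translation, inverts it, and checks that negating the third row of the inverse is the same as pre-multiplying by diag(1, 1, -1, 1) — the z-axis flip described above:

```
import math

def matmul(a, b):
    # Product of two 4x4 row-major matrices.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def camera_to_world(angle, tx, ty, tz):
    # Rotation about y followed by a translation (a rigid transform).
    c, s = math.cos(angle), math.sin(angle)
    return [[c,   0.0, s,   tx],
            [0.0, 1.0, 0.0, ty],
            [-s,  0.0, c,   tz],
            [0.0, 0.0, 0.0, 1.0]]

def rigid_inverse(m):
    # Inverse of a rotation+translation matrix: transpose the 3x3 block,
    # the new translation is -R^T * t.
    r = [[m[j][i] for j in range(3)] for i in range(3)]
    t = [-sum(r[i][k] * m[k][3] for k in range(3)) for i in range(3)]
    return [r[0] + [t[0]], r[1] + [t[1]], r[2] + [t[2]],
            [0.0, 0.0, 0.0, 1.0]]

FLIP_Z = [[1.0, 0.0,  0.0, 0.0],
          [0.0, 1.0,  0.0, 0.0],
          [0.0, 0.0, -1.0, 0.0],
          [0.0, 0.0,  0.0, 1.0]]

cam = camera_to_world(0.7, -8.0, 5.0, -12.0)

# Variant 1: pre-multiply the inverse by the z flip.
view_a = matmul(FLIP_Z, rigid_inverse(cam))

# Variant 2: negate the third row of the inverse.
view_b = [row[:] for row in rigid_inverse(cam)]
view_b[2] = [-v for v in view_b[2]]

assert all(abs(view_a[i][j] - view_b[i][j]) < 1e-9
           for i in range(4) for j in range(4))

# Either way, a view matrix must send the camera position to the origin.
p = [sum(view_a[i][k] * v for k, v in enumerate([-8.0, 5.0, -12.0, 1.0]))
     for i in range(4)]
assert all(abs(x) < 1e-9 for x in p[:3])
```

Both variants produce the same matrix, which is why the fix below can get away with negating the m20–m23 row in place instead of doing a full matrix multiply.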

EDIT: the original accepted answer is flawed. Use this instead, and see the original post below the line and the replies for an explanation.

```
// Invert the camera's TRS to map from world space to camera space.
// Scale is deliberately Vector3.one: the camera's scale must not
// leak into the view matrix.
viewMatrix = Matrix4x4.TRS(transform.position, transform.rotation, Vector3.one).inverse;
if (SystemInfo.usesReversedZBuffer)
{
    // Negate the third row to flip the z axis.
    viewMatrix.m20 = -viewMatrix.m20;
    viewMatrix.m21 = -viewMatrix.m21;
    viewMatrix.m22 = -viewMatrix.m22;
    viewMatrix.m23 = -viewMatrix.m23;
}
```

Ok, so I seem to have found the solution, partially thanks to Bunny83’s suggestion about inverting matrices, partially through pure trial and error:

```
// Flip the z axis, then invert the camera's local-to-world transform.
viewMatrix = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, new Vector3(1, 1, -1))
    * camera.transform.localToWorldMatrix.inverse;
```

Appears to produce a view matrix identical to Unity’s, *as long as the camera object’s scale is left at (1, 1, 1)*, which is acceptable behaviour for my use-case.
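The scale caveat can be checked with plain Python too (again no Unity API, just the matrix algebra; the matrices here stand in for the Unity ones). When a non-unit scale is baked into the local-to-world matrix, its inverse carries a 1/scale factor, so it no longer matches the scale-free inverse that a TRS built with unit scale would produce:

```
import math

def matmul(a, b):
    # Product of two 4x4 row-major matrices.
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rot_y(angle):
    c, s = math.cos(angle), math.sin(angle)
    return [[c,   0.0, s,   0.0],
            [0.0, 1.0, 0.0, 0.0],
            [-s,  0.0, c,   0.0],
            [0.0, 0.0, 0.0, 1.0]]

def translate(tx, ty, tz):
    return [[1.0, 0.0, 0.0, tx],
            [0.0, 1.0, 0.0, ty],
            [0.0, 0.0, 1.0, tz],
            [0.0, 0.0, 0.0, 1.0]]

def scale(s):
    return [[s, 0.0, 0.0, 0.0],
            [0.0, s, 0.0, 0.0],
            [0.0, 0.0, s, 0.0],
            [0.0, 0.0, 0.0, 1.0]]

# Local-to-world with a scale of 2 baked in: T * R * S.
ltw = matmul(matmul(translate(-8.0, 5.0, -12.0), rot_y(0.7)), scale(2.0))
# Its inverse: S^-1 * R^-1 * T^-1.
wtl = matmul(scale(0.5), matmul(rot_y(-0.7), translate(8.0, -5.0, 12.0)))

# Sanity check: the two really are inverses of each other.
ident = matmul(ltw, wtl)
assert all(abs(ident[i][j] - (1.0 if i == j else 0.0)) < 1e-9
           for i in range(4) for j in range(4))

# The scale-free inverse (what a unit-scale TRS inverts to): R^-1 * T^-1.
view_no_scale = matmul(rot_y(-0.7), translate(8.0, -5.0, 12.0))

# With scale != 1 the two disagree, so the trick only works at (1, 1, 1).
diff = max(abs(wtl[i][j] - view_no_scale[i][j])
           for i in range(3) for j in range(4))
assert diff > 0.1
```

With unit scale the two expressions coincide, which is why leaving the camera object’s scale at (1, 1, 1) makes the snippet above agree with Unity.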

I still don’t get why Matrix4x4.LookAt gave me wrong results, since as far as I know it takes world-space values as input and outputs a world-space-to-view-space matrix, but oh well.
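A likely explanation (hedging here: this goes by the conventional look-at construction, and by Unity’s docs describing Matrix4x4.LookAt as creating a transformation matrix that positions an object at `from`, oriented toward `to`): a look-at matrix is a camera-to-world transform, i.e. the *inverse* of a view matrix. A plain-Python sketch of a generic left-handed look-at shows it maps the camera-local origin to the camera position, where a view matrix must do the opposite:

```
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def look_at(eye, target, up):
    # Generic left-handed look-at, built column by column: the camera's
    # right, up and forward axes form the rotation columns, the eye is the
    # translation. This is a camera-to-world matrix, NOT a view matrix.
    forward = normalize([t - e for t, e in zip(target, eye)])
    right = normalize(cross(up, forward))
    up2 = cross(forward, right)
    return [[right[0], up2[0], forward[0], eye[0]],
            [right[1], up2[1], forward[1], eye[1]],
            [right[2], up2[2], forward[2], eye[2]],
            [0.0, 0.0, 0.0, 1.0]]

def apply(m, v):
    return [sum(m[i][k] * v[k] for k in range(4)) for i in range(4)]

eye = [-8.0, 5.0, -12.0]
m = look_at(eye, [0.0, 0.0, 0.0], [0.0, 1.0, 0.0])

# The look-at matrix maps the camera-local origin to the camera position...
p = apply(m, [0.0, 0.0, 0.0, 1.0])
assert all(abs(a - b) < 1e-9 for a, b in zip(p[:3], eye))

# ...whereas a view matrix must map the camera position to the origin.
# So a look-at result still needs inverting (plus the z flip) before it
# can serve as a view matrix, which would explain the sign differences.
```

If that reading of LookAt is right, the wrong-sign entries in the question were the untouched camera-to-world translation column showing through.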