Why doesn't unity_Matrix_VP match the VP matrix I compute in a script?

I’m working on a custom render pipeline that uses deferred rendering, in which I need to pass the inverse of the VP matrix to the lighting pass so it can reconstruct world positions from depth. But I just can't get it to work.
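Roughly, my setup looks like this (a simplified sketch, not my exact code; the shader property name is a placeholder):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Simplified sketch of the lighting-pass setup: build the VP matrix from the
// camera, invert it, and hand it to the lighting shader so it can reconstruct
// world position from depth. "_InverseVP" is a made-up property name.
static void SetupLightingPass(Camera camera, CommandBuffer buffer)
{
    Matrix4x4 vp = camera.projectionMatrix * camera.worldToCameraMatrix;
    buffer.SetGlobalMatrix("_InverseVP", vp.inverse);
}
```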


I set up a camera with fov = 30, near = 0.3, far = 1000, position = [-5, 0, 10], and all other parameters left at their defaults. During the GBuffer pass, the frame debugger shows this unity_Matrix_VP:

1.3128    0.0000    0.0000     6.5641
0.0000   -3.7320    0.0000     0.0000
0.0000    0.0000    0.0000     0.3031
0.0000    0.0000    1.0000   -10.000

Then I calculate the same VP matrix in the lighting pass with mat = camera.projectionMatrix * camera.worldToCameraMatrix (see the repro snippet below), which gives me:

1.31283   0.00000   0.00000    6.56413
0.00000   3.73205   0.00000    0.00000
0.00000   0.00000   1.00060  -10.60618
0.00000   0.00000   1.00000  -10.00000

They do not match, which breaks all of the subsequent computation in the lighting pass.
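For reference, here is a minimal repro of the script-side computation (the component name is just for illustration):

```csharp
using UnityEngine;

// Minimal repro: dump the script-side VP matrix so it can be compared row by
// row against the unity_Matrix_VP value shown in the frame debugger.
[RequireComponent(typeof(Camera))]
public class DumpVP : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();
        Matrix4x4 vp = cam.projectionMatrix * cam.worldToCameraMatrix;
        Debug.Log(vp);
    }
}
```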


I’m using Unity 2022.2.0a9 and SRP 14.2. How can I get these matrices to match?

You may want to have a look at GL.GetGPUProjectionMatrix.
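Camera.projectionMatrix is always given in OpenGL convention, while unity_Matrix_VP is the platform-adjusted matrix the GPU actually uses (reversed Z, and a flipped Y when rendering into a texture), which is what explains the flipped sign and the different Z row you're seeing. A minimal sketch, assuming your lighting pass renders into a render texture (the helper name is just for illustration):

```csharp
using UnityEngine;

// Sketch: build a VP matrix in the convention the GPU actually uses.
// Pass renderIntoTexture = true when the pass renders into a RenderTexture,
// which is the usual case for a deferred lighting pass.
static Matrix4x4 BuildGpuVP(Camera camera, bool renderIntoTexture = true)
{
    // GL.GetGPUProjectionMatrix applies the platform-specific adjustments
    // (reversed Z, flipped Y) that Camera.projectionMatrix does not include.
    Matrix4x4 gpuProj = GL.GetGPUProjectionMatrix(camera.projectionMatrix, renderIntoTexture);
    return gpuProj * camera.worldToCameraMatrix;
}
```

Invert the result of that (BuildGpuVP(camera).inverse) and pass it to your lighting shader instead of the matrix built directly from camera.projectionMatrix; it should then line up with the unity_Matrix_VP you see in the frame debugger.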