Compute depth 01

Hi guys, I don’t understand how Unity calculates COMPUTE_DEPTH_01. The formula described in UnityCG.cginc is:

#define COMPUTE_DEPTH_01 -(mul( UNITY_MATRIX_MV, v.vertex ).z * _ProjectionParams.w)

Why isn’t it

#define COMPUTE_DEPTH_01 (mul( UNITY_MATRIX_MV, v.vertex ).z * _ProjectionParams.w)

i.e. without the minus sign?

Probably because, by the OpenGL convention, the z component of view-space positions in front of the camera is negative, so the minus sign brings it back to a positive value. Since _ProjectionParams.w is 1/farPlane, the result is a linear depth in the 0..1 range.
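To check the sign reasoning, here is a minimal sketch of the same arithmetic in Python, assuming an OpenGL-style view space (camera looking down -Z) and a made-up far-plane distance of 100:

```python
# Sketch of COMPUTE_DEPTH_01's arithmetic (far_plane is an example value).
far_plane = 100.0
projection_params_w = 1.0 / far_plane  # _ProjectionParams.w is 1/farPlane in Unity

# View-space z of a vertex 10 units in front of the camera:
# by OpenGL convention the camera looks down -Z, so the value is negative.
view_z = -10.0

# Without the minus sign the result would be -0.1; the negation flips it
# into the expected 0..1 range.
depth01 = -(view_z * projection_params_w)
print(depth01)  # 0.1
```

So a vertex at 10 units with a 100-unit far plane maps to 0.1, a linear fraction of the far-plane distance.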

Got it, thank you!