How do I implement "inverse(float4x4 A)" in CG for a shader?

Hi, I’m writing a shader but it seems as though the “inverse” function (used to invert a matrix in CG shader language) isn’t present in Unity. In NVIDIA’s CG reference manual, they give the implementation details for “inverse(float2x2 A)” which is obviously for 2x2 matrices, but I need the implementation for a 4x4 matrix. Can somebody please give me the implementation for this in CG?

Calculating the inverse of a matrix is a rather complex process. The reference implementation given by Nvidia uses the adjugate formula. However, the cofactor terms grow rapidly in complexity as the dimension increases, so this approach is only really reasonable up to 3x3; the 4x4 formula is already quite expensive. Cheaper methods such as Gaussian elimination exist, but they are difficult to implement in a shader because pivoting requires a lot of branching.
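That said, the 4x4 adjugate approach can still be written out branch-free. Here is a sketch of what such an `inverse(float4x4 A)` could look like in Cg, factoring the cofactors through 2x2 sub-determinants of the top and bottom row pairs so common products are reused (note there is no check for a singular matrix):

```cg
float4x4 inverse(float4x4 A)
{
    // Pull out all elements; A[row][col].
    float a00 = A[0][0], a01 = A[0][1], a02 = A[0][2], a03 = A[0][3];
    float a10 = A[1][0], a11 = A[1][1], a12 = A[1][2], a13 = A[1][3];
    float a20 = A[2][0], a21 = A[2][1], a22 = A[2][2], a23 = A[2][3];
    float a30 = A[3][0], a31 = A[3][1], a32 = A[3][2], a33 = A[3][3];

    // 2x2 determinants from the top two rows ...
    float s0 = a00 * a11 - a10 * a01;
    float s1 = a00 * a12 - a10 * a02;
    float s2 = a00 * a13 - a10 * a03;
    float s3 = a01 * a12 - a11 * a02;
    float s4 = a01 * a13 - a11 * a03;
    float s5 = a02 * a13 - a12 * a03;

    // ... and from the bottom two rows.
    float c5 = a22 * a33 - a32 * a23;
    float c4 = a21 * a33 - a31 * a23;
    float c3 = a21 * a32 - a31 * a22;
    float c2 = a20 * a33 - a30 * a23;
    float c1 = a20 * a32 - a30 * a22;
    float c0 = a20 * a31 - a30 * a21;

    // Determinant via Laplace expansion; divides by zero if A is singular.
    float invdet = 1.0 / (s0 * c5 - s1 * c4 + s2 * c3
                        + s3 * c2 - s4 * c1 + s5 * c0);

    // Adjugate divided by the determinant.
    float4x4 B;
    B[0][0] = ( a11 * c5 - a12 * c4 + a13 * c3) * invdet;
    B[0][1] = (-a01 * c5 + a02 * c4 - a03 * c3) * invdet;
    B[0][2] = ( a31 * s5 - a32 * s4 + a33 * s3) * invdet;
    B[0][3] = (-a21 * s5 + a22 * s4 - a23 * s3) * invdet;
    B[1][0] = (-a10 * c5 + a12 * c2 - a13 * c1) * invdet;
    B[1][1] = ( a00 * c5 - a02 * c2 + a03 * c1) * invdet;
    B[1][2] = (-a30 * s5 + a32 * s2 - a33 * s1) * invdet;
    B[1][3] = ( a20 * s5 - a22 * s2 + a23 * s1) * invdet;
    B[2][0] = ( a10 * c4 - a11 * c2 + a13 * c0) * invdet;
    B[2][1] = (-a00 * c4 + a01 * c2 - a03 * c0) * invdet;
    B[2][2] = ( a30 * s4 - a31 * s2 + a33 * s0) * invdet;
    B[2][3] = (-a20 * s4 + a21 * s2 - a23 * s0) * invdet;
    B[3][0] = (-a10 * c3 + a11 * c1 - a12 * c0) * invdet;
    B[3][1] = ( a00 * c3 - a01 * c1 + a02 * c0) * invdet;
    B[3][2] = (-a30 * s3 + a31 * s1 - a32 * s0) * invdet;
    B[3][3] = ( a20 * s3 - a21 * s1 + a22 * s0) * invdet;
    return B;
}
```

Even with the shared sub-determinants this is on the order of a hundred multiplies per invocation, which is exactly why you normally want to avoid it per-vertex or per-fragment.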

Usually you never want to calculate the inverse inside a shader. If you need the inverse of a 4x4 homogeneous matrix, calculate it in a script and pass it to the shader (in Unity that's `Matrix4x4.inverse`, uploaded with `Material.SetMatrix`). If you need the inverse of a pure 3x3 rotation matrix you can just transpose the matrix (though this only works for pure rotations, so no scale or translation involved).
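The rotation-only case is a one-liner in Cg, since for an orthonormal matrix the transpose equals the inverse (`R` here is just a placeholder for whatever rotation matrix you already have):

```cg
// Only valid if R is a pure rotation (orthonormal):
// any scale or shear baked into R breaks this shortcut.
float3x3 Rinv = transpose(R);
```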

Are you sure that the matrix you need isn’t already provided by Unity? The built-in shader variables include several ready-made inverses, such as unity_WorldToObject (the inverse of the object-to-world matrix) and UNITY_MATRIX_IT_MV (the inverse transpose of the model-view matrix).