Surface shader vertex program, MVP matrix getting applied automatically.

In the vertex program of my surface shader, when I do:
v.vertex = mul( UNITY_MATRIX_P, position );

If I look at the compiled code, my line has been turned into gl_ModelViewProjectionMatrix * (gl_ProjectionMatrix * position).

I just want the (gl_ProjectionMatrix * position) term. What is the correct expression to assign to v.vertex so that the gl_ModelViewProjectionMatrix term cancels out?

(For now I have edited the compiled source and deleted every instance of “gl_ModelViewProjectionMatrix *”, which does what I want, but that’s a dumb hack.)
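
For reference, the relevant bit boils down to something like this (a stripped-down sketch of the vertex:vert modifier, not my full shader):

// #pragma surface surf Lambert vertex:vert
void vert (inout appdata_full v) {
    // this runs before Unity's generated vertex shader, which still
    // multiplies v.vertex by the model-view-projection matrix afterwards
    v.vertex = mul(UNITY_MATRIX_P, v.vertex);
}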

Just a wild guess:

v.vertex = mul(position, UNITY_MATRIX_IT_MV);

Reason: putting the matrix as the second argument of mul makes it multiply position by the transpose of that matrix. UNITY_MATRIX_IT_MV is the inverse transpose of the model-view matrix (the “I”, “T”, and “MV” in “_IT_MV”), and the transpose of the transpose of the inverse is just the inverse. So the line applies the inverse of the model-view matrix to position.

Unity’s generated vertex shader then automatically multiplies the result by the model-view-projection matrix (projection times model-view). The model-view matrix and its inverse cancel each other out; thus only the projection matrix remains, which is what you wanted.
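
Spelling it out, with P for the projection matrix and MV for the model-view matrix, and using the fact that mul(vector, matrix) is the same as mul(transpose(matrix), vector):

mul(position, UNITY_MATRIX_IT_MV)
  = transpose(transpose(inverse(MV))) * position
  = inverse(MV) * position

and after Unity’s automatic model-view-projection multiply:

(P * MV) * inverse(MV) * position = P * position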

I don’t have time to test it, but it makes sense to me. :wink:
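
In the vertex modifier it would look something like this (just a sketch, assuming the usual vertex:vert / appdata_full setup):

void vert (inout appdata_full v) {
    // cancels the model-view part of Unity's automatic MVP multiply,
    // leaving only the projection transform applied to the vertex
    v.vertex = mul(v.vertex, UNITY_MATRIX_IT_MV);
}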

This totally worked. I have no idea why it was (or why I thought it was) adding that extra term to my v.vertex = line; I might have just been writing something wrong to begin with. /boggle. Thanks Martin.