Transform vertices via ModelViewMatrix of another GameObject

Hi all,

I’m trying to write a vertex shader that renders the current model transformed (position, rotation, scale) into the space of another GameObject.
I know this can be solved with a script, but I wanted to do it in a shader for learning purposes and to extend the functionality later.

I’ve tried a few things and searched the forum, but I still have problems understanding how these matrices work.
As a last resort I simply tried to multiply Unity’s model-view matrix with the projection matrix, but even that didn’t work: no vertices are displayed for the model.

  Shader "Example/Normal Extrusion" {
    Properties {
      _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
      Tags { "RenderType" = "Opaque" }

      Pass {
        CGPROGRAM
        // Upgrade NOTE: excluded shader from OpenGL ES 2.0 because it does not contain a surface program or both vertex and fragment programs.
        #pragma exclude_renderers gles
        #pragma vertex vert
        #include "UnityCG.cginc"

        struct v2f {
          float4 pos : SV_POSITION;
          float4 color : COLOR;
        };

        uniform float4x4 _Global;

        v2f vert (appdata_base v) {
          v2f o;
          o.pos = mul (UNITY_MATRIX_MV * UNITY_MATRIX_P, v.vertex);
          //o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
          o.color.xyz = v.normal * 0.5 + 0.5;
          o.color.x = 0.5;
          return o;
        }
        ENDCG
      }
    }
  }

Maybe one could give me a hint or code example how to achieve this? Thank you!

I haven’t tested this, but try something like the following.

In a script, write (here `other` stands for whatever Transform you want to borrow the transformation from):

    Matrix4x4 myObjMatrix;

    void Update () {
        myObjMatrix = Matrix4x4.TRS (other.position, other.rotation, other.localScale);
        Shader.SetGlobalMatrix ("myObjMatrix", myObjMatrix);
    }

Then declare a matching variable in your shader:

    float4x4 myObjMatrix;

and multiply your vertex by it in the vertex shader (matrix on the left, since Unity uses column vectors):

    o.pos = mul (myObjMatrix, v.vertex);

Your matrices are in the wrong order. Matrices are left-multiplied onto vertices, so you need to multiply P * V * M to get the MVP matrix. The MV matrix you are using will already be multiplied in the correct order (V * M), so all you need to do is switch the order of the multiplication you’re doing.

Thank you for your answers.

@reissgrant
Yeah, that was exactly my first thought, but it did not work. I guess just transforming into world space doesn’t do the trick, since the projection matrix has to be folded in as well…

@Daniel
I tried both orders, but nothing shows up on screen… Maybe there is another operation involved beyond just multiplying MV with P to get the MVP matrix?

If I use the commented-out line in the shader (see first post), the object is displayed fine.
Any help is appreciated.

No one has an idea?

Apparently the * operator differs from mul(). However, mul (UNITY_MATRIX_MV, UNITY_MATRIX_P) (or vice versa) still does not result in UNITY_MATRIX_MVP…

Anyone with a solution?

Thank you.

Hi,

I still have no clue what I have to do to calculate the ModelViewProjection matrix from the ModelView and Projection matrices…

Anyone with an idea?

Thx.

Hi,

I’ve just run into the same issue, and I found that UNITY_MATRIX_P is a zero matrix under D3D but works fine under OpenGL (I’ve reported the bug: 396072).

I guess you are on Windows, so if you force Unity to run with OpenGL (command line: -force-opengl) you’ll see that:

#pragma vertex testVert

void testVert (float4 pos : POSITION, out half4 oColor : COLOR, out half4 oPos : SV_POSITION)
{
	float4x4 MVP = mul (UNITY_MATRIX_P, UNITY_MATRIX_MV); // doesn't give UNITY_MATRIX_MVP on D3D
	oPos = mul (MVP, pos);
	oColor = 1;
}

is the same as :

#pragma vertex testVert

void testVert (float4 pos : POSITION, out half4 oColor : COLOR, out half4 oPos : SV_POSITION)
{
	oPos = mul (UNITY_MATRIX_MVP, pos);
	oColor = 1;
}

Hope this helps.