Get triangle vertices in the fragment shader

Hi,

I’m trying to solve a complex geometry issue and I need to do it with a fragment shader since its output will be used as a lookup table.

Basically, each fragment color must be computed using the UV coordinates of the 3 vertices of the triangle in which the fragment lies.
My problem is getting the vertices’ coordinates. In my frag function I need something like this:

float3 frag (v2f i) : SV_Target
{
    float3 v1 = getFirstVertexCoord();
    float3 v2 = getSecondVertexCoord();
    float3 v3 = getThirdVertexCoord();

    float3 result = <use v1, v2 and v3 to compute fragment color>;

    return result;
}

Is there a way to do this, maybe using some vertex/geometry shader output?

Thanks to anyone who will help me,
Francesco

The resulting UV value passed by the vertex shader to the fragment is already the proper interpolated UV between the 3 vertices of that triangle based on the fragment position in that triangle.
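
For illustration, the value the fragment receives is a barycentric blend of the three vertex UVs. A minimal sketch, where b stands for the fragment’s barycentric weights (not something a fragment shader exposes directly):

// b.x + b.y + b.z == 1; each component weights one vertex's UV
float2 uv = b.x * uv0 + b.y * uv1 + b.z * uv2;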

Could you perhaps explain what you’re trying to achieve that makes you think you explicitly need the UV values at each vertex inside your fragment? Usually such calculations can be moved to the vertex program and the result simply output to the fragment.


Hi,
Thanks for the answer!

I need those 3 coordinates because I have to create a texture in which each triangle of the UV map (so, all the pixels inside of it) has a specific color. This color (r, g, b) will store the direction vector (x, y, z) of one edge of the triangle, which will be computed using the vertices’ coordinates.

The real goal of my system is to compute the angle between each 3D triangle’s orientation and its corresponding uv mapped triangle’s orientation. To do so, I’d follow this algorithm:

For each fragment

  1. get the triangle in which the fragment lies
  2. get world space vertices v1, v2, v3 of the triangle
  3. compute direction of the edge connecting v1 and v2 like float2 dir3D = (v2-v1).xy;
  4. get uv coordinates uv1, uv2, uv3 of v1, v2 and v3
  5. compute direction of the edge connecting uv1 and uv2 like float2 dir2D = (uv2-uv1);
  6. float3 angle = angle between dir3D and dir2D
  7. fragmentColor = float3(angle.x, angle.y, angle.z) ;

As you can see, I’d actually need only 2 vertices (v3 is never used), but I wrote the algorithm using all three vertices to make it clearer.
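
One practical note on storing a direction in a color: direction components live in [-1, 1], while a regular 8-bit texture only stores [0, 1], so the value needs the usual remap on write and unpack on read (a sketch, using v1 and v2 from the steps above):

float3 dir = normalize(v2 - v1);   // edge direction, components in [-1, 1]
float3 color = dir * 0.5 + 0.5;    // remap to [0, 1] for storage
// and dir = color * 2.0 - 1.0 when reading it back (or use a float render target and skip this)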

What do you plan to do with this angle? We may be able to suggest other approaches if we understand the end goal.

UVs do not really have their own orientation, they are a planar mapping of the mesh vertices to locations on that plane. Simply rotating your model in game would alter the result of this calculation. So I’m still not quite sure what kind of data you expect to get out of it and plan to use it.

Here’s an example of what you’ve described. Without understanding the context of its usage I’m not entirely sure this is what you’d want either, but it lays out the math you described, using the geometry program to access the vertex stream.

Shader "Invertex/TriangleCalcExample"
{
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma geometry geom
           
            #include "UnityCG.cginc"

            struct v2g
            {
                float2 uv : TEXCOORD0;
                float4 vertex : POSITION;
                float3 worldPos : TEXCOORD1;
            };

            struct g2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                fixed4 col : COLOR;
            };

            v2g vert (appdata_base v)
            {
                v2g o;
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;

                return o;
            }
           
            [maxvertexcount(3)]
            void geom(triangle v2g IN[3], inout TriangleStream<g2f> tristream)
            {
                g2f o;

                // direction of one triangle edge, projected to the xy plane
                float2 edgeA = normalize(IN[1].worldPos.xy - IN[0].worldPos.xy);
                // direction of the corresponding UV edge
                float2 uvEdgeA = normalize(IN[1].uv - IN[0].uv);
                // note: the normalized difference of the two directions, a stand-in for a true angle
                float2 angle = normalize(edgeA - uvEdgeA);

                for (int i = 0; i < 3; i++)
                {
                    o.pos = IN[i].vertex;
                    o.uv = IN[i].uv;
                    o.col = fixed4(angle, 1, 1);
                    tristream.Append(o);
                }
            }

            fixed4 frag (g2f i) : SV_Target
            {
                return  i.col;
            }
            ENDCG
        }
    }
}



@Invertex Thanks for spending your time to help me! I appreciate it a lot. :)

Actually, the shader you wrote is the kind of thing I needed. There are just a couple of things missing in it.

Let me explain why I have to do these calculations.

I’m developing a texture baking tool ([RELEASED] Total Baker - Texture Baking System) and I’m struggling with normals baking; the goal is to generate a normal map. Everything works fine except for UV maps whose triangles are rotated compared to the world space ones (reoriented in the xy plane).

- In my scene I instantiate both a lowpoly model and a highpoly model with a fixed position and rotation.
- For each pixel of the output normal map I cast one ray R
- R’s origin will be the world point on the lowpoly surface that corresponds to the current pixel’s UV coordinate
- R’s direction will be the normal of the face in which the origin lies.
- The highpoly’s hit point will have a normal (interpolated or not)
- To know what color the normal map’s pixel should have I read the hit point’s normal and I make some calculations to make it relative to the lowpoly’s normal.

Now, imagine reorienting all the 3D triangles to face the forward vector (0,0,1), i.e. to lie in the xy plane.
These triangles may appear rotated compared to the corresponding UV map’s ones. If I write the previously computed normal as it is, it will be wrong, since it doesn’t take this rotation into account. That’s why I need that angle: my idea is to store it in a lookup table, one angle per pixel.
So the output of this shader should be “unwrapped”, so that the vertices of the model appear in screen space at their UV positions. I already do something similar to create a texture containing the world space points of the model (everything is done in the vertex stage), but here I’d have to do it after the geometry stage color computations. Is this possible?

This is the shader I use for that:


Shader "Hidden/UV_to_WorldPos" {
    SubShader{
        Pass{

            Lighting Off
            Cull Off

            CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f {
                    float4 pos : SV_POSITION;
                    float4 color : COLOR;
                };

                // vertex input: position, UV
                struct appdata {
                    float4 vertex : POSITION;
                    float2 uv : TEXCOORD0;
                };

                v2f vert(appdata v){
                    v2f o;
                    float3 worldPos =  mul(unity_ObjectToWorld, v.vertex);
                    o.color = float4(worldPos.x, worldPos.y, worldPos.z, 1);
                    o.pos = float4(v.uv.x * 2.0 - 1.0, v.uv.y * 2.0 - 1.0, 1.0, 1.0);
                    return o;       
                }

                float4 frag(v2f i) : SV_Target{
                    return i.color;
                }
            ENDCG
        }
    }
}

So, what is missing in the shader is basically the 3D triangle reorientation math, the angle computation and the unwrapping. In C# I’d use Quaternion.FromToRotation(triangleNormal, Vector3.forward) to create the rotation, and then I’d multiply each triangle’s vertex by it. Lastly, I’d use Vector3.SignedAngle between the 3D edge and the 2D edge directions.
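
For the signed angle part, there is a compact 2D form in HLSL that mirrors what Vector3.SignedAngle does, restricted to a plane (a sketch; returns radians, positive for counter-clockwise):

float signed_angle(float2 a, float2 b)
{
    // atan2(2D cross product, dot product)
    return atan2(a.x * b.y - a.y * b.x, dot(a, b));
}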

I’d rewrite the geometry function like this (yeah, it’s a mix of HLSL and C#, but it’s just to understand the kind of math I need):

[maxvertexcount(3)]
void geom(triangle v2g IN[3], inout TriangleStream<g2f> tristream)
{
    g2f o;
 
    //get vertices
    float3 v0 = IN[0].worldPos;
    float3 v1 = IN[1].worldPos;
    float3 v2 = IN[2].worldPos;
 
    //compute normal
    float3 normal = normalize(cross(v1-v0, v2-v0));  
 
    //create a rotation to make the normal perpendicular to the xy plane
    quaternion rot = FromToRotation(normal, float3(0,0,1));

    //reorient triangle so it becomes perpendicular to the xy plane
    v0 = rot*v0;
    v1 = rot*v1;
    v2 = rot*v2;
 
    //get 3D edge direction
    float2 edgeA = normalize(v1.xy - v0.xy);
 
    //get UV edge direction
    float2 uvEdgeA = normalize(IN[1].uv - IN[0].uv);
 
    //compute angle between edges
    float2 angle = GetAngle(edgeA, uvEdgeA);

    //color vertices based on the angle
    for (int i = 0; i < 3; i++)
    {
        o.pos = IN[i].vertex;
        o.uv = IN[i].uv;
        o.col = fixed4(angle, 1, 1);
        tristream.Append(o);
    }
}

The angle computation shouldn’t be hard to implement, but I don’t know how to deal with FromToRotation because in shaders we don’t have quaternions. Do you know how to do it?

I’m not sure how you’d rotate the triangle in a way that is properly consistent and useful. You can create a 3x3 matrix to mul() with and rotate the triangle around its center, but this is going to give inconsistent results depending on the triangle’s angle from the camera and the winding order of the vertices.

Would unfolding to a 0-1 plane in world space not be what you need? That way the mesh has basically the same layout as the UVs and can be sampled on screen that way.

Here I am unfolding and outputting the worldNormal from before it was unfolded. The same could be done with world position or any other value.

Shader "Invertex/UVLayout"
{
    Properties
    {
        _Unwrap("Unwrap", Range(0, 1)) = 1
    }
    SubShader
    {
        Tags{ "RenderType" = "Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma geometry geom

            #include "UnityCG.cginc"
            float _Unwrap;

            struct v2g
            {
                float2 uv : TEXCOORD0;
                float4 vertex : POSITION;
                float3 worldPos : TEXCOORD1;
                float3 worldNorm : NORMAL;
            };

            v2g vert(appdata_full v)
            {
                v2g o;
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                o.worldNorm = UnityObjectToWorldNormal(v.normal);
                return o;
            }

            struct g2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                fixed4 col : COLOR;
            };

            [maxvertexcount(3)]
            void geom(triangle v2g IN[3], inout TriangleStream<g2f> tristream)
            {
                g2f o;

                for (int i = 0; i < 3; i++)
                {
                    // blend each vertex between its world position and its UV position
                    float3 worldPos = lerp(IN[i].worldPos, float3(IN[i].uv, 0), _Unwrap);
                    // route the unfolded position back through the usual object -> clip transform
                    float4 objectPos = mul(unity_WorldToObject, float4(worldPos, 1));
                    o.pos = UnityObjectToClipPos(objectPos);
                    o.uv = IN[i].uv;
                    // carry the pre-unfold world normal out as the color
                    o.col = float4(IN[i].worldNorm, 1);
                    tristream.Append(o);
                }
            }

            fixed4 frag(g2f i) : SV_Target
            {
                return  i.col;
            }
            ENDCG
        }
    }
}

Hey man, you’re the best!
This is very cool.

fra3point, I tried to fix your code but it seems something is wrong!

I also changed GetAngle(); I think it had a problem.

    Shader "Invertex/Test"
    {
        SubShader
        {
            Tags { "RenderType"="Opaque" }
  
                Cull Off
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma geometry geom
            
                #include "UnityCG.cginc"
  
                struct v2g
                {
                    float2 uv : TEXCOORD0;
                    float4 vertex : POSITION;
                    float3 worldPos : TEXCOORD1;
                };
  
                struct g2f
                {
                    float4 pos : SV_POSITION;
                    float2 uv : TEXCOORD0;
                    fixed4 col : COLOR;
                };
  
                v2g vert (appdata_base v)
                {
                    v2g o;
                    o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                    o.vertex = UnityObjectToClipPos(v.vertex);
                    o.uv = v.texcoord;
  
                    return o;
                }

                float3x3 rotationAlign( const float3 d, const float3 z )
                {
                    const float3  v = cross( z, d );
                    const float c = dot( z, d );
                    const float k = 1.0f/(1.0f+c);

                    return float3x3( v.x*v.x*k + c,     v.y*v.x*k - v.z,    v.z*v.x*k + v.y,
                                v.x*v.y*k + v.z,   v.y*v.y*k + c,      v.z*v.y*k - v.x,
                                v.x*v.z*k - v.y,   v.y*v.z*k + v.x,    v.z*v.z*k + c    );
                }


            
                [maxvertexcount(3)]
                void geom(triangle v2g IN[3], inout TriangleStream<g2f> tristream)
                {
               
                    g2f o;
             
                    //get vertices
                    float3 v0 = IN[0].worldPos;
                    float3 v1 = IN[1].worldPos;
                    float3 v2 = IN[2].worldPos;
             
                    //compute normal
                    float3 normal = normalize(cross(v1-v0, v2-v0));
             
                    //create a rotation to make the normal perpendicular to the xy plane
                    float3x3 rot = rotationAlign(normal, float3(0,0,1));
             
                    //reorient triangle so it becomes perpendicular to the xy plane
                    v0 = mul(rot,v0);
                    v1 = mul(rot,v1);
                    v2 = mul(rot,v2);
             
                    //get 3D edge direction (using the rotated vertices, not the original worldPos)
                    float2 edgeA = normalize(v1.xy - v0.xy);
             
                    //get UV edge direction
                    float2 uvEdgeA = normalize(IN[1].uv - IN[0].uv);
             
                    //compute angle between edges (acos of the dot product: an unsigned angle in radians)
                    float2 angle = acos( dot(edgeA, uvEdgeA));
             
                    //color vertices based on the angle
                    for (int i = 0; i < 3; i++)
                    {
                        o.pos = IN[i].vertex;
                        o.uv = IN[i].uv;
                        o.col = fixed4(angle, 1, 1);
                        tristream.Append(o);
                    }
                }
  
                fixed4 frag (g2f i) : SV_Target
                {
                    return  i.col;
                }
                ENDCG
            }
        }
    }

[screenshot of the rendered result]


I found an HLSL implementation of quaternions that may be useful; I’ll try to use it for the rotation calculations.
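
Presumably it provides helpers along these lines (a rough sketch matching the signatures used in the shader below, not the gist’s actual code; assumes the inputs are normalized and not exactly opposite):

float4 from_to_rotation(float3 from, float3 to)
{
    // axis = cross(from, to), w = 1 + cos(angle); normalizing yields a unit quaternion
    return normalize(float4(cross(from, to), 1.0 + dot(from, to)));
}

float3 rotate_vector(float3 v, float4 q)
{
    // q * v * conjugate(q), expanded
    return v + 2.0 * cross(q.xyz, cross(q.xyz, v) + q.w * v);
}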

The unfolding method is working well, I just adapted it a bit. Here it is:


Shader "Invertex/UVLayout"
{
    Properties
    {
        _Unwrap("Unwrap", Range(0, 1)) = 1
    }
    SubShader
    {
        Tags{ "RenderType" = "Opaque" }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma geometry geom

            #include "UnityCG.cginc"
            #include "./Quaternion.cginc" 
            float _Unwrap;

            struct v2g
            {
                float2 uv : TEXCOORD0;
                float4 vertex : POSITION;
                float3 worldPos : TEXCOORD1;
                float3 worldNorm : NORMAL;
            };

            v2g vert(appdata_full v)
            {
                v2g o;
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                o.worldNorm = UnityObjectToWorldNormal(v.normal);
                return o;
            }

            struct g2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                fixed4 col : COLOR;
            };


            float angle(float2 v1, float2 v2) {
                return acos(dot(v1, v2)/(length(v1)*length(v2)));
            }
            
             float signed_angle(float2 vec1, float2 vec2)
             {
                 float2 vec1Rotated90 = float2(-vec1.y, vec1.x);
                 float sign = (dot(vec1Rotated90, vec2) < 0) ? -1.0f : 1.0f;
                 return angle(vec1, vec2) * sign;
             }
             
            [maxvertexcount(3)]
            void geom(triangle v2g IN[3], inout TriangleStream<g2f> tristream)
            {
                g2f o;

                //get vertices (all of this is per-triangle, so compute it once, outside the vertex loop)
                float3 v0 = IN[0].worldPos;
                float3 v1 = IN[1].worldPos;
                float3 v2 = IN[2].worldPos;

                //compute normal
                float3 normal = normalize(cross(v1-v0, v2-v0));

                //create a rotation to make the normal perpendicular to the xy plane
                float4 rot = from_to_rotation(normal, float3(0,0,1));

                //reorient triangle so it lies parallel to the xy plane
                v0 = rotate_vector(v0, rot);
                v1 = rotate_vector(v1, rot);
                v2 = rotate_vector(v2, rot);

                //get 3D edge direction
                float2 edgeA = normalize(v1.xy - v0.xy);

                //get UV edge direction
                float2 uvEdgeA = normalize(IN[1].uv - IN[0].uv);

                //compute signed angle between edges (in radians here)
                float a = signed_angle(edgeA, uvEdgeA);

                //remap channel from range [-PI,PI] to range [0,1]
                float c = ((a/UNITY_PI)*0.5)+0.5;

                for (int i = 0; i < 3; i++)
                {
                    //place each vertex at its UV position (unwrap)
                    float3 worldPos = float3(IN[i].uv - 0.5, 0);
                    float4 objectPos = mul(unity_WorldToObject, float4(worldPos, 1));
                    o.pos = UnityObjectToClipPos(objectPos);
                    o.uv = IN[i].uv;

                    //apply color
                    o.col = float4(c,c,c,1);
                    tristream.Append(o);
                }
            }

            fixed4 frag(g2f i) : SV_Target
            {
                return  i.col;
            }
            ENDCG
        }
    }
}

I didn’t solve my initial problem, but I’m one step closer to the solution. Maybe the angle algorithm I’m trying to use is not correct. I’ll do some tests and I’ll update this thread if I have news.

For the moment, thank you very much for your precious help!

Hi!
Thanks for your answer.

The angle function should be signed in the range [-180°, 180°], so it would become something like

float signed_angle(float2 v1, float2 v2) {
    float radAngle = atan2(v1.y,v1.x) - atan2(v2.y,v2.x);
    //wrap into [-PI, PI], since the difference of two atan2 results can fall outside that range
    if (radAngle > UNITY_PI) radAngle -= 2.0 * UNITY_PI;
    if (radAngle < -UNITY_PI) radAngle += 2.0 * UNITY_PI;
    return radAngle*57.2957795129; //57.2957795129 is 180/pi (rad to deg conversion)
}

and for the vertices rotation I’m using a quaternion implementation (https://gist.github.com/mattatz/40a91588d5fb38240403f198a938a593)

I’ll try your solution!

Hi,

Here’s the shader I could write:


Shader "Custom/RotationMap"
{

    Properties{
        _Unwrap("Unwrap", Range(0, 1)) = 1
    }

    SubShader
    {
        Tags{ "RenderType" = "Opaque" }
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma geometry geom
            #include "UnityCG.cginc"
                                   
            static const float PI = 3.14159265359; //PI is not predefined in Cg/HLSL
            static const float RAD2DEG = 57.295779513;
            static const float DEG2RAD = 0.01745329252;
       
            float _Unwrap;       
            struct v2g
            {
                float2 uv : TEXCOORD0;
                float4 vertex : POSITION;
                float3 worldPos : TEXCOORD1;
            };
            v2g vert(appdata_full v)
            {
                v2g o;
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord;
                return o;
            }
            struct g2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                fixed4 col : COLOR;
            };
                 
            //CLOCKWISE: NEGATIVE
            //COUNTER CLOCKWISE: POSITIVE 
            float signed_angle(float2 a, float2 b) {
                a = normalize(a);
                b = normalize(b);           
                return atan2( a.x*b.y - a.y*b.x, a.x*b.x + a.y*b.y );
            }
           
         
            float3 RotateAroundX (float3 v, float angle)
            {
                return float3(
                    v.x,
                    cos(angle)*v.y - sin(angle)*v.z,
                    sin(angle)*v.y + cos(angle)*v.z
                );
            }
       
             float3 RotateAroundY (float3 v, float angle)
            {
                return float3(
                    cos(angle)*v.x + sin(angle)*v.z,
                    v.y,
                    -sin(angle)*v.x + cos(angle)*v.z
                );
            }
       
            float3 RotateAroundZ (float3 v, float angle)
            {
                return float3(
                    cos(angle)*v.x - sin(angle)*v.y,
                    sin(angle)*v.x + cos(angle)*v.y,
                    v.z
                );
            }   
         
         
         
            [maxvertexcount(3)]
            void geom(triangle v2g IN[3], inout TriangleStream<g2f> tristream)
            {
                g2f o;                                       
           
                //get vertices
                float3 v0 = IN[0].worldPos;
                float3 v1 = IN[1].worldPos;
                float3 v2 = IN[2].worldPos;
       
                float3 v00 = v0, v11 = v1, v22 = v2;
                float colorv0, colorv1, colorv2;
           
           
                //reorient triangle so that its normal becomes perpendicular to the xy plane
           
                //compute normal
                float3 n = normalize(cross(v1-v0, v2-v0));               
                //angle needed to align vector's y angle  to the y axis
                float yAngle = (atan2(n.z, n.x) - atan2(1, 0)); 
           
                //Rotate vertices around Y axis
                v00 = RotateAroundY(v00, yAngle);
                v11 = RotateAroundY(v11, yAngle);
                v22 = RotateAroundY(v22, yAngle);
           
                //Recompute normal after first rotation
                n = normalize(cross(v11-v00, v22-v00));           
                //angle needed to align vector's x angle to the x axis
                float xAngle = (atan2(n.y, n.z) - atan2(0, 1)); 

                //Rotate vertices around X axis
                v00 = RotateAroundX(v00, xAngle);
                v11 = RotateAroundX(v11, xAngle);
                v22 = RotateAroundX(v22, xAngle);
               
           
           
                float2 edge, uvEdge;
                float anglev0, anglev1, anglev2, angleEdge1, angleEdge2;
           
                ///// Angle of vertex v0 /////
                edge = normalize(v11 - v00);                                               //get 3D direction of the edge v0v1                             
                uvEdge = float2(IN[1].uv.x, IN[1].uv.y) - float2(IN[0].uv.x, IN[0].uv.y);  //get UV edge direction 
                angleEdge1 = signed_angle(edge, uvEdge);                                   //compute angle between 3D and UV edges           
                edge = (v22 - v00);                                                        //get 3D direction of the edge v0v2
                uvEdge = float2(IN[2].uv.x, IN[2].uv.y) - float2(IN[0].uv.x, IN[0].uv.y);  //get UV edge direction         
                angleEdge2 = signed_angle(edge, uvEdge);                                   //compute angle between 3D and UV edges         
                anglev0 = (angleEdge1+angleEdge2)*0.5;                                     //vertex angle can be found by averaging its touching edges' angles
                colorv0 = ((anglev0/PI)*0.5)+0.5;                                          //remap channel from range [-PI,PI] to range [0,1]
             
                ///// Angle of vertex v1 /////           
                edge = normalize(v22 - v11);                                               //get 3D direction of the edge v1v2                             
                uvEdge = float2(IN[2].uv.x, IN[2].uv.y) - float2(IN[1].uv.x, IN[1].uv.y);  //get UV edge direction 
                angleEdge1 = signed_angle(edge, uvEdge);                                   //compute angle between 3D and UV edges           
                edge = (v00 - v11);                                                        //get 3D direction of the edge v1v0
                uvEdge = float2(IN[0].uv.x, IN[0].uv.y) - float2(IN[1].uv.x, IN[1].uv.y);  //get UV edge direction         
                angleEdge2 = signed_angle(edge, uvEdge);                                   //compute angle between 3D and UV edges         
                anglev1 = (angleEdge1+angleEdge2)*0.5;                                     //vertex angle can be found by averaging its touching edges' angles
                colorv1 = ((anglev1/PI)*0.5)+0.5;                                          //remap channel from range [-PI,PI] to range [0,1]
           
                ///// Angle of vertex v2 /////           
                edge = normalize(v11 - v22);                                               //get 3D direction of the edge v2v1                             
                uvEdge = float2(IN[1].uv.x, IN[1].uv.y) - float2(IN[2].uv.x, IN[2].uv.y);  //get UV edge direction 
                angleEdge1 = signed_angle(edge, uvEdge);                                   //compute angle between 3D and UV edges           
                edge = (v00 - v22);                                                        //get 3D direction of the edge v2v0
                uvEdge = float2(IN[0].uv.x, IN[0].uv.y) - float2(IN[2].uv.x, IN[2].uv.y);  //get UV edge direction         
                angleEdge2 = signed_angle(edge, uvEdge);                                   //compute angle between 3D and UV edges         
                anglev2 = (angleEdge1+angleEdge2)*0.5;                                     //vertex angle can be found by averaging its touching edges' angles
                colorv2 = ((anglev2/PI)*0.5)+0.5;                                          //remap channel from range [-PI,PI] to range [0,1] 
               
                float3 worldPos; float4 objectPos;
                                                       
                worldPos = lerp(IN[0].worldPos, float3(IN[0].uv, 0), _Unwrap);
                objectPos = mul(unity_WorldToObject, float4(worldPos, 1));
                o.pos = UnityObjectToClipPos(objectPos);
                o.uv = IN[0].uv;
                o.col = float4(colorv0,colorv0,colorv0,1);
                tristream.Append(o);
           
                worldPos = lerp(IN[1].worldPos, float3(IN[1].uv, 0), _Unwrap);
                objectPos = mul(unity_WorldToObject, float4(worldPos, 1));
                o.pos = UnityObjectToClipPos(objectPos);
                o.uv = IN[1].uv;
                o.col = float4(colorv1,colorv1,colorv1,1);
                tristream.Append(o);
           
                worldPos = lerp(IN[2].worldPos, float3(IN[2].uv, 0), _Unwrap);
                objectPos = mul(unity_WorldToObject, float4(worldPos, 1));
                o.pos = UnityObjectToClipPos(objectPos);
                o.uv = IN[2].uv;
                o.col = float4(colorv2,colorv2,colorv2,1);
                tristream.Append(o);             
           
           
            }

            fixed4 frag(g2f i) : SV_Target
            {
                return  i.col;
            }
            ENDCG
        }
    }
}

Let me restate the problem I’m trying to solve: finding the Z rotation to apply to a normal vector stored in a normal map, in order to make it face the right direction when the UV map’s triangles are rotated compared to their corresponding 3D triangles.

So, this shader takes all the triangles of a model, rotates them to be parallel to the XY plane and (badly) computes the angle needed to rotate the normal vector stored in each pixel so that it follows the UV triangle rotation.
The output color [0,1] of each pixel represents the angle in the range [-180°,180°].
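
Applying that Z rotation to a stored normal would then just be a 2D rotation of its xy components (a sketch; angle is the value recovered from the lookup table):

float s, c;
sincos(angle, s, c); // HLSL intrinsic: sine and cosine in one call
normal.xy = float2(c * normal.x - s * normal.y,
                   s * normal.x + c * normal.y);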

Initially, I thought that I could find the wanted angle for each triangle in this way:

  • In the geometry shader, get 3 vertices at a time, so get a triangle

  • Rotate the triangle to be in the XY plane (like a UV map)

  • Choose one edge of the triangle

  • Compute the angle between this edge’s direction and the corresponding UV map edge’s direction

  • Create the color based on this angle

  • Apply the same color to all 3 vertices

  • Output the colored triangle

But I realized that one edge is not enough to get complete information about the rotation the normal vector needs to follow the triangle rotation.
So now I’m trying to compute this angle per vertex, assigning a possibly different color to each of the 3 vertices. The procedure is now something like this:

  • In the geometry shader, get 3 vertices at a time, so get a triangle

  • Rotate the triangle to be in the XY plane (like a UV map)

  • For each vertex V:

      • Get the two edges E1 and E2 that touch V

      • A1 = angle between E1 and its corresponding edge in the UV map

      • A2 = angle between E2 and its corresponding edge in the UV map

      • Compute A = (A1+A2) / 2

      • Create the color based on this angle

      • Apply this color to V

  • Output the interpolated and colored triangle

I can’t really test if the produced values are good, but I think they aren’t because my baking tool doesn’t produce good results when using these angles to rotate normals.

Any ideas on how to correctly compute these angles?
My brain is exploding :(

Thanks,
Francesco

It is quite interesting that we can pass triangle vertex data directly from the vertex to the pixel shader, without a geometry shader:

//https://github.com/przemyslawzaworski/Unity3D-CG-programming


Shader "No interpolation"
{
    Subshader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vertex_shader
            #pragma fragment pixel_shader
            #pragma target 4.0
                           
            struct SHADERDATA
            {
                linear float4 Vertex : SV_POSITION;
                nointerpolation float4 Point : TEXCOORD0; // every fragment receives the provoking vertex's value unchanged
            };
       
            SHADERDATA vertex_shader (float4 vertex:POSITION)
            {
                SHADERDATA vs;
                vs.Vertex = UnityObjectToClipPos (vertex);
                vs.Point = vertex;
                return vs;
            }

            void pixel_shader (in SHADERDATA ps, out float4 result : SV_Target0 )
            {
                 result = ps.Point;
            }

            ENDCG
        }
    }
}

We can use interpolation modifiers (see the HLSL documentation on interpolation modifiers for more info).


It’s not quite the same thing, since the geometry shader lets you process all 3 (or 4) vertices together at once, doing math with all of them. This simply prevents the value from being interpolated in the pixel shader; it doesn’t give you triangle access in the vertex shader, which is what a geometry shader is basically giving you.

Hi, @Przemyslaw_Zaworski ! Thanks for your reply.

As @Invertex said, the interpolation modifier just stops the value from being interpolated across the triangle.

What I need is a geometry shader in which I can access 3 vertices at once and do some math on the edges of the triangle defined by those three vertices. Then, I need interpolation, because I compute an angle for each vertex and output it as a vertex color. In this way I get all the intermediate points’ colors by interpolation.

However… My nightmare is still there. I really can’t figure out the right way to rotate normals when the UV triangles are rotated compared to the 3D triangles. :(

I think this is the key problem you’re running into. You’re making the assumption that all you need is a rotation around the tangent space z, i.e. the surface normal, but in all likelihood the surface normal will not match between the two meshes, so there may be no solution with only a “z” rotation.

You need a full rotation matrix.

You need none of this. Not trying to be rude, just being blunt, but this exposes your lack of understanding of how tangent space normal maps work. When dealing with tangent space normal maps in a normal shader you’re passing the normal, tangent, and bitangent (sometimes called binormal) from the vertex to the fragment. Those three vectors together form the 3x3 rotation matrix used to transform from tangent space to world space. In other words, that’s half of the rotation you need!

If you want to output that from a shader to a texture, you could look into converting a rotation matrix into a quaternion so it can be packed into a single ARGBHalf. To transfer the tangent space normal map from one mesh to another you just need to transform it from tangent space into world space using that tangent to world rotation, then apply the new mesh’s world to tangent matrix. Really, you could just output the original mesh’s normals in world space rather than trying to output a rotation.

Also, if you want to convert from world to tangent space, just pass the same tangent, bitangent, and normal you would for doing tangent to world and apply the matrix to the vector in the “wrong” order.

That is instead of the usual:
half3 worldNormal = mul(tangentToWorld, tangentNormal);

Use:
half3 tangentNormal = mul(worldNormal, tangentToWorld);


Thanks for your answer, @bgolus !

I admit my lack of knowledge.
So, if I understood what you said, using my raycast system that finds corresponding points between a lowpoly mesh L and a highpoly mesh H, I can transfer normals from H to L in the following way:

- For each pixel (x,y) of the wanted normal map:
- Cast a ray from L to H
- Get H’s normal N at the hit point (this will be a world space normal)
- Write N in the pixel (x,y) of the output texture
- Save the map

Now that I have a world space normal map for L, I can use a shader that transforms all its normals from world space to tangent space using a tangent to world rotation matrix like this:


Shader "Custom/World2Tangent" {
   Properties {
       _WorldNormalMap ("World Normal Map", 2D) = "bump" {}
   }
   SubShader
   {
       Pass
       {
           Cull Off
           CGPROGRAM
           #pragma vertex vert
           #pragma fragment frag       
           #include "UnityCG.cginc"

           
           struct appdata
           {
               float4 vertex : POSITION;
               float2 texcoord : TEXCOORD0;
               float3 normal : NORMAL;
               float4 tangent : TANGENT;
           };

           
           struct v2f
           {
               float4 pos : SV_POSITION;
               float2 texcoord : TEXCOORD0;
               half3x3 tangentToWorld : TEXCOORD1;
           };

           
           sampler2D _WorldNormalMap;
           
           
           v2f vert (appdata v)
           {
               v2f o;
               o.pos = UnityObjectToClipPos(v.vertex);
               o.texcoord = v.texcoord;   
               
               half3 worldNormal = UnityObjectToWorldNormal(v.normal);
               half3 worldTangent = UnityObjectToWorldDir(v.tangent.xyz);
               half3 worldBitangent = cross(worldNormal, worldTangent);
               
               half3 tangentToWorld = half3x3(worldNormal, worldTangent, worldBitangent);
               
               return o;
           }

           
           half4 frag (v2f i) : SV_Target
           {
               half3 worldNormal = tex2D(_WorldNormalMap, i.texcoord);
               half3 tangentNormal = half4(mul(worldNormal, i.tangentToWorld), 1);
               return tangentNormal;
           }
           
           
           ENDCG
       }
   }
}

Is that right?

Thanks for the support!

Close.

That’s making a transposed rotation matrix, so it’s actually already the world to tangent matrix and there’s no reason to flip the mul order.

You need to multiply the bitangent by the v.tangent.w and the unity_WorldTransformParams.w to account for flipped UVs.

You need to actually pass the matrix to the fragment.
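
A sketch of those fixes applied to the vertex function (keeping the normal/tangent/bitangent row order from the post above):

half3 worldNormal = UnityObjectToWorldNormal(v.normal);
half3 worldTangent = UnityObjectToWorldDir(v.tangent.xyz);
// flip the bitangent for mirrored UVs (v.tangent.w) and negatively scaled transforms
half3 worldBitangent = cross(worldNormal, worldTangent) * v.tangent.w * unity_WorldTransformParams.w;
// rows dotted against a world-space vector yield its tangent-space components,
// so this transposed basis is already the world-to-tangent matrix
o.worldToTangent = half3x3(worldNormal, worldTangent, worldBitangent);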


I’ve made the suggested changes:


    Shader "Custom/World2Tangent" {
        Properties {
            _WorldNormalMap ("World Normal", 2D) = "bump" {}
        }
        SubShader
        {
            LOD 100
  
            Pass
            {
                Cull Off
  
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
          
                #include "UnityCG.cginc"
  
                struct appdata
                {
                    float4 vertex : POSITION;
                    float2 texcoord : TEXCOORD0;
                    float3 normal : NORMAL;
                    float4 tangent : TANGENT;
                };
  
                struct v2f
                {
                    float4 pos : SV_POSITION;
                    float2 texcoord : TEXCOORD0;
                    half3x3 worldToTangent : TEXCOORD1;
                };
  
                sampler2D _WorldNormalMap;
          
                v2f vert (appdata v)
                {
                    v2f o;
                 
                    //use uv space to output results on the screen
                    o.pos = float4(v.texcoord.x*2-1, -(v.texcoord.y*2-1), 0.5, 1.0);
                    o.texcoord = v.texcoord;
  
                    half3 worldNormal = v.normal;
                    half3 worldTangent = v.tangent;
                    half3 worldBitangent = cross(worldNormal, worldTangent) * v.tangent.w * unity_WorldTransformParams.w;
                 
                    //build the worldToTangent rotation matrix and output it
                    o.worldToTangent = half3x3(worldNormal, worldTangent, worldBitangent);
  
                    return o;
                }
           
                half3 frag (v2f i) : SV_Target
                {
                    half3 worldNormal = tex2D(_WorldNormalMap, i.texcoord);
                    half3 tangentNormal = mul(i.worldToTangent, worldNormal);
                    return tangentNormal;
                }
             
                ENDCG
            }
        }
    }

Then I baked world space normals from a highpoly model to a lowpoly model into this texture:

[image: the baked world space normal map]

And this is the result of the shader applied on the original lowpoly model:

[image: the shader’s output on the lowpoly model]

It doesn’t look like a correct tangent space normal map. What am I missing?

Thanks!

You’re reading a texture that appears to be a packed normal (-1 to 1 range remapped to 0 to 1.0) and not unpacking it, then writing out an unpacked normal. Basically you’re transforming an unnormalized vector that points in one general direction (positive xyz) and then clipping the results.

I would have expected the world normal texture to be ARGBFloat or ARGBHalf.
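
In code terms, the missing unpack/repack would be something like this (a sketch, assuming the world normal texture keeps the usual remapped 0..1 encoding):

half3 worldNormal = tex2D(_WorldNormalMap, i.texcoord).xyz * 2.0 - 1.0; // unpack [0,1] -> [-1,1]
half3 tangentNormal = mul(i.worldToTangent, worldNormal);
return tangentNormal * 0.5 + 0.5; // repack for an 8-bit target; a float format could skip both steps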