Is there a way to get the animation transform for vertices of a skinned mesh?

I'm working on a shader for silhouette rendering (details here). I've got it working on meshes that don't have an armature, i.e. FBXs that import as a MeshFilter rather than a SkinnedMeshRenderer.

Silhouettes on animated models render as nonsense - and I'm pretty sure it's because I need to transform the surface normals I have packed into extra UV channels so they match the animated vertex positions. I know the vertex position arrives in the shader already transformed per the animation state. Is there any way to get the matrix used to do that, so I can apply it to the normals I've sent along?

Nope. The common hack is to store the normals as, well, normals. Or to pack them into the mesh’s tangent (at the cost of being able to use tangent space normal maps).
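If you go the tangent route, it's just a matter of overwriting the mesh's tangents with the normal you want to smuggle in. Rough sketch, with smoothedNormals standing in for whatever you've precomputed per vertex:

// sketch: pack one extra (e.g. smoothed) normal into the tangent channel
// smoothedNormals is assumed to be computed elsewhere, one entry per vertex
Vector4[] tangents = new Vector4[mesh.vertexCount];
for (int i = 0; i < mesh.vertexCount; i++)
{
    Vector3 n = smoothedNormals[i];
    tangents[i] = new Vector4(n.x, n.y, n.z, 1f); // w is normally the bitangent sign, unused here
}
mesh.tangents = tangents;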

The other solution I saw was to store the packed smooth normals themselves in the tangent space of each vertex. Then you can unpack them and transform them back out into the correctly animated space. Someone posted this technique on the forum a few months ago, but I can't find the thread.

Well, that's unfortunate. There are 3 normals per vertex (2 for the adjacent faces, 1 for a smoothed vertex normal) that I'm sending for the silhouette pass, beyond the ...normal... normal (may be smoothed or hard, it's whatever got imported with the mesh) that gets used for the usual render pass. I'm not sure what you mean by 'pack' them into the tangent, but I'm guessing I'm not going to get all 3 normal values in there in a way that will transform correctly - though maybe my assumption is way off. I only dabble in graphics/shader programming, so I have a limited idea of what's possible.

Searching for the link to the article I posted above, I stumbled upon some other stuff that uses the geometry shader to generate, at runtime, all the data I'm currently pre-computing via an AssetPostprocessor. Assuming Unity has a way to write geometry shaders, and the vertex and normal already have their animated values at that point in the pipeline... that might be a viable alternative.

Thanks for the reply though! Definitely learning a lot from this project.

The geometry shader implementations you're going to see all use the geometry shader's adjacency data. Unity does not support adjacency data in the geometry shader, nor does any other real-time game engine, FYI. It's purely a thing that academics play with and sees no real-world usage. You're going to have to stick with your asset postprocessor.

For a single extra normal, you just set the tangent vector to the normal you want to use. But for 3 extra normals you'll want to use the tangent space trick I mentioned - that's really the only one that'll work with Unity's skinned meshes. In script you'll want to take the normal and tangent of each vertex you're encoding into, use those to construct a matrix, get its inverse, and then transform your normals into tangent space with that inverse. Something like this:
C# code:

// extract the normal & tangent from the vertex data, calculate the bitangent
Vector3 normal = mesh.normals[currentIndex];
Vector4 tangentAndBitangentSign = mesh.tangents[currentIndex];
Vector3 tangent = tangentAndBitangentSign; // implicit cast to Vector3
Vector3 bitangent = Vector3.Cross(normal, tangent).normalized * tangentAndBitangentSign.w;

// construct the tangent to object space matrix, and its inverse
Matrix4x4 tangentToObject = Matrix4x4.identity; // start from identity so the 4th row & column stay valid for the inverse
tangentToObject.SetRow(0, tangent); // note, these might need to be SetColumn(). I forget which one is correct
tangentToObject.SetRow(1, bitangent);
tangentToObject.SetRow(2, normal);
Matrix4x4 objectToTangent = tangentToObject.inverse;

// apply the object to tangent space transform to the object space normals and store them
// (append into per-channel lists here; they get written to the mesh with mesh.SetUVs once the loop is done)
uv2.Add(objectToTangent.MultiplyVector(adjacentNormalA));
uv3.Add(objectToTangent.MultiplyVector(adjacentNormalB));
uv4.Add(objectToTangent.MultiplyVector(smoothedNormal));
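For context, that per-vertex snippet is assumed to sit inside a loop over the mesh, something like this (list names are just placeholders, needs System.Collections.Generic and UnityEngine):

// sketch of the surrounding loop, run from an AssetPostprocessor or editor script
var uv2 = new List<Vector3>(mesh.vertexCount);
var uv3 = new List<Vector3>(mesh.vertexCount);
var uv4 = new List<Vector3>(mesh.vertexCount);
for (int currentIndex = 0; currentIndex < mesh.vertexCount; currentIndex++)
{
    // ... per-vertex code from above, appending to uv2 / uv3 / uv4 ...
}
mesh.SetUVs(2, uv2);
mesh.SetUVs(3, uv3);
mesh.SetUVs(4, uv4);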

Shader code:

float3 normal = v.normal.xyz;
float3 tangent = v.tangent.xyz;
float3 bitangent = normalize(cross(normal, tangent)) * v.tangent.w;
float3x3 tangentToObject = float3x3(tangent, bitangent, normal);
float3 objectSpaceSmoothedNormal = mul(tangentToObject, v.texcoord4.xyz); // or wherever you've packed the smooth normal
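For reference, the vertex input struct this assumes would look roughly like the below - the struct and field names are just placeholders, but the semantics match the UV channels set in the C#:

struct appdata
{
    float4 vertex : POSITION;
    float3 normal : NORMAL;
    float4 tangent : TANGENT;
    float3 texcoord2 : TEXCOORD2; // adjacent face normal A, in tangent space
    float3 texcoord3 : TEXCOORD3; // adjacent face normal B, in tangent space
    float3 texcoord4 : TEXCOORD4; // smoothed normal, in tangent space
};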

Well, that certainly saved me a ton of time digging into the geometry shader stuff. I’ll go give this a shot. Thanks so much for taking the time to write that all out.

Got this all set up and working. Just an FYI for anyone else who runs into this thread - the matrices in the code above are reversed from what they need to be.

On the shader side, you can do

    TANGENT_SPACE_ROTATION;
    float3x3 tangentToObject = transpose(rotation);

to get your tangent space to object space matrix.
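For anyone copying that: the macro expects the vertex input to be named v and to have normal and tangent fields, so in context it looks roughly like this (struct and output field names here are just my own):

    v2f vert (appdata v)
    {
        v2f o;
        TANGENT_SPACE_ROTATION; // declares 'binormal' and a float3x3 named 'rotation'
        float3x3 tangentToObject = transpose(rotation);
        o.smoothedNormal = mul(tangentToObject, v.texcoord4.xyz);
        // ... rest of the vertex function ...
        return o;
    }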

On the C# side, just leave out the inverse step.
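So the packing side ends up being roughly this (same variable names as bgolus' snippet above):

// same rows as before, but applied directly - no inverse
Matrix4x4 objectToTangent = Matrix4x4.identity;
objectToTangent.SetRow(0, tangent);
objectToTangent.SetRow(1, bitangent);
objectToTangent.SetRow(2, normal);
uv4.Add(objectToTangent.MultiplyVector(smoothedNormal)); // and likewise for the other channels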

bgolus, again, thanks so much.

They’re transposed, not reversed / inverted. That was the point of the comment in the C# about maybe needing SetColumn instead of SetRow. The transpose and the inverse of an orthogonal matrix (all axes perpendicular to each other) are the same thing, but a tangent space matrix isn’t guaranteed to be orthogonal. It’s totally valid for the normal and tangent to be at an odd angle to each other, though things do go kind of bad if they’re parallel…

However, you’re absolutely correct that the shader should be using the transposed matrix. Really the above C# should be changed to this:

Matrix4x4 tangentToObject = Matrix4x4.identity;
tangentToObject.SetColumn(0, tangent);
tangentToObject.SetColumn(1, bitangent);
tangentToObject.SetColumn(2, normal);
Matrix4x4 objectToTangent = tangentToObject.inverse;

And in the shader you can just do this:

float3 objectSpaceSmoothedNormal = mul(tangentSmoothedNormal, tangentToObject);

Flipping the order of the mul automatically applies the matrix transposed.
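In other words, these two lines are equivalent:

float3 a = mul(tangentSmoothedNormal, tangentToObject);            // vector first = treated as a row vector
float3 b = mul(transpose(tangentToObject), tangentSmoothedNormal); // same thing, written with an explicit transpose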

Random tidbit: the TANGENT_SPACE_ROTATION macro makes a transposed tangent to object space matrix, and gets used by Unity to convert the view or light direction into tangent space. It’s also often wrong for the aforementioned reason of the transpose not being the same as the inverse, but not wrong enough for anyone to notice for the last decade…
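For reference, the macro in UnityCG.cginc is roughly:

#define TANGENT_SPACE_ROTATION \
    float3 binormal = cross( normalize(v.normal), normalize(v.tangent.xyz) ) * v.tangent.w; \
    float3x3 rotation = float3x3( v.tangent.xyz, binormal, v.normal )
// rows are tangent, binormal, normal, so mul(rotation, someObjectSpaceVector) moves it into tangent space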