Shading goes all dark and wonky on nonuniform-scaled SkinnedMeshRenderer

I have a blob model that squelches around the level. It uses a bone to deform its shape (thus, needs a SkinnedMeshRenderer), but it also sometimes flattens down, which I do by scaling the whole model in Y (while leaving X and Z scale alone).

With a MeshRenderer, this looks great (but of course I lose the bone effect). But with a SkinnedMeshRenderer, as I shrink it down in Y, the shading becomes much too harsh and dark.

Here are two versions of the same model, at uniform scale: SMR on the left, MeshRenderer on the right. Both have the same material, and have Cast/Receive Shadows turned off. (Rendered with wireframe + shading so you can see the geometry.)

[Screenshot: both models at uniform scale, SMR on the left, MeshRenderer on the right]

Looks fine! But now here are the same models, at scale <1, 0.3, 1>:
[Screenshot: the same models at scale <1, 0.3, 1>]

The MeshRenderer model (right) looks correct: the shading is reduced, reflecting that what was essentially a hemisphere is now a flattened hemisphere. But on the left, the opposite has happened: the shading has gotten more extreme. This is incorrect and sticks out like a sore thumb in my level.

If we go further, say a Y scale of 0.05:
[Screenshot: the same models at a Y scale of 0.05]

Then the MeshRenderer one (right) has adopted uniform shading that would blend in with the floor (as it should). The SMR one (left) is now too dark overall, especially on the lower right half, though we seem to have a specular highlight right in the middle.

Note that the actual geometry drawn really is the same in both cases. Here’s a side view:
[Screenshot: side view of both flattened models]

Both flat as a pancake. Yet the SMR one has this extreme shading on it.

What’s going on here, and how can I make it stop? (I need to use SkinnedMeshRenderer for the bone animation, so I can’t just switch to MeshRenderer.) This is using the standard render pipeline, if it matters.

After much searching I found this thread:

…which seems to be the same problem, though I’m not convinced everyone in that thread actually understands what’s going on. Or maybe my issue is different.

In any case it sounds as though, as an optimization, the SkinnedMeshRenderer is not recalculating the normals. Pity that MeshRenderer can do it correctly, but SMR can’t.

The final post in that thread claimed to have found a solution, but I can’t understand it. So, still looking for help.

OK, though I still don’t understand it, I copied that solution myself and it sort of works. The suggestion was to make a shader like this:
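
Roughly like this, anyway; what follows is a simplified sketch rather than the exact code from that thread (the shader name and _Color property are mine, and on some platforms the cross-product order may need flipping). The idea is to ignore the mesh normals entirely and rebuild a flat normal per fragment from screen-space derivatives of the world position:

Shader "Custom/DerivativeFlatNormals"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        Pass
        {
            Tags { "LightMode"="ForwardBase" }
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma target 3.0
            #include "UnityCG.cginc"
            #include "UnityLightingCommon.cginc"

            struct v2f
            {
                float4 pos      : SV_POSITION;
                float3 worldPos : TEXCOORD0;
            };

            fixed4 _Color;

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Ignore the (mangled) skinned normals and rebuild a face normal
                // from how the world position changes between neighbouring
                // fragments. The result is constant per triangle, hence flat.
                float3 n = normalize(cross(ddy(i.worldPos), ddx(i.worldPos)));

                // Very simple diffuse + ambient lighting, just enough to see the effect.
                fixed ndotl = saturate(dot(n, _WorldSpaceLightPos0.xyz));
                fixed3 lighting = ndotl * _LightColor0.rgb + ShadeSH9(half4(n, 1));
                return fixed4(_Color.rgb * lighting, 1);
            }
            ENDCG
        }
    }
}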

And when I apply that to my two test pieces, I get:

[Screenshot: the two squashed models with that shader applied]

So now the shading on the SMR (left) looks the same as with the MeshRenderer (right), and it's approximately correct, but it's no longer smooth; it's faceted. I think this shader is just deriving a normal at each fragment from how the interpolated world position changes between neighboring fragments, which is constant within each triangle. To get smooth shading we'd need to correct the normals at the vertices instead, and let them interpolate across each triangle as usual.

My guess would be that the skinned mesh path isn't transforming the normals by the inverse-transpose. That's the classic symptom you get with non-uniform scaling when that step is skipped.
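
To spell that out with your <1, 0.3, 1> example: for a pure scale, the inverse-transpose is just one over each scale factor, so the two candidate transforms for a normal are

$$
M = \operatorname{diag}(1,\ 0.3,\ 1), \qquad (M^{-1})^{\mathsf{T}} = \operatorname{diag}(1,\ 1/0.3,\ 1).
$$

Take a normal pointing 45° out the side of the dome, $n = (0.71,\ 0.71,\ 0)$:

$$
\frac{(M^{-1})^{\mathsf{T}} n}{\lVert (M^{-1})^{\mathsf{T}} n \rVert} \approx (0.29,\ 0.96,\ 0), \qquad
\frac{M n}{\lVert M n \rVert} \approx (0.96,\ 0.29,\ 0).
$$

The first leans toward vertical, which is what the flattened surface actually needs; the second leans toward horizontal, which matches the over-dark, exaggerated shading you're seeing on the SMR.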

You’ve already analyzed it pretty well. Can’t really add more to it, sorry.

I managed to work around it with a custom shader that includes a “fudge factor” parameter used to reduce the X and Z components of the vertex normals (making them more vertical). I apply this in proportion to how much I squash the blob. It looks good enough for our purposes.
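
In case it helps anyone else, the relevant part looks roughly like this (a sketch rather than our exact shader; _Squash is the fudge-factor property, 0 when unsquashed, and it gets driven toward 1 by the same code that sets the Y scale, via Material.SetFloat):

Shader "Custom/SquashFudgedNormals"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
        _Squash ("Squash Amount", Range(0, 1)) = 0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert

        struct Input
        {
            float4 color : COLOR;   // unused, but Input can't be empty
        };

        fixed4 _Color;
        float _Squash;

        void vert (inout appdata_full v)
        {
            // Fudge: shrink the X and Z components of the vertex normal in
            // proportion to how squashed the blob is, pushing the normal
            // toward vertical, then renormalize.
            v.normal.xz *= (1.0 - _Squash);
            v.normal = normalize(v.normal);
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            o.Albedo = _Color.rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}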

But it wouldn’t work in a more general case. It’s amazing to me that SMR doesn’t have an option to adjust the normals when the scale is nonuniform.
