I have a skinned mesh for a hard-edged character (think Minecraft or Crossy Road). My problem is that the shading/normals are incorrect in the deformed state. Illustrated below:
Think of this geometry as a “knee”:
I have searched the web, and people have mentioned this problem in the past, but I found no solutions. Does anybody have a solution or workaround for this?
using UnityEngine;

public class FixDeformations : MonoBehaviour
{
    public SkinnedMeshRenderer skinnedMeshRenderer;
    public MeshFilter bakedMeshFilter;

    Mesh bakedMesh;

    void Awake()
    {
        // The MeshFilter holds the copy we actually render.
        bakedMesh = bakedMeshFilter.mesh;
    }

    void Update()
    {
        // Snapshot the deformed skinned mesh, then rebuild its normals.
        skinnedMeshRenderer.BakeMesh(bakedMesh);
        bakedMesh.RecalculateNormals();
    }
}
I figured out a solution to the bone rig deformation problem. Basically, you need to create a bakedMesh from the skinned mesh every frame and recalculate normals on the bakedMesh. Set the skinnedMesh to be ignored by the camera so the bakedMesh is the only one seen.
I hope there is a more efficient way to get this done, but I suspect not.
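For the "ignored by the camera" part, here is a minimal sketch of one way to do it. The "HiddenFromCamera" layer name is an assumption; you'd create that layer yourself and untick it in the camera's Culling Mask.

using UnityEngine;

public class HideSkinnedSource : MonoBehaviour
{
    public SkinnedMeshRenderer skinnedMeshRenderer;

    void Start()
    {
        // Move the original skinned mesh onto a layer the camera doesn't render,
        // so only the baked copy in the MeshFilter is visible.
        skinnedMeshRenderer.gameObject.layer = LayerMask.NameToLayer("HiddenFromCamera");
    }
}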
Just curious - why are you using morphs on such a low-resolution character?
Would regular bones be a better solution?
My experience with morphs / blend shapes has been largely with organic fleshy characters, but a low-poly character (like Minecraft or Crossy Road) could almost be considered an inorganic animated character, like a robot or mech.
Genuinely interested in your response.
Actually I am using bones. Blendshapes were a second choice I was considering if I could get the normals to work with them. But the solution I posted above is for bones.
This is a total shot in the dark, but have you tried splitting the geometry in the knee? How does that look after being deformed by the bones? Any difference?
Yep, I tried that. As long as a polygon is deformed, its normals are not updated correctly. It doesn't matter whether it is connected to another polygon or not.
You probably want to use actual normals rather than geometry normals for lighting calculation. What is your shader? Tangent space normals should solve this issue entirely. Do you have tangents and standard shader applied?
I am struggling with this too… I have posted about it over on the UMA forum. And I have also found a function here that does a better job of RecalculateNormals than Unity seems to, but if there is some other way that involves 'tangent space normals' I'd really like some more info…
I have done a Google search for this and found nothing. I can't even find an explanation of what they are beyond how they are calculated, which doesn't enlighten me at all…
Any info or explanation you could give would be super helpful…
Just wanted to chime in to say I never figured out a solution. I did some Googling, but shader stuff is beyond my knowledge, so I’ve tabled this for now.
At some point I may try to contract an expert shader writer, like echologin, to make something. If anyone wants to go in ($) on purchasing a solution please PM me.
It totally is by design, and isn’t a shader issue.
This is a difference between real time and offline rendering, it is simply too expensive to recalculate normals every frame, and how real time deals with hard edges. You’ll have to rely on bone weighting and splitting edges to make this work.
So basic explanation: in real time, any edge that has a hard corner gets split up, and what was once one vertex in something like Blender or Max is now two (or more) vertices in real time. This means each vertex only needs one normal direction and nothing else, instead of needing to know its neighbors' positions to infer the normal, or having multiple normals stored, or having normals stored in the face, etc.
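You can see this split directly in Unity. A tiny sketch, assuming a default hard-edged cube assigned to meshFilter: it reports 24 vertices but only 8 unique positions, because each corner is duplicated once per face normal.

using UnityEngine;
using System.Collections.Generic;

public class CountSplitVertices : MonoBehaviour
{
    public MeshFilter meshFilter;

    void Start()
    {
        Mesh mesh = meshFilter.sharedMesh;
        Vector3[] verts = mesh.vertices;
        // Duplicated positions are what let each copy carry its own single normal.
        var uniquePositions = new HashSet<Vector3>(verts);
        Debug.Log("Vertex count: " + verts.Length + ", unique positions: " + uniquePositions.Count);
    }
}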
Where this comes to bite you, in this particular case, is that you have the vertex point of the "knee" 100% weighted to one bone. When this gets turned into a real-time skinned mesh there are now multiple vertices weighted to that bone and all being rotated. What you need to do is break the mesh up in your 3D modelling program along the seams, like it has to be for real time, and weight them to separate bones.
For smoothed out meshes this issue is less noticeable, but most AAA games will actually have several extra “dummy” bones that exist purely for dealing with issues of mesh deformation and bones. Usually they get built into the rig and aren’t ever animated directly. Even blend shapes have been almost entirely abandoned by most AAA games and are just replaced by a shit load of bones because it ends up being more efficient.
Thanks for the info, bgolus. It really helps. I actually tried splitting up the model, but didn’t think to assign the vertices to different bones.
I really wish Unity could have offered an answer like this. I filed a bug report, and was just told it was “by design”. No further workaround or insight. As a customer who has spent several thousand dollars, I was a bit underwhelmed.
I'm interested to hear that AAA games are using loads of bones rather than blendshapes - that's the approach that UMA is using and that's the path I am going down (blindly) too, so it's good to hear that's what the 'Big Guns' are doing too…
With regards to the normals issue though, I have found significant differences in the way MAX exports compared to the way Blender exports…
i.e. somehow MAX seems to export normals that are not 'fixed', in that if you deform a model exported from MAX it seems to get shadows/highlights that are correct on that deformed mesh, whereas when exporting from Blender the normals appear to be static - i.e. if you deform a mesh imported from Blender the shading on the model is always the same even if you change the geometry.
I understand how recalculating the normals is expensive, but I don't understand why models exported from MAX seem to just work better than models exported from Blender?
Every modeling tool deals with normals a little differently. I don’t know if the issue is with those programs or with how Unity chooses to import them.
You could get around the issue by just letting Unity calculate the normals, though depending on the way they’re exported it might smooth out some edges you don’t want smoothed.
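If you want to force that from code rather than clicking through each model's import settings, here's a hedged sketch using an asset postprocessor (drop it in an Editor folder; the 30-degree smoothing angle is just an example value):

using UnityEditor;

public class RecalculateNormalsOnImport : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        ModelImporter importer = (ModelImporter)assetImporter;
        // Ignore the exported normals and let Unity rebuild them on import.
        importer.importNormals = ModelImporterNormals.Calculate;
        // Edges sharper than this angle stay hard; raise it to smooth more edges.
        importer.normalSmoothingAngle = 30f;
    }
}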
This might shed some light on the subject: https://developer.blender.org/T46019 Apparently Blender has ‘…no way to access normals from a shape-key currently…’ It says it’s a ToDo, but no time frame given or found
Chiming in to say that there's a relatively cheap solution for this issue (which is common to all realtime skinning algorithms). Unity's built-in skinning would benefit enormously from this:
Instead of recalculating mesh normals from scratch every frame after bone deformation (which is correct, but slow, and does not respect hand-authored normals) or assuming linear blending of normals is good enough (fast, but incorrect), this approach precalculates and stores a bit of extra data in the mesh that allows for both fast and correct handling of normals for skinned meshes.
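The post doesn't spell out the technique, so here's only a rough sketch of one possible reading (not necessarily what's being described): precompute the triangle list once, then each frame rebuild normals from the baked, deformed positions yourself instead of calling RecalculateNormals. It still doesn't respect hand-authored normals, so treat it as an illustration of the precompute-then-reuse idea, not the full solution.

using UnityEngine;

public class FastSkinnedNormals : MonoBehaviour
{
    public SkinnedMeshRenderer skinnedMeshRenderer;
    public MeshFilter bakedMeshFilter;

    Mesh bakedMesh;
    int[] triangles;        // topology never changes, so fetch it once
    Vector3[] positions;
    Vector3[] normals;

    void Awake()
    {
        bakedMesh = bakedMeshFilter.mesh;
        skinnedMeshRenderer.BakeMesh(bakedMesh);
        triangles = bakedMesh.triangles;
        positions = bakedMesh.vertices;
        normals = new Vector3[positions.Length];
    }

    void Update()
    {
        skinnedMeshRenderer.BakeMesh(bakedMesh);
        positions = bakedMesh.vertices;   // allocates a copy; fine for a sketch

        // Accumulate area-weighted face normals into each vertex...
        for (int i = 0; i < normals.Length; i++) normals[i] = Vector3.zero;
        for (int t = 0; t < triangles.Length; t += 3)
        {
            int a = triangles[t], b = triangles[t + 1], c = triangles[t + 2];
            Vector3 face = Vector3.Cross(positions[b] - positions[a], positions[c] - positions[a]);
            normals[a] += face;
            normals[b] += face;
            normals[c] += face;
        }
        // ...then normalize. Split (hard-edge) vertices only see their own faces,
        // so hard edges stay hard.
        for (int i = 0; i < normals.Length; i++) normals[i] = normals[i].normalized;
        bakedMesh.normals = normals;
    }
}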