I’m having a bit of a rough time coming up with a good way of getting animations ready for games, so I thought I’d see how other people generally handle it and whether I can pick up some good habits. Please feel free to go into length about the ways you’ve streamlined the process for yourselves.
Any answer will also help future members of this community.
1: I rig my characters using Blender’s Rigify. It’s a modular rig system: you first construct a “metarig” by adding any number of arms, legs, tentacles, spines, heads and such, then you hit “Generate” and it creates a very convenient rig with FK/IK switching etc.
2: I don’t let Unity import the .blend file directly, because then all the helper meshes end up in Unity. Instead, I keep my .blend files in subdirectories called “~” (Unity ignores folders whose names end in “~”) and export only to .fbx from within Blender. That way I can export just the Armature (skeleton) and selected meshes, and limit the export to deformation bones.
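For context, a hypothetical project layout following this convention (the folder and file names are just examples — the point is that the trailing “~” hides the source files from Unity):

```
Assets/
└── Characters/
    └── Hero/
        ├── Hero.fbx      <- exported from Blender; this is what Unity imports
        └── blend~/       <- trailing "~" makes Unity skip this folder
            └── Hero.blend
```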
3: For each set of animations, I create a new .blend file that links the involved characters from their source .blend files (to avoid duplication). This keeps my animations separate and speeds up Unity imports, because only a subset of my animations needs to be reimported when I change something. It also makes paired animations involving multiple characters easy.
4: I use a Python script to export the animations to FBX. This is necessary because Blender’s flipping FBX exporter is moody as heck, so my script loads the original character .blend file, creates a dummy mesh, appends the animations from my animation .blend and exports only the dummy mesh and the animations. The result: small FBX files, and animations can easily be split into individual FBX files if needed (UE4…).
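A minimal sketch of a script along those lines — the file paths and function names are my assumptions, not the exact script, and the dummy-mesh step is left out for brevity. The `bpy` import is guarded so the pure helpers also run outside Blender:

```python
# bpy only exists inside Blender; guard the import so the pure
# path helper below can run (and be tested) anywhere.
try:
    import bpy
except ImportError:
    bpy = None

import os

def fbx_path(out_dir, action_name):
    # One FBX per animation clip, named after the Blender action.
    return os.path.join(out_dir, action_name + ".fbx")

def export_actions(character_blend, animation_blend, out_dir):
    # Load the clean character file, pull the actions in from the
    # animation file, then export each action as its own small FBX.
    bpy.ops.wm.open_mainfile(filepath=character_blend)
    with bpy.data.libraries.load(animation_blend) as (src, dst):
        dst.actions = src.actions  # append all actions from the animation file
    for action in bpy.data.actions:
        bpy.ops.export_scene.fbx(
            filepath=fbx_path(out_dir, action.name),
            use_selection=True,              # export only what is selected
            use_armature_deform_only=True,   # skip helper/control bones
            bake_anim=True,
        )

if bpy is not None:  # only run the export when inside Blender
    export_actions("hero.blend", "hero_animations.blend", "export")
```

Run it headless with something like `blender --background --python export_anims.py` so the export can be part of a build step.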
As for creating the animations themselves, I use Blender’s Action Editor with manually placed keyframes almost exclusively. Scrubbing, instant playback (Alt+A), clean separation of animation clips, it exports nicely, and I can check transitions with Blender’s NLA editor.
Blender can overlay videos in the viewport, and they scrub with the animation timeline too — so for walk cycles etc. I just put a video reference above or below the character and use it for the timing and initial poses.
For some reason, Blender’s viewport has horrible performance. So it’s important to enable frame dropping in the timeline or the animations will play in slow motion.
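If you drive Blender from a script anyway, that playback setting is one line. A hedged sketch — `bpy` only exists inside Blender, so the import is guarded:

```python
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

# 'FRAME_DROP' tells playback to skip frames rather than slow down,
# so animations preview at real-time speed even in a slow viewport.
SYNC_MODE = "FRAME_DROP"

if bpy is not None:
    bpy.context.scene.sync_mode = SYNC_MODE
```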
To avoid robotic animations, always think of momentum: nothing starts or stops instantly. The second important trick is to not line up your keyframes neatly. If a character swings his/her arms when walking, they will be slightly out of sync with the legs, and that little bit of timing often makes all the difference. Third is balance. Example: when a character starts moving, his/her torso moves first, “falling” towards the direction the character wants to walk in; only then does the leg follow. Lots of little details, but these are best learned by doing.
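The “don’t line your keyframes up” trick can even be applied as a tiny scripted post-pass. A pure-Python sketch under my own assumptions (the frame numbers and the wrap-around for a looping cycle are illustrative, not anyone’s actual pipeline):

```python
def offset_cycle_keys(keys, offset, cycle_len):
    """Shift one limb's keyframes by a few frames, wrapping within a
    looping cycle, so e.g. the arms land slightly out of sync with the legs."""
    return sorted(((frame + offset) % cycle_len, value) for frame, value in keys)

# Hypothetical example: leg contacts sit on frames 0 and 12 of a
# 24-frame walk cycle; delay the arm swing by 2 frames relative to them.
arm_keys = [(0, 1.0), (12, -1.0)]
print(offset_cycle_keys(arm_keys, 2, 24))  # [(2, 1.0), (14, -1.0)]
```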
Principles of Animation should be referenced while animating. Know when they can be applied, and when they can’t (because of real-time cycles).
I like to animate in passes. The first pass is blocking: I use stepped or linear key tangents, and I’m only concerned with poses at this point. The second pass is refining & tweening: I usually swap all key tangents for linear or smooth keys to get motion from each pose into the next. Don’t underestimate moving holds here for anticipation and high-speed action. The third pass is polish: refine the keys and change to bezier tangents to control the easing into and out of each key. Cloth and FX are worked on last.
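In Blender terms, each pass roughly maps to a keyframe interpolation mode, and swapping every tangent of an action in one go is scriptable. A hedged sketch — the pass names are mine, and `bpy` only exists inside Blender:

```python
try:
    import bpy
except ImportError:
    bpy = None

# Blender keyframe interpolation enum value for each animation pass.
PASS_INTERPOLATION = {
    "blocking": "CONSTANT",  # stepped keys: pure poses, no in-betweens
    "tweening": "LINEAR",    # straight motion from pose to pose
    "polish":   "BEZIER",    # eased tangents for the final pass
}

def set_pass(action, pass_name):
    # Swap the interpolation of every keyframe on every F-curve at once.
    mode = PASS_INTERPOLATION[pass_name]
    for fcurve in action.fcurves:
        for key in fcurve.keyframe_points:
            key.interpolation = mode

if bpy is not None:  # apply to the active object's current action
    set_pass(bpy.context.object.animation_data.action, "blocking")
```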
One character may take 5-10 hours to animate. If there are two characters interacting with each other, multiply the time for animating one character by four.
I always work on morphs and facial animation last on a character. They are like particle systems: a huge time sink that can be played with for hours upon hours without much visible progress.
I animate facial expressions and phonemes/visemes in passes, the same way I do full character animations. I treat the face as a completely separate “character” that needs its own animation passes, to give time and attention to the extra details.
Over-exaggerate all poses, expressions, wind-ups, follow-through and overlapping motion as often as you can. Realistic animations - even mocap - still look fake because the motions aren’t exaggerated enough. Pulling an animation back from over-exaggeration is a lot simpler than trying to squeeze more exaggeration in between key poses.
When creating morphs, it’s OK to create extreme facial features and mouth shapes with overlapping mesh pieces, like the lips and eyelids. The morphs will rarely be animated to their exact extremes - and might need that slight exaggeration to show the lips actually compressing together and the eyes closing all the way.
If time permits, exporting each animation as a separate FBX file is good future-proofing for any animation editing needed down the line.