Humanoid pose animation leveraging Mecanim & Jobs

Hi all,

I’ve recently started a small project in which I’m aiming to combine things like pose blending (à la Overgrowth) and possibly full-body IK constraints later on.

To get started on the pose blending, I went with Mecanim’s Humanoid features, since they let me retarget the same poses across different humanoids. That’s the only type of character I’ll be focusing on for now.

I’ve made a little bit of progress, but my next step is using the Animation C# Jobs API to perform blending.

This is where I’ve hit a bit of a bumpy road, because the actual workings of the Mecanim system are quite a mystery to me, to be honest.

I’ll illustrate one example here, maybe someone else can chime in:

I’ve currently experimented with simply getting a HumanPose and applying it to another model, and it seems successful (not in a Job yet though). So there we have retargeting down.
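For reference, that non-job retargeting step is basically just two HumanPoseHandlers; a minimal sketch (the component and field names here are mine, just for illustration):

```csharp
using UnityEngine;

// Minimal sketch: copy the current humanoid pose from one character to another.
// "sourceAnimator"/"targetAnimator" are placeholder names for this example.
public class HumanPoseRetargeter : MonoBehaviour
{
    public Animator sourceAnimator; // humanoid rig to read from
    public Animator targetAnimator; // humanoid rig to write to

    HumanPoseHandler sourceHandler;
    HumanPoseHandler targetHandler;
    HumanPose pose;

    void Start()
    {
        // A HumanPoseHandler binds an Avatar to the transform hierarchy it drives.
        sourceHandler = new HumanPoseHandler(sourceAnimator.avatar, sourceAnimator.transform);
        targetHandler = new HumanPoseHandler(targetAnimator.avatar, targetAnimator.transform);
    }

    void LateUpdate()
    {
        // Read body position/rotation + muscle values from the source,
        // then let Mecanim retarget them onto the target rig.
        sourceHandler.GetHumanPose(ref pose);
        targetHandler.SetHumanPose(ref pose);
    }

    void OnDestroy()
    {
        sourceHandler?.Dispose();
        targetHandler?.Dispose();
    }
}
```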

The next step is to possibly bring this all into an AnimationJob. I notice we can act on an AnimationHumanStream via stream.AsHuman(), which gives me some functionality to read and write muscles with the Get/SetMuscle functions.
However, I don’t think I can directly set a HumanPose into an AnimationHumanStream; there’s only a function that acts on a specific muscle, not one that takes the whole array of muscle values from a HumanPose.

Any thoughts on how I can dump that pose into the animation stream?
If simply setting all the muscle values is not possible, how do I correctly map a HumanPose array to the individual muscle handles?

The next step, should I succeed in that, is to figure out whether it’s feasible to blend between two HumanPoses directly. Since the muscle values aren’t documented (as far as I know), I can’t tell if this is going to be easy or even possible.

That’s it for now, I’ll follow up with other stuff as I tinker with it.

Also, I hope you don’t mind this, but I’ll ping you @Mecanim-Dev

Made a bit of progress, still sketching things out.

Right now I haven’t worked on grabbing the required data in the editor as a pre-processing step. Instead, I run an animation, use a script to grab the desired poses, and dump them into a ScriptableObject type I made. These poses are then copied to the animation job.
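Roughly, the capture side looks like this (all the type and field names below are just illustrative, not the actual ones from my project, and the two classes would live in separate files in practice):

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative pose-storage asset.
[CreateAssetMenu(menuName = "Poses/Pose Library")]
public class PoseLibrary : ScriptableObject
{
    [System.Serializable]
    public struct StoredPose
    {
        public Vector3 bodyPosition;
        public Quaternion bodyRotation;
        public float[] muscles; // normalized muscle-space values from HumanPose
    }

    public List<StoredPose> poses = new List<StoredPose>();
}

// Attach to a humanoid, play an animation, and call Capture()
// (e.g. via the context menu) whenever it hits a pose worth keeping.
public class PoseCapture : MonoBehaviour
{
    public PoseLibrary library;

    Animator animator;
    HumanPoseHandler handler;
    HumanPose pose;

    void Awake()
    {
        animator = GetComponent<Animator>();
        handler = new HumanPoseHandler(animator.avatar, animator.transform);
    }

    [ContextMenu("Capture Pose")]
    public void Capture()
    {
        handler.GetHumanPose(ref pose);
        library.poses.Add(new PoseLibrary.StoredPose
        {
            bodyPosition = pose.bodyPosition,
            bodyRotation = pose.bodyRotation,
            muscles = (float[])pose.muscles.Clone(),
        });
    }

    void OnDestroy() => handler?.Dispose();
}
```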

I discovered that the muscles array I grab from the HumanPoseHandler can plug straight into the animation stream with SetMuscle. It’s done per individual muscle handle right now; it would be cool if there were a function to set all the muscle values in one go.
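The per-handle copy in the animation job looks roughly like this (a stripped-down sketch, not the exact code in the package below; the ordering of HumanPose.muscles happened to line up with the handles from MuscleHandle.GetMuscleHandles for me, so treat that as an assumption):

```csharp
using Unity.Collections;
using UnityEngine;
using UnityEngine.Animations;

// Sketch: push captured HumanPose data into the AnimationHumanStream,
// one muscle handle at a time (there is no bulk "set all muscles" call).
public struct ApplyHumanPoseJob : IAnimationJob
{
    [ReadOnly] public NativeArray<MuscleHandle> handles; // one handle per muscle DoF
    [ReadOnly] public NativeArray<float> muscles;        // values from HumanPose.muscles
    public Vector3 bodyPosition;
    public Quaternion bodyRotation;

    public void ProcessRootMotion(AnimationStream stream) { }

    public void ProcessAnimation(AnimationStream stream)
    {
        AnimationHumanStream human = stream.AsHuman();
        human.bodyPosition = bodyPosition;
        human.bodyRotation = bodyRotation;

        for (int i = 0; i < handles.Length; i++)
            human.SetMuscle(handles[i], muscles[i]);
    }
}

// Handle setup, done once on the main thread:
//   var handleArray = new MuscleHandle[MuscleHandle.muscleHandleCount];
//   MuscleHandle.GetMuscleHandles(handleArray);
//   handles = new NativeArray<MuscleHandle>(handleArray, Allocator.Persistent);
```

The playable itself is then created with AnimationScriptPlayable.Create and hooked up to an AnimationPlayableOutput on the Animator, same as any other animation job.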

I’m still facing a lil’ bit of trouble, but here’s what I have to show for now:

https://gfycat.com/wickedtintedkakarikis

You can see the trouble I meant: the pose resets to the fetal position (which means all muscle values = 0). It’s probably an error/issue on my part.

hi, that seems to be a still picture (doesn’t play for me)

So you use HumanPose and SetMuscle with animation jobs?
Could you share the code (the basics)? There’s not much material on this subject.

@bobadi yes, I basically do that.

Sorry the video isn’t playing; here’s a link to that embed: https://streamable.com/ovespf
However, that doesn’t seem to be working either, so here it is uploaded to another site: https://gfycat.com/wickedtintedkakarikis
I’ve updated the earlier post.

Let me prep the code and I’ll post it here. I’ll ping you.

I agree, there’s next to no info on this, but I thiiiiink I can get something workable… at least for the retargeting phase; maybe I can convert to normal Transform manipulation during a preprocessing phase?

this looks like just poses

Here’s a project in conventional (no-jobs) code that gets a HumanPose and plays it back in LateUpdate, but I don’t know if you can make any use of it.

Yup, it’s just poses. The goal is to make a good workflow for pose-based animation with this, with custom interpolations.

I’m trying to keep it jobified so it will at least support being run on separate threads, rather than being constrained to the main thread.
All this is mostly exploration.

By the way, the sample with code is attached; feel free to take a look, tell me what you think, and maybe spot the fix to the issue lol:

Made in Unity 2020.1.2f1, in a URP project; it depends on the Unity.Collections package!

6334392–703092–HumanPoseTests-v1.unitypackage (1.73 MB)

SILLY ME I was actually doing the whole interpolation thing COMPLETELY wrong. Apologies.

What I was doing in that sample was interpolating between the character’s current muscle value and the current pose’s muscle value.
What I should have been doing was interpolating between the last pose’s muscle value and the current pose’s muscle value.
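In code terms, the difference is just which endpoints go into the lerp (variable names here are only for illustration):

```csharp
// What the sample was doing (wrong): drifting from whatever the muscle happens to be right now.
blended[i] = Mathf.Lerp(currentMuscleValue[i], currentPose.muscles[i], t);

// What it should do: interpolate between two fixed keyframe poses.
blended[i] = Mathf.Lerp(lastPose.muscles[i], currentPose.muscles[i], t);
```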

I’ve made it work sensibly, now interpolating the body position/rotation as well, and it looks like a good start! All three characters are using poses I copied earlier from the blue bot.

Moving forward, I should look at mirroring (how would I accomplish this while staying in muscle space?), as well as perhaps respecting muscle limits and other configurable settings in a Mecanim humanoid rig. Hmmm.

Have a look:

https://gfycat.com/whisperedgoodiriomotecat

I checked out naive overshoot, but I don’t think it’ll work well with muscle-space values. I might be able to figure something out; maybe some more pre-processing to mask DoF (degrees of freedom) and use the muscle limits?
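For the muscle-limits idea, I’m assuming the normalized muscle values are mapped so that ±1 sits at the configured limit, so a first pass could just be a clamp (this is an assumption on my part, not something I’ve verified):

```csharp
using UnityEngine;

// Sketch, assuming ±1 in normalized muscle space corresponds to the muscle limit.
public static class MuscleLimitUtil
{
    // Cheap way to keep an overshooting interpolation inside the limits.
    public static void ClampToLimits(float[] muscles)
    {
        for (int i = 0; i < muscles.Length; i++)
            muscles[i] = Mathf.Clamp(muscles[i], -1f, 1f);
    }

    // The default angular ranges per muscle DoF are also queryable,
    // which could help when masking specific DoFs.
    public static void LogDefaultRanges()
    {
        for (int i = 0; i < HumanTrait.MuscleCount; i++)
            Debug.Log($"{HumanTrait.MuscleName[i]}: " +
                      $"{HumanTrait.GetMuscleDefaultMin(i)} to {HumanTrait.GetMuscleDefaultMax(i)}");
    }
}
```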

https://gfycat.com/easygoinglegitimatelemur

Update: I got mirroring working.

It’s done as a preprocess on the pose-storage ScriptableObject. There was an EXTREMELY helpful forum post where Mecanim-Dev posted some crucial info.

I realize I can save myself the trouble of the mirroring code when I add the actual functionality to grab HumanPoses from a sampled frame/time within an AnimationClip. That’ll have to come later though.

I’ll need to look at improving the interpolation quality next. Any idea how I can use an AnimationCurve within a Job?
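One idea I might try, if nothing better turns up: bake the curve into a NativeArray of samples up front and lerp between samples inside the job. A rough sketch of what I mean (names are placeholders, and the curve is assumed to be defined over [0, 1]):

```csharp
using Unity.Collections;
using UnityEngine;

// AnimationCurve can't be evaluated inside a job, so sample it ahead of time.
public struct BakedCurve
{
    public NativeArray<float> samples;

    // Bake on the main thread before scheduling any jobs that use the curve.
    public static BakedCurve Bake(AnimationCurve curve, int resolution, Allocator allocator)
    {
        var baked = new BakedCurve
        {
            samples = new NativeArray<float>(resolution, allocator)
        };
        for (int i = 0; i < resolution; i++)
            baked.samples[i] = curve.Evaluate(i / (float)(resolution - 1));
        return baked;
    }

    // Safe to call from inside a job; t is expected in [0, 1].
    public float Evaluate(float t)
    {
        float f = Mathf.Clamp01(t) * (samples.Length - 1);
        int i = (int)f;
        if (i >= samples.Length - 1)
            return samples[samples.Length - 1];
        return Mathf.Lerp(samples[i], samples[i + 1], f - i);
    }

    public void Dispose() => samples.Dispose();
}
```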

Once the interpolation is improved (if that’s possible), I think the pose interpolation side is basically done! Again, I’m proceeding warily because different interpolation methods may not play well with Mecanim’s muscle-space values.
If things pan out, maybe I’ll look at grabbing poses from AnimationClips and a neater editor workflow.

After that I think I’ll move on to simple secondary motion, then perhaps full-body IK and constraints.

Have a look at the current state:
https://gfycat.com/lividbarreniguana
Those are the original two poses I captured, plus the same two poses mirrored and appended, and I’m playing back through all four poses in sequence.


Did some work on improving interpolation. I think I have good results!

I used these resources as my starting points, since I am not very math-savvy myself:
Catmull-Rom interpolation:

Cubic interpolation:
https://www.paulinternet.nl/?page=bicubic
The code is a little messy, but I think the implementation is sound.

Have a look: the first character uses Mathf.Lerp(current, next, t), while the second character (the one the camera pans to) uses CatmullRom(previous, current, next, future, t). Right now I’m playing the animation in a loop, so all the poses “wrap around”. E.g. with four poses in the array (zero-based indices), current = 0 and next = 1 give previous = 3 and future = 2.
I imagine I’ll need to implement another approach for when there’s no loop; the cubic reference page mentioned some ideas I think are pertinent.
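Here’s the gist of the Catmull-Rom evaluation and the wrap-around index selection (a trimmed sketch of what I’m doing, per muscle value):

```csharp
using UnityEngine;

public static class PoseInterp
{
    // Standard uniform Catmull-Rom through p1..p2, with p0/p3 as outer control points.
    public static float CatmullRom(float p0, float p1, float p2, float p3, float t)
    {
        float t2 = t * t;
        float t3 = t2 * t;
        return 0.5f * ((2f * p1)
                     + (-p0 + p2) * t
                     + (2f * p0 - 5f * p1 + 4f * p2 - p3) * t2
                     + (-p0 + 3f * p1 - 3f * p2 + p3) * t3);
    }

    // Looping index selection: with 4 poses and current = 0, next = 1,
    // this yields previous = 3 and future = 2, matching the example above.
    public static void GetLoopingIndices(int current, int count,
                                         out int previous, out int next, out int future)
    {
        previous = (current - 1 + count) % count;
        next = (current + 1) % count;
        future = (current + 2) % count;
    }
}
```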
blacklimpingbighornedsheep

I did try comparing with cubic interpolation as well, but it seems extremely similar to Catmull-Rom; maybe someone more versed in math can explain the pros/cons of using one over the other? I’ll be using the Catmull-Rom interpolation from the link above for now, though.

I will also note another resource, one that I haven’t been able to wrap my head around yet:

It gives some pointers on how to improve Catmull-Rom (and it notes something about differences from cubic interpolation that I wasn’t fully able to understand); can anyone explain this to me as well?

Anyways, made good progress!

Still working on things; I need to iron out kinks and improve usability for now, so I can move on to new features.

Here’s an extra video :)

https://gfycat.com/soupylineardwarfrabbit

Did some extra tinkering.

I converted the work from an IAnimationJob to a normal IJobParallelFor. This means I removed all playables code, plus a lot of other extra code I had. I simply interpolate the poses in the job, then in LateUpdate I apply the whole pose result using HumanPoseHandler.
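Stripped down, that version looks something like this (a sketch with placeholder names, not the exact project code):

```csharp
using Unity.Collections;
using Unity.Jobs;
using UnityEngine;

// Blend muscle values in a plain parallel job (one iteration per muscle).
public struct BlendMusclesJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float> fromMuscles;
    [ReadOnly] public NativeArray<float> toMuscles;
    public NativeArray<float> result;
    public float t;

    public void Execute(int i)
    {
        result[i] = Mathf.Lerp(fromMuscles[i], toMuscles[i], t);
    }
}

public class PosePlayer : MonoBehaviour
{
    Animator animator;
    HumanPoseHandler handler;
    HumanPose pose;

    NativeArray<float> from, to, result;
    JobHandle jobHandle;

    void Awake()
    {
        animator = GetComponent<Animator>();
        handler = new HumanPoseHandler(animator.avatar, animator.transform);
        handler.GetHumanPose(ref pose); // fills pose.muscles with the right length

        from = new NativeArray<float>(pose.muscles, Allocator.Persistent);
        to = new NativeArray<float>(pose.muscles, Allocator.Persistent); // fill these from the stored poses
        result = new NativeArray<float>(pose.muscles.Length, Allocator.Persistent);
    }

    void Update()
    {
        float t = Mathf.PingPong(Time.time, 1f); // placeholder timing
        jobHandle = new BlendMusclesJob { fromMuscles = from, toMuscles = to, result = result, t = t }
            .Schedule(result.Length, 16);
    }

    void LateUpdate()
    {
        // Apply the whole blended pose in one go via HumanPoseHandler.
        jobHandle.Complete();
        result.CopyTo(pose.muscles);
        handler.SetHumanPose(ref pose);
    }

    void OnDestroy()
    {
        jobHandle.Complete();
        handler?.Dispose();
        from.Dispose(); to.Dispose(); result.Dispose();
    }
}
```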

From my “testing” (though not really clear-cut profiling), the IJobParallelFor version seems to be about 1 ms faster on average.

However, I lose the ability to plug into the animation stream, and to chain animation jobs too, I think.

Another thing I could try is a hybrid approach, which also seems sound: do the interpolation within a normal IJobParallelFor, then apply the pose inside an animation job. I’ll try this out.

In the animation job version, the full pose interpolation takes about 0.6 ms on average, while looping through and setting all the muscles takes about 0.2 ms.
I still feel it’d be nice if the animation stream provided a function to set the WHOLE muscles array in one go instead of going through individual handles; that would simplify things a bit.

I’ll also try to do more in-depth profiling later, just to see how the different approaches stack up.

@hsnabn Have you made any progress on this? I’m trying to do something similar.