Super excited to experiment and play around with Kinematica, but I’ve run into a bit of a snag with creating assets. I’m attempting to use the Huge Mocap Library and the Invector-3rdPersonController_LITE that I found on the Asset Store to generate some Kinematica assets, just to try stuff out and see what yields good results, but I cannot for the life of me figure out how to retarget any mocap animations onto the rig I’m using. The bot will follow the trajectory dictated by Kinematica, but remains in a T-pose and stubbornly refuses to animate. I’ve tried the ‘Retargeting Source Avatar’ setting in every configuration I can think of (all humanoid, all generic, source humanoid but clip generic, etc.), to no avail.
I feel like I’m missing something obvious. I’ve added a couple of screenshots below to clarify my configuration. Any help or insight would be appreciated!
Unfortunately, Kinematica doesn’t support humanoid animations or retargeting. We tried to support it at some point (hence the “Retargeting Source Avatar” field that is still present), but it required a very awkward and cumbersome workflow (we needed to maintain both humanoid and non-humanoid versions of the rig, among other things).
Kinematica will support retargeting once it has been ported to DOTS, but for now we don’t have a good retargeting solution for non-DOTS projects, and it’s unlikely we’ll find one, for the aforementioned reasons.
How is it that Kinematica doesn’t support Humanoid but this asset does? Just curious to know what would cause this on the back-end side of things. They’re using similar tech, but I know Kinematica has a lot more going on.
Humanoid animations have several drawbacks compared to traditional transform animations: they can’t be sampled in Burst jobs, which makes sampling significantly slower, and they’re lossy (the fingers in particular have fewer degrees of freedom).
Since we cannot retarget them (efficiently enough) in Kinematica anyway, we decided not to support them for the moment. But this could change if we realize it’s a feature users actually need.
Most users aren’t doing mocap themselves (or don’t have the ability to) and want to do a quick prototype with the available free resources. Without humanoid support, there will have to be a lot of retargeting in another DCC app. It’s easier to use a humanoid setup with the various free mocap libraries, although to get the best results you will need dedicated mocap that’s suited to motion matching.
That’s a good point, we didn’t think too much about that. We have a tight roadmap, but we’ll see if we can add humanoid support at some point in the near future.
Nice! If it’s easier, a tool that converts animations to Generic would work just as well. That way you wouldn’t have to do work that might be removed later and that not many people might use. I say this because it seems DOTS Animation will be able to handle the retargeting whenever it’s ready.
To clarify: once Kinematica has been ported to DOTS, it will support retargeting AND humanoid animations?
Asking because many Unity users have invested heavily in Asset Store humanoid animation libraries, on the understanding that they will remain useful and reusable in the long term.