Ragdolls need a mind of their own to look real; dropping them limp and lifeless at the slightest nudge to a Capsule Collider on a Rigidbody looks far from convincing. Running over people, crashing a motorcycle, pushing NPCs off a ledge, or jumping down a flight of stairs to see if the game has fall damage might seem like a grim thought at first, but in reality, these actions test the depth of a game.
An obvious answer to increasing the interactivity of a game would be to use more animations for tasks like picking up objects from the ground or dancing in the street for fun. This may work for deliberate, scripted movement, but adding more raw animations for physics-based movement just won't cut it. Frankly, animators cannot animate the hundreds of motions that depend on where the character was hit last and on every subsequent hit in that combination. Nor does an animation script running into thousands of lines of Animator.Play() calls sound like an appealing idea to programmers.
This is where active ragdolls shine! These procedurally generated animations automate the painstakingly difficult task of animating every muscle by hand. Take a look at what blend trees, parameter-driven variables, and randomized algorithms can generate:
These movements were put together by loosely mimicking a series of animations while maintaining the physicality of a ragdoll held together by joints. They were inspired by David Rosen's GDC talk, Animation Bootcamp: An Indie Approach to Procedural Animation, in which he explains how simple procedural techniques can produce interactive, fluid animation. We have tried replicating those pointers in our asset, Simple Motorcycle Physics, and the outcome has been convincing.
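To make the core trick concrete, here is a minimal sketch of one limb being driven towards an animated pose, assuming a hidden Animator-driven reference skeleton and a ConfigurableJoint on each ragdoll bone. The LimbDriver component and its names are ours for illustration, not code from the asset, and the joint-space conversion is simplified by assuming the joint axes are left at their defaults:

```csharp
using UnityEngine;

// Drives one ragdoll limb towards the pose of the matching bone on a hidden,
// kinematic, Animator-driven skeleton. Attach to each jointed ragdoll bone.
// Hypothetical component for illustration.
public class LimbDriver : MonoBehaviour
{
    [Tooltip("Matching bone on the kinematic, Animator-driven skeleton.")]
    public Transform animatedBone;

    private ConfigurableJoint joint;
    private Quaternion initialLocalRotation;

    void Awake()
    {
        joint = GetComponent<ConfigurableJoint>();
        // targetRotation is expressed relative to the joint's starting
        // rotation, so cache the local rotation once at startup.
        initialLocalRotation = transform.localRotation;
    }

    void FixedUpdate()
    {
        // Simplified joint-space conversion: assumes the joint's axis and
        // secondary axis were left at their defaults. The joint's slerp
        // drive then applies torque, so the limb follows the animation
        // loosely while still reacting physically to collisions.
        joint.targetRotation =
            Quaternion.Inverse(animatedBone.localRotation) * initialLocalRotation;
    }
}
```

How stiffly the limb tracks the animation is set by the joint's slerp drive: high spring values give crisp, animation-like motion, while low values let impacts dominate, which is exactly the blend that makes active ragdolls read as alive.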
However, the concept of active ragdolls is certainly not novel, and companies like NaturalMotion have already collaborated with Rockstar Games to build an AI-based engine for their characters: the Euphoria engine. Its interactive ragdoll simulations look as if mocap data were being projected through the screen. This realism comes at an extremely steep cost: deep embedding in game systems and a generally restricted, proprietary code base integrated at NaturalMotion's discretion.
In traditional ragdolls, movements sometimes jitter and lag, and limbs get caught in terrain, shake violently on the floor, or stretch out and separate. The current PhysX-based model still has a few drawbacks, with occasional inaccuracies in joint projection and the odd dislocation, but the simulation comes together nicely once the project settings are tweaked properly. Clamping the initial velocities correctly solves most of the projection problems and stops the physics solver from being overworked. Moreover, because the approach uses a general set of animations rather than specialized AI, it could in theory be extended to quadruped or even multiped skeletons.
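As an example of that clamping, a component along these lines could run the moment the ragdoll is switched on; the thresholds here are illustrative assumptions to tune per project, not values from the asset:

```csharp
using UnityEngine;

// Clamps the velocities the ragdoll rigidbodies inherit at activation so the
// solver never starts from an extreme state, which is what tends to trigger
// joint projection errors and dislocated limbs. Illustrative sketch.
public class RagdollVelocityClamp : MonoBehaviour
{
    public float maxLinearVelocity = 10f;   // m/s, tune per project
    public float maxAngularVelocity = 20f;  // rad/s, tune per project

    // Call this right after enabling the ragdoll rigidbodies.
    public void ClampOnActivate()
    {
        foreach (var rb in GetComponentsInChildren<Rigidbody>())
        {
            rb.velocity = Vector3.ClampMagnitude(rb.velocity, maxLinearVelocity);
            rb.maxAngularVelocity = maxAngularVelocity;
            if (rb.angularVelocity.sqrMagnitude > maxAngularVelocity * maxAngularVelocity)
                rb.angularVelocity = rb.angularVelocity.normalized * maxAngularVelocity;
        }
    }
}
```

Raising the solver iteration counts in the Physics project settings works in the same direction, trading a little CPU time for fewer dislocations.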
Both simulation types can come really close because they share the same basic biped kinematic design. On the left of the comparison images below is the AI-mediated Euphoria ragdoll physics; on the right, our blend-tree, parameter-controlled, randomized-algorithm ragdoll physics. Even though self-preservation, self-righting, and spatial awareness are missing from the latter, tweaking the probabilities of the blended animations lets us imitate those features.
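A sketch of what that probability tweaking might look like, assuming a blend tree fed by two hypothetical parameters named Brace and Flail; the names, interval, and bias are all placeholders for illustration:

```csharp
using UnityEngine;

// Periodically re-randomizes the parameters feeding a blend tree, with the
// weights biased so protective poses win more often, which reads as crude
// self-preservation. All names and values are hypothetical.
public class RandomizedBlend : MonoBehaviour
{
    public Animator animator;
    public float retargetInterval = 0.4f;          // seconds between new targets
    [Range(0f, 1f)] public float braceBias = 0.7f; // chance of a protective pose

    private float timer;

    void Update()
    {
        timer -= Time.deltaTime;
        if (timer > 0f) return;
        timer = retargetInterval;

        // Weighted coin flip: usually blend towards bracing so the ragdoll
        // appears to protect itself, occasionally flail for variety.
        bool brace = Random.value < braceBias;
        animator.SetFloat("Brace", brace ? Random.Range(0.6f, 1f) : Random.Range(0f, 0.3f));
        animator.SetFloat("Flail", brace ? Random.Range(0f, 0.4f) : Random.Range(0.5f, 1f));
    }
}
```

Raising braceBias pushes the blend towards the self-preserving clips, which is the knob referred to above.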
We are glad to share these results, look forward to queries and suggestions, and hope to see more games use these nifty active ragdolls to drastically increase their interactivity.