Please forgive me if this is a trivial question, but I am looking for a solution that will adjust the scale and position (and maybe rotation?) of the bones based on blendshapes.
So, if I have a base character avatar, and I want to apply blendshape(s) that dramatically change the scale of its limbs, then this solution would adjust the bones accordingly.
Does such a solution already exist? I’m sure I could write a script, but would rather find an out-of-the-box solution. It seems like a common enough thing.
“blendshape” usually refers to a tweened vertex animation.
And they’re driven by an external parameter.
So you CAN’T drive something by a blendshape, because blendshapes are driven by something already. Usually by your script, or by animation/animation curve and so on.
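To make that concrete, here is a minimal sketch in plain Python (not Unity C#; the function and variable names are made up) of what a blendshape fundamentally is: per-vertex deltas added on top of the base mesh, scaled by an externally supplied weight.

```python
# A blendshape is a set of per-vertex deltas blended onto the base mesh
# by an external weight -- the weight itself is the "driver".
def apply_blendshape(base_vertices, deltas, weight):
    """Return deformed vertices; `weight` is the external parameter (0..1)."""
    return [
        (x + weight * dx, y + weight * dy, z + weight * dz)
        for (x, y, z), (dx, dy, dz) in zip(base_vertices, deltas)
    ]

base = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
deltas = [(0.0, 1.0, 0.0), (0.0, 2.0, 0.0)]
print(apply_blendshape(base, deltas, 0.5))  # halfway toward the target shape
```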
So. If you want to control the state of the model by an equivalent of a driver, you can do that, but chances are you’d need to script the “drivers” yourself during import of the model.
To script behavior, you’d need to override either OnAnimatorMove or OnAnimatorIK.
Thx for the info, neginfinity. To clarify, I’m not looking for a real-time solution. The solution I was envisioning is something like this…
On startup, for each character, we traverse all of the bones and look up the vertex weights, calculate the vertex positions before and after the blendshapes (via weighted averages), then deduce the new scale and position of each bone. Rotations could be done too, but that may be tricky, and probably isn't necessary for the purpose of scaling different body types.
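A rough sketch of that startup pass, in plain Python with hypothetical names, assuming each bone is represented by the weighted centroid of the vertices it influences and that a single uniform scale estimate (from the weighted spread of those vertices) is enough:

```python
def weighted_centroid(vertices, weights):
    """Weighted average of the vertex positions influenced by one bone."""
    total = sum(weights)
    cx = sum(w * v[0] for v, w in zip(vertices, weights)) / total
    cy = sum(w * v[1] for v, w in zip(vertices, weights)) / total
    cz = sum(w * v[2] for v, w in zip(vertices, weights)) / total
    return (cx, cy, cz)

def bone_adjustment(before, after, weights):
    """Deduce a bone's new position (the shifted centroid) and a uniform
    scale estimate from the vertex spread before/after the blendshape."""
    c0 = weighted_centroid(before, weights)
    c1 = weighted_centroid(after, weights)

    def spread(verts, c):
        # Weighted squared distance of vertices from their centroid.
        return sum(
            w * sum((p - q) ** 2 for p, q in zip(v, c))
            for v, w in zip(verts, weights)
        )

    s0, s1 = spread(before, c0), spread(after, c1)
    scale = (s1 / s0) ** 0.5 if s0 > 0 else 1.0
    return c1, scale
```

With vertices at x = ±1 stretched to x = ±2, this reports an unchanged centroid and a scale estimate of 2.0, which matches intuition.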
I’ve done this before on another platform in a past life, so not worried about it being possible. Just wondering if a Unity solution already exists
You absolutely can do that, but you’ll either need to do the skinned mesh transformation - and the application of blendshapes - by hand within your script, or use the BakeMesh method of SkinnedMeshRenderer, which takes a snapshot of the current state of the skinned mesh in the world.
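For the “by hand” route, the core of linear blend skinning is small. A sketch in plain Python (hypothetical names; bone transforms are reduced to plain position-mapping functions rather than matrices):

```python
def skin_vertex(v, influences):
    """Linear blend skinning by hand: `influences` is a list of
    (weight, transform) pairs, where transform maps a position to a
    position. The result is the weight-blended sum of all transforms."""
    out = [0.0, 0.0, 0.0]
    for w, xform in influences:
        p = xform(v)
        for i in range(3):
            out[i] += w * p[i]
    return tuple(out)

identity = lambda p: p
shift_x = lambda p: (p[0] + 1.0, p[1], p[2])  # stand-in for a bone transform
print(skin_vertex((1.0, 0.0, 0.0), [(0.5, identity), (0.5, shift_x)]))
```

Blendshape deltas would be applied to the rest-pose vertex first, then skinning runs on the deformed position.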
The motivation for this is to have a base avatar that the user can customize via sliders, which could dramatically change joint positions. So, if the character originally has narrow shoulders, and the blendshape makes them broader, then the shoulder joints would also move accordingly.
An alternative approach is to have each body blendshape come with a corresponding bone rig, and generate the final rig by linearly interpolating the position/scale/rotation of the original rig and the ones matching the active blendshape(s).
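That interpolation step might look something like this, sketched in plain Python with hypothetical bone names, keeping only position and scale (rotation would want a slerp rather than a straight lerp):

```python
def lerp(a, b, t):
    """Component-wise linear interpolation between two tuples."""
    return tuple(x + t * (y - x) for x, y in zip(a, b))

def blend_rig(base_rig, target_rig, t):
    """base_rig/target_rig: dicts of bone name -> (position, scale).
    Returns the interpolated rig for blendshape weight t (0..1)."""
    return {
        name: (lerp(base_rig[name][0], target_rig[name][0], t),
               lerp(base_rig[name][1], target_rig[name][1], t))
        for name in base_rig
    }

base = {"shoulder_l": ((-0.2, 1.4, 0.0), (1.0, 1.0, 1.0))}
broad = {"shoulder_l": ((-0.3, 1.4, 0.0), (1.2, 1.0, 1.0))}
print(blend_rig(base, broad, 0.5))  # halfway between narrow and broad
```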
Anyhoo, if anyone knows of an existing method of doing this, please let me know. I’d rather not re-invent the wheel if a method for doing this in Unity already exists. Otherwise, it’s not too big of a deal to script myself.
Like you tween between the bone positions and allow the weighted vertices to follow the new points, rather than driving the bone positions from the blend shape.
It’d be much harder to go from blend shape to bone than the other way around, because you’d need to kind of triangulate the bone position from the surface vertices (which would be complex and error prone). You already have the weight painting on the surface, so adjusting the bones will ‘just work’ (within a reasonable range).
Without being too much of a character artist, I would imagine what frosted said about bones controlling the mesh is a better idea. The blend shapes could be used to control the musculature, because per-bone data would have a hard time getting the shape right.
Indeed, using the bones themselves to determine the shape is an option, and I may indeed go with that. The motivation for doing it the other way around is that an artist can simply morph out a new character in a 3D package, and then the game can sort out the rigging itself. As long as they didn’t do anything too crazy, it would all auto-adjust. So I’m taking stock of all available options so I can choose one that will yield the simplest workflow moving forward.
The disadvantage of scaling/moving the bones to change the shape of the character is that it becomes more difficult for an artist to create new characters in a WYSIWYG fashion. They would be morphing details like muscles, etc. on the base mesh with the original proportions, and then seeing the results later when the bones stretch out the proportions.
aka. This is not for a one-off situation, rather for a game that will have highly configurable avatars.
Imagine what a nightmare it’d be if an artist sets up a blendshape, and the resulting bone position messes up some animations. What would the artist’s process be to fix it? Think of the amount of guesswork and inference required for touchups.
Indeed it would be hell if the solution did not work as expected. Anyhoo, the question is more of what solutions exist ( if any ) rather than “should I use such a solution”, hence doing my research first. The “should I” question will be answered at a later date.
Something like this kind of structure. Basically, each limb gets a “tweak” bone attached to it as a child, and rather than binding the skin to the limb bones directly, you bind it to the tweak bones.
The reason is that if you scale a limb bone, the scale propagates to its children. A tweak bone, however, has no children, so its scale doesn’t propagate, and you can use it to make the body part fatter/thinner.
Additionally, you can have stretchy limbs this way. In this situation, you can simply adjust the position of the lower arm bone, and then scale the tweak bone to compensate for the change in distance.
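A toy illustration of why that works, in plain Python with a made-up hierarchy, assuming uniform scales simply multiply down the parent chain (the way transform hierarchies generally compose):

```python
class Bone:
    """Minimal transform-hierarchy stand-in: only a uniform local scale."""
    def __init__(self, name, local_scale=1.0, parent=None):
        self.name, self.local_scale, self.parent = name, local_scale, parent

    def world_scale(self):
        # Scale accumulates from the root down; a leaf bone can be
        # scaled freely without affecting any other bone.
        s = self.local_scale
        p = self.parent
        while p is not None:
            s *= p.local_scale
            p = p.parent
        return s

upper = Bone("upper_arm")
lower = Bone("lower_arm", parent=upper)
tweak = Bone("upper_arm_tweak", parent=upper)  # leaf bone; skin binds here

upper.local_scale = 2.0          # scaling the limb bone...
print(lower.world_scale())       # ...propagates to the child: 2.0
tweak.local_scale = 1.5          # scaling the tweak bone...
print(lower.world_scale())       # ...leaves its siblings untouched: 2.0
print(tweak.world_scale())       # only the skinned tweak bone changes: 3.0
```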
So, no matter how I look at it, trying to derive bones from blendshapes is the wrong way to do it.
Just make your sliders control joint positions and take a look at UMA, because UMA can already do this.