VertexAnimation - a fast, GPU accelerated mesh morpher

There is this MeshMorpher: http://wiki.unity3d.com/index.php/MeshMorpher
which interpolates between two meshes with the same number of vertices.

However, each frame it modifies the vertices property of a Mesh on the CPU side, which results in poor performance - especially when many objects are animated with MeshMorpher. I’ve written a GPU accelerated mesh morpher and felt like sharing it, so here it is. :slight_smile:

It is capable of interpolating between 2 meshes, each of which must have the same number of vertices. I am using the tangents array of a Mesh to store the second set of vertices. Blending between the two meshes is done in the shader. There are two shaders included: VertexAnimationDiffuse.shader, and VertexAnimationSpecular.shader.
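The idea can be sketched in a minimal Cg vertex function. This is an illustration of the technique, not the actual code from the package - the property name `_Blend` and the function name are assumptions:

```hlsl
// Sketch: keyframe 2's positions live in the tangent channel,
// keyframe 2's normals are packed into vertex colors (n * 0.5 + 0.5),
// and an assumed material property _Blend (0..1) interpolates between
// the two keyframes per vertex, entirely on the GPU.
float _Blend;

struct appdata {
    float4 vertex  : POSITION; // keyframe 1 position
    float4 tangent : TANGENT;  // keyframe 2 position in xyz
    float3 normal  : NORMAL;   // keyframe 1 normal
    float4 color   : COLOR;    // keyframe 2 normal, packed into 0..1
};

void MorphVertex(inout appdata v)
{
    // Blend positions on the GPU - no per-frame CPU mesh updates needed.
    v.vertex.xyz = lerp(v.vertex.xyz, v.tangent.xyz, _Blend);

    // Unpack the second normal from the color range and blend it too.
    float3 n2 = v.color.rgb * 2.0 - 1.0;
    v.normal = normalize(lerp(v.normal, n2, _Blend));
}
```

The C# side only has to animate `_Blend` on the material; the vertex data itself is uploaded once.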

I’ve experienced a huge performance gain animating > 100 models, compared to MeshMorpher.
The whole thing isn’t rocket science - it’s rather simple. Anyway, maybe it is helpful for someone.

What do you think of it?
Feedback is most welcome.

Since I’m not very experienced with Mecanim, I’ve wondered if there is some Mecanim feature which could do the same. Or does Mecanim only work with rigs? I want to avoid having to rig all the simple models which I can animate using VertexAnimation, because for some of my models it is sufficient to just interpolate between two keyframes. In such cases, I guess, VertexAnimation is the fastest solution, since it doesn’t have to calculate bone matrices like Mecanim does (unless Mecanim has some fancy optimizations I am not aware of). Any thoughts on that?

UPDATE:
I’ve updated the attached unitypackage. Now it should also work on mobile devices.
Basically, I’ve only added the following #pragma statement to the shaders:

#pragma glsl_no_auto_normalization

UPDATE 2:
Normals are now also interpolated correctly. The normals of the other keyframe are stored in a mesh’s color attribute.

1240084–69894–$VertexAnimation.unitypackage (376 KB)


Wow, this actually has me excited. I’ve been looking for an asset on the asset store that allows me to run animations on the GPU (either 100% on the GPU, or some combination – like 40% GPU, 60% CPU). I’ve also been scouring the web for any information on how to animate with shaders, and after about 4 days of searching I haven’t found anything worthwhile (for Unity specifically).

I really want to try this, thank you for sharing it. Only problem is that it’s really late for me tonight, and I may not be able to try this out until the weekend. If this allows me to run animations on the gpu, I think this will really help my game to progress to where I want it to be.

If you can, I’d encourage you to work on this even more, perhaps try to find optimizations and such, and then sell it on the asset store. If it’s even half of what I’m looking for in terms of animation performance, I’d buy it. :slight_smile:

Anyways, like I said, the potential for this has me very excited, but I have to wait until I have more time to test it out. I can’t wait to try it.

Other Thoughts:

I have a couple questions after reading your post again. You said you have it set up to work with two meshes. Can it work with more? For instance, I have 30 separate meshes from one of my model characters that represent 30 “positions” in an animation sequence. Could I use this shader to morph between each of the 30 meshes? Mesh 1 is the beginning of the animation, mesh 30 is the end.

In other words, could this shader be used to animate a “normal” biped character with say idle, walk, and run animations?

Also, I kind of need a little instruction on how to set it up. I’m assuming you attach the C# script to the model I’m using? And then use the shader where shaders are normally used? Any help on getting it set up would be appreciated. I haven’t actually tried to set it up yet, but I’m trying to plan ahead :slight_smile:

I’ve updated the .unitypackage and included a scene with an animated object.
Please download again.

@Velo222: You could use it for animating between your 30 keyframes, but you always have to upload the 2 relevant keyframes manually by using the VertexAnimationBehavior’s methods SetKeyframe1 and SetKeyframe2. Since these operations update the vertex data of a mesh, data is uploaded to the GPU every time you change the keyframes, which has an impact on performance.
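A rough sketch of how stepping through many keyframes could look. This is hypothetical code - I’m assuming SetKeyframe1/SetKeyframe2 each take a Mesh; check the actual signatures in the package:

```csharp
using UnityEngine;
using System.Collections;

// Hypothetical sketch: walk through an array of keyframe meshes by
// re-uploading consecutive pairs via SetKeyframe1/SetKeyframe2.
// Note: every call re-uploads mesh data to the GPU, which is exactly
// the cost described above - fine occasionally, bad every frame.
public class KeyframeSequencer : MonoBehaviour
{
    public Mesh[] keyframes;              // e.g. the 30 animation poses
    public float secondsPerKeyframe = 0.1f;

    IEnumerator Start()
    {
        var anim = GetComponent<VertexAnimationBehavior>();
        for (int i = 0; i < keyframes.Length - 1; ++i)
        {
            anim.SetKeyframe1(keyframes[i]);     // uploads vertex data
            anim.SetKeyframe2(keyframes[i + 1]); // uploads tangent data
            yield return new WaitForSeconds(secondsPerKeyframe);
        }
    }
}
```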

My VertexAnimation is intended for very simple animations. I use it for some small enemies in my game which have a very simple fly-animation - so basically, animating between 2 keyframes is sufficient for my case. I don’t think I can extend my technique to handle 30 meshes. Uploading 30 keyframes/meshes to the GPU is generally a bad idea, since it is very memory intensive.

For your case, I believe that using Mecanim is the better choice. If you can prepare your model to work with Mecanim (which means rigging the model, I guess), that would be the way to go for you. My technique cannot handle bone animation/vertex skinning - that’s what Mecanim is there for.

Thanks for the info j00hi. I will download your updated version, and hopefully try it out sometime soon.

I have my models rigged already, and they are working well. The problem is that the skinning/animating in Unity is all done by the CPU, which was a design decision the Unity programmers had to make (and I don’t blame them). However, since my game is largely CPU bound, I really need ways to offload the animations, either in whole or in part, to the GPU. Quite simply, using a skinned mesh renderer on hundreds of units puts a big hurt on the CPU.

Any amount of work I can offload to the GPU would be good for my specific game. All that being said, I will try out your shader when I get time (hopefully in 2 days). Thanks again for the information.

Oh, I didn’t know that vertex skinning is done on the CPU side!
I’ve just read through that thread of yours: http://forum.unity3d.com/threads/178383-Animations-cpu-bottleneck-would-mecanim-improve
and posted some thoughts of mine.

To me, it looks like you have to use one of the proposed optimization methods, which are:

  • Instancing (the manual way, as described in the referenced thread)
  • the RenderTexture → Billboard approach
  • using my VertexAnimation for simple animations

or even a combination of those.

The only thing which I could add to VertexAnimation is a third keyframe, but I’m having trouble with that:
Originally, I wanted to store the second set of vertices in the colors attribute, but somehow this doesn’t work. Maybe colors get clamped or saturated - I don’t know; I would have to investigate that further. Being able to use the colors would open up 2 new possibilities: 1) interpolation between 3 keyframes, and 2) interpolation between 2 keyframes + bump mapping (which is currently impossible with VertexAnimation).

Hi, this is a great idea. Thanks for sharing it.
But when I try it with a SkinnedMeshRenderer, the result is weird.
Have you tried it with a SkinnedMeshRenderer?

No, I haven’t tried it with a SkinnedMeshRenderer.
Actually, this is not intended to be used with a SkinnedMeshRenderer, but just to morph between two different meshes which have the same number of vertices.

I’ve updated the unitypackage in the first post.
VertexAnimation didn’t work on mobile devices (Android, iOS,…) since normals and tangents are automatically normalized there by default.

Would it make more sense to store the information in the color data instead of the tangent data, to support normal mapping? Sounds cool, though - I’ll have to give it a try.

Yes, absolutely. I actually wanted to store the information in the color data. But the color values get clamped before they are uploaded to the GPU - definitely all negative values are clamped to 0; I’m not sure about positive values.
Do you have any idea how to disable that automatic clamping?
Maybe there is some #pragma statement or anything else?

No, I have no idea, sorry. I actually didn’t think it would be clamped. Another thing: how do you morph the normals of the target mesh, if your solution isn’t using normal maps but just the vertex normals? I mean, just morphing the positions of the vertices won’t give you correct lighting along the way.

Yes, you’re right - currently only the vertex positions are morphed, which leads to wrong normals. Right now the normals of the first mesh are always used, whereas they should also be interpolated between the two meshes.

Playing around with vertex colors, I had the idea that I could use the vertex colors for the second set of normals and use them for interpolation. Since color values get clamped, they would have to be stored just like in a normal map: normal * 0.5 + vec3(.5, .5, .5)
I think this should work. I will try it as soon as I have some spare time.
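Packing the second set of normals into the clamped 0..1 color range could look like this - a sketch of the idea, not the final code; the class and method names are made up:

```csharp
using UnityEngine;

// Sketch: pack keyframe 2's normals into keyframe 1's vertex colors.
// Colors are clamped to [0,1] before upload, so the [-1,1] normal
// components are remapped with n * 0.5 + 0.5, just like in a normal map.
public static class NormalPacker
{
    public static void PackNormals(Mesh target, Mesh otherKeyframe)
    {
        Vector3[] normals = otherKeyframe.normals;
        Color[] colors = new Color[normals.Length];
        for (int i = 0; i < normals.Length; ++i)
        {
            Vector3 n = normals[i] * 0.5f + new Vector3(0.5f, 0.5f, 0.5f);
            colors[i] = new Color(n.x, n.y, n.z, 1f);
        }
        target.colors = colors; // uploaded once, read back in the shader
    }
}
```

In the shader, the packed normal is then recovered with `color.rgb * 2.0 - 1.0` before interpolating.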

Unfortunately, this definitely means: no bump mapping with VertexAnimation.
Bump mapping needs normals and tangents, but VertexAnimation needs the tangents for the second set of positions, and the vertex colors would be utilized for storing the second set of normals.

This is a good idea! I might use something similar, but not the same thing. Thanks for the inspiration :slight_smile:

Updated the .unitypackage, now also vertex normals are interpolated correctly.
Vertex normals are stored in the mesh’s color attribute.

Hi j00hi,

I have been searching for a way to morph a terrain mesh between two states within my game. I am a beginner to Unity and scripting, so I was wondering if you could give me a basic explanation of how to implement your .unitypackage? I have created two mesh surfaces that represent my landscape: one is a flat plane and the other is a highly articulated canyon landscape. They both have the same number of vertices. Is it possible to initiate your mesh morphing script once a collider box is triggered? What are the basic steps to executing this?

Thanks!

Using it is very simple: Just drag a VertexAnimationBehavior script on the GameObject which you want to morph. It needs to be the object which has a MeshFilter component attached. The referenced mesh would be your Keyframe 1.
VertexAnimationBehavior has a property which is visible in the inspector named “Other Keyframe”. Drag the mesh of your Keyframe 2 onto that property.

Set the other inspector properties “Loop Duration” and “Loop N Times” to 5 and 0, respectively - that way, the animation should play automatically when you press Play.

You can also trigger the animation by code by calling the LoopNTimes method.
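For the collider-trigger question above, a minimal sketch - I’m assuming here that LoopNTimes takes the number of loops as an int; please check the actual signature in the package:

```csharp
using UnityEngine;

// Hypothetical trigger script: start the morph once when something
// enters this trigger collider (e.g. the player walking into the area).
public class MorphOnTrigger : MonoBehaviour
{
    public VertexAnimationBehavior target; // the morphing terrain object
    private bool triggered = false;

    void OnTriggerEnter(Collider other)
    {
        if (triggered) return;
        triggered = true;
        target.LoopNTimes(1); // assumed: play the morph animation once
    }
}
```

Attach this to a GameObject with a BoxCollider marked “Is Trigger”, and drag the morphing object onto the target slot.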