Vertex animation (from vertex list) in shader

I am writing a client that supports rendering of an old game engine format. It has a character animation system which uses bones and is pretty straightforward. For some cases it also uses vertex animation, where the positions of each of the mesh's vertices are specified per frame.

The very naive solution is to just resubmit the mesh's vertices from code each frame, but that is CPU-intensive and slow, since it requires re-uploading the vertices every frame. I can benchmark it, but I assume I will not go with this method.
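For reference, the naive path would look roughly like this; a minimal sketch assuming OpenGL (with a GLEW-style loader), where `uploadFrame`, `Vec3`, and the `frames` array are placeholder names, not anything from the actual engine:

```cpp
#include <vector>
#include <GL/glew.h>

struct Vec3 { float x, y, z; };

// Naive path: overwrite the position VBO with the current frame's data every frame.
// Assumes `vbo` was already created and sized with glBufferData.
void uploadFrame(GLuint vbo, const std::vector<std::vector<Vec3>>& frames, size_t frame)
{
    const std::vector<Vec3>& positions = frames[frame];
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    // This per-frame transfer is the cost I am worried about.
    glBufferSubData(GL_ARRAY_BUFFER, 0,
                    positions.size() * sizeof(Vec3), positions.data());
}
```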

The next thing I can think of is doing it in a shader. I could send all of the vertices at once and then, in the shader, use a frame index to calculate which vertex position to actually sample. Does this make sense?
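Roughly, I am imagining something like this; just a sketch assuming OpenGL/GLSL, with all frames packed back-to-back into a buffer texture, and with made-up uniform names (`uPositions`, `uFrame`, `uVertexCount`, `uMvp`):

```cpp
// Vertex shader that picks its position out of a buffer texture holding
// frame 0's vertices, then frame 1's vertices, and so on.
const char* kVertexAnimVS = R"(
#version 330 core
uniform samplerBuffer uPositions;  // all frames, concatenated
uniform int uFrame;                // current animation frame
uniform int uVertexCount;          // vertices per frame
uniform mat4 uMvp;

void main()
{
    int index = uFrame * uVertexCount + gl_VertexID;
    vec3 position = texelFetch(uPositions, index).xyz;
    gl_Position = uMvp * vec4(position, 1.0);
}
)";
```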

Has anyone done something similar to this?

Do you need to blend between frames? If not, I would suggest generating a unique mesh per frame and simply swapping between them.
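Something along these lines, as a rough OpenGL sketch (`buildFrameVbos` and `Vec3` are illustrative names only):

```cpp
#include <vector>
#include <GL/glew.h>

struct Vec3 { float x, y, z; };

// One static VBO per animation frame, uploaded once at load time.
std::vector<GLuint> buildFrameVbos(const std::vector<std::vector<Vec3>>& frames)
{
    std::vector<GLuint> vbos(frames.size());
    glGenBuffers(static_cast<GLsizei>(vbos.size()), vbos.data());
    for (size_t i = 0; i < frames.size(); ++i) {
        glBindBuffer(GL_ARRAY_BUFFER, vbos[i]);
        glBufferData(GL_ARRAY_BUFFER,
                     frames[i].size() * sizeof(Vec3),
                     frames[i].data(), GL_STATIC_DRAW);
    }
    return vbos;
}

// At draw time, bind vbos[currentFrame], set up the position attribute
// (or keep one VAO per frame), and draw as usual. No per-frame uploads.
```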

Alternatively, I would look at how something like instanced skinned mesh animation works (see the 4000 Adams thread). The short version is that you store all of the position data in a texture, one row per vertex and one column per frame, and animate by setting a "frame" property on the material.
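A minimal sketch of what baking that texture could look like in OpenGL; `bakePositionTexture` and the uniform names in the comment are placeholders, and the row/column layout matches the description above:

```cpp
#include <vector>
#include <GL/glew.h>

struct Vec3 { float x, y, z; };

// Bake every frame's positions into an RGB32F texture:
// width = frame count (columns), height = vertex count (rows).
GLuint bakePositionTexture(const std::vector<std::vector<Vec3>>& frames)
{
    const int frameCount  = static_cast<int>(frames.size());
    const int vertexCount = static_cast<int>(frames[0].size());

    std::vector<Vec3> texels(static_cast<size_t>(frameCount) * vertexCount);
    for (int f = 0; f < frameCount; ++f)
        for (int v = 0; v < vertexCount; ++v)
            texels[static_cast<size_t>(v) * frameCount + f] = frames[f][v];

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32F, frameCount, vertexCount, 0,
                 GL_RGB, GL_FLOAT, texels.data());
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    return tex;
}

// In the vertex shader, the material's "frame" property becomes a uniform:
//   vec3 position = texelFetch(uPositionTex, ivec2(uFrame, gl_VertexID), 0).xyz;
// Blending between frames is a second fetch at uFrame + 1 plus a mix().
```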