Hello everyone! Apologies in advance if my question seems basic, but I'm not as experienced with shaders as I am with C#. So far, I've mostly been proceeding by trial and error.
What I’m trying to do is create a shader-based animation system to deform the vertices of a skinned mesh. This system relies on three textures (one for positions, one for normals, and one for tangents) baked using a MonoBehaviour script. The script captures the vertex positions, normals, and tangents of a 3D model during its animation in Unity’s Animator.
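For context, here is a simplified sketch of what my baking step does (names and details are trimmed for the post; the real script also bakes normals and tangents into their own textures the same way, and steps the Animator to the right time before each sample):

```csharp
using UnityEngine;

// Simplified sketch of the baking step: one texture row per animation frame,
// one column per vertex. Normals and tangents are baked the same way into
// separate textures.
public class VertexAnimationBaker : MonoBehaviour
{
    public SkinnedMeshRenderer skin;   // the animated mesh being sampled
    public int frameCount = 60;        // rows in the baked texture

    public Texture2D BakePositions()
    {
        var mesh = new Mesh();
        int vertexCount = skin.sharedMesh.vertexCount;

        // RGBAFloat so positions are stored without quantization.
        var tex = new Texture2D(vertexCount, frameCount,
                                TextureFormat.RGBAFloat, false);

        for (int frame = 0; frame < frameCount; frame++)
        {
            // (In the real script the Animator is advanced to this frame's
            // time before sampling; omitted here for brevity.)
            skin.BakeMesh(mesh);
            Vector3[] verts = mesh.vertices;

            for (int v = 0; v < vertexCount; v++)
                tex.SetPixel(v, frame,
                             new Color(verts[v].x, verts[v].y, verts[v].z, 1f));
        }

        tex.Apply();
        return tex;
    }
}
```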
The result is an animation that, at least in terms of movement, matches the original model. However, as you can see, it looks completely fragmented. I tried averaging the positions of duplicated vertices and applying that average to the vertex positions the shader computes. But since the textures store the final vertex positions from the animation, any additional adjustment I apply on top of the baked data only distorts the original model further.
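To be concrete, the averaging I tried works roughly like this (a sketch, not my exact code): I group vertices that sit at the same bind-pose position and replace each one's baked position with the group's average.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch of the duplicate-vertex averaging attempt: vertices that occupy the
// same bind-pose position (but are split into duplicates in the mesh data)
// all receive the average of their baked animated positions for that frame.
public static class VertexWelder
{
    public static void AverageDuplicates(Vector3[] bindPose, Vector3[] baked)
    {
        // Group vertex indices by their (exact) bind-pose position.
        var groups = new Dictionary<Vector3, List<int>>();
        for (int i = 0; i < bindPose.Length; i++)
        {
            if (!groups.TryGetValue(bindPose[i], out var list))
                groups[bindPose[i]] = list = new List<int>();
            list.Add(i);
        }

        // Overwrite every duplicate with the group's average baked position.
        foreach (var group in groups.Values)
        {
            Vector3 sum = Vector3.zero;
            foreach (int i in group) sum += baked[i];
            Vector3 avg = sum / group.Count;
            foreach (int i in group) baked[i] = avg;
        }
    }
}
```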
I've been struggling with this for two weeks now and haven't been able to find a solution. (As I mentioned, I've been relying heavily on trial and error because I don't fully understand what I'm doing.)
Thanks in advance for any suggestions or guidance!