I have been struggling for days with something I thought I could solve in 10 minutes:
I have an Object A for which I know a normalized vector A1 (like the forward vector).
The object undergoes arbitrary rotations; all I know is the direction of vector A1 after the rotation.

Now I want to calculate the position of Object B based on its initial position relative to Object A.

As I need to do the math in a shader, I cannot use Unity's built-in methods.

To write the logic, I built a scene with two GameObjects. The following does exactly what I need:

So basically I just need to manually calculate the "localToWorldMatrix" based on the change of the forward vector. But all methods I found to achieve this lead to different behavior: e.g., any rotation around the Z axis does not work, and the other rotations are not circular. I understand that the "localToWorldMatrix" consists of position, rotation, and scale. Position and scale are easy to find; I am struggling with getting the correct rotation quaternion.
Is this even possible with just having one changing directional Vector?
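For what it's worth, the usual way to get *a* rotation from the old forward vector to the new one is the shortest-arc quaternion built from their cross and dot products. Below is a sketch in plain Python rather than shader code (function names are my own), with one important caveat: a single direction vector leaves the roll around that vector undetermined, so the shortest arc is only one of infinitely many valid rotations.

```python
import math

def quat_from_to(a, b):
    """Shortest-arc quaternion (x, y, z, w) rotating unit vector a onto unit
    vector b. With only one known direction, the roll around b is
    undetermined; the shortest arc is just one convenient choice."""
    cx = a[1]*b[2] - a[2]*b[1]             # cross(a, b)
    cy = a[2]*b[0] - a[0]*b[2]
    cz = a[0]*b[1] - a[1]*b[0]
    d = a[0]*b[0] + a[1]*b[1] + a[2]*b[2]  # dot(a, b)
    if d < -0.999999:
        # a and b are opposite: any axis perpendicular to a gives a 180-degree turn
        ax = (1.0, 0.0, 0.0) if abs(a[0]) < 0.9 else (0.0, 1.0, 0.0)
        px = ax[1]*a[2] - ax[2]*a[1]
        py = ax[2]*a[0] - ax[0]*a[2]
        pz = ax[0]*a[1] - ax[1]*a[0]
        n = math.sqrt(px*px + py*py + pz*pz)
        return (px/n, py/n, pz/n, 0.0)
    w = 1.0 + d
    n = math.sqrt(cx*cx + cy*cy + cz*cz + w*w)
    return (cx/n, cy/n, cz/n, w/n)

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (x, y, z, w):
    v' = v + w*t + cross(q.xyz, t), where t = 2*cross(q.xyz, v)."""
    qx, qy, qz, qw = q
    tx = 2.0*(qy*v[2] - qz*v[1])
    ty = 2.0*(qz*v[0] - qx*v[2])
    tz = 2.0*(qx*v[1] - qy*v[0])
    return (v[0] + qw*tx + (qy*tz - qz*ty),
            v[1] + qw*ty + (qz*tx - qx*tz),
            v[2] + qw*tz + (qx*ty - qy*tx))
```

Rotating the old forward by the resulting quaternion reproduces the new forward, but any roll Object A accumulated is lost, which would explain rotations that "are not circular".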

I have many vertices of a group A which all get individually moved and rotated on the GPU already; now I have many more vertices of a group B which shall individually align to single vertices from group A. So each vertex of group A will have an individual matrix which, as I understand it, constantly changes due to movement and rotation, and needs to be applied to vertices from group B to align them. No CPU involved at all.
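If a fixed reference direction is acceptable, one common way to resolve the missing roll is to pick a world-up vector and build an orthonormal basis around the forward vector, the same construction as a look-at matrix. A minimal Python sketch of that idea (names and the default up vector are my assumptions; in a shader the same three cross products work per vertex):

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    n = math.sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2])
    return (v[0]/n, v[1]/n, v[2]/n)

def basis_from_forward(forward, up=(0.0, 1.0, 0.0)):
    """Build an orthonormal basis (right, up, forward) from a single forward
    vector, resolving the roll ambiguity with a reference up vector.
    Degenerates when forward is (anti)parallel to up."""
    f = normalize(forward)
    r = normalize(cross(up, f))
    u = cross(f, r)  # already unit length: f and r are orthonormal
    return r, u, f

def local_to_world(offset, forward, position, up=(0.0, 1.0, 0.0)):
    """Transform a local offset (x = right, y = up, z = forward) into world
    space: the hand-rolled equivalent of localToWorldMatrix without scale."""
    r, u, f = basis_from_forward(forward, up)
    return tuple(position[i] + offset[0]*r[i] + offset[1]*u[i] + offset[2]*f[i]
                 for i in range(3))
```

The trade-off: when the forward vector approaches the chosen up vector, the basis degenerates and nearby vertices visibly flip as they pass that pole. That is the inherent cost of reconstructing a full rotation from a single direction; a second known vector per vertex (e.g. a tangent) would remove the ambiguity entirely.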

So there seems to be no easy way of doing so? I found out that the "skin attachment system" from Unity's digital human package is doing something very similar. Code seems pretty complicated, though, as it needs to handle several other cases.
Does no one have an idea for a simple solution? I never thought that this basic-looking thing would get so complicated…