I am developing a multiplayer VR game in Unity, using Photon Unity Networking (PUN). One of the characters in my game uses the Xsens Awinda body motion tracking system, in order to have a more realistic body that follows every movement. I am streaming this movement data into Unity locally from MVN Analyze. However, I can't seem to stream these live animations over the network so that the other players see the same as I do. The avatar has an Animator component attached to it, but since the animations are streamed live from MVN, there are no layers or parameters to sync using the Photon Animator View. Any advice?
You can't stream animation directly. Instead, you have to sync all the transforms, keyframes, etc. yourself.
There is no default method for streaming live animation, so you have to define your own data structure for the pose and stream it as serialized objects via PUN.
The first step is creating your own byte data structure, e.g. head position + head rotation + neck position + neck rotation + chest position + … and so on, in a fixed order, for every segment you want to sync.
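A minimal sketch of such an encoder (the `PoseCodec` name and the `bones` array are hypothetical; the only requirement is that the bone order is fixed and identical on every client) could look like this:

```csharp
using System.IO;
using UnityEngine;

// Hypothetical helper that packs the avatar's bone transforms into a
// flat byte array: 3 floats for position + 4 floats for rotation per bone.
public static class PoseCodec
{
    public static byte[] Encode(Transform[] bones)
    {
        using (var ms = new MemoryStream())
        using (var writer = new BinaryWriter(ms))
        {
            foreach (var bone in bones)
            {
                // Position: 3 floats.
                writer.Write(bone.localPosition.x);
                writer.Write(bone.localPosition.y);
                writer.Write(bone.localPosition.z);
                // Rotation: 4 floats (quaternion x, y, z, w).
                writer.Write(bone.localRotation.x);
                writer.Write(bone.localRotation.y);
                writer.Write(bone.localRotation.z);
                writer.Write(bone.localRotation.w);
            }
            return ms.ToArray();
        }
    }
}
```

Note that 7 floats per bone adds up quickly (a full Xsens MVN skeleton has 23 segments, i.e. roughly 640 bytes per frame), so you may later want to quantize to shorts or lower the send rate.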
This video may give you some ideas to get started: [FM Coding] Sync Network Objects, Encode/Decode byte data (Unity3D Beginner Tutorial) - YouTube
Then you have to send those bytes via Photon, either by RPC or through a serialized stream, etc.
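For the serialized-stream route, a sketch assuming PUN 2's API (`Photon.Pun`, `IPunObservable`) could look like the following; attach it to the avatar next to its PhotonView and register it as an observed component. `PoseCodec` is the hypothetical encoder from above, and its `Decode` counterpart is sketched in the next step. A `byte[]` is natively serializable by Photon, so it can be passed to `SendNext` directly.

```csharp
using Photon.Pun;
using UnityEngine;

// Streams the packed pose over the network every serialization tick.
public class PoseStreamer : MonoBehaviourPun, IPunObservable
{
    // Assign the same bones, in the same order, on every client.
    public Transform[] bones;

    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            // Owner: encode the live MVN-driven pose and send it.
            stream.SendNext(PoseCodec.Encode(bones));
        }
        else
        {
            // Remote clients: receive the bytes and apply the pose.
            byte[] data = (byte[])stream.ReceiveNext();
            PoseCodec.Decode(data, bones);
        }
    }
}
```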
And finally, the other clients should be able to decode your data and assign those transforms to the target avatars.
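The matching decoder could be a `Decode` method added to the hypothetical `PoseCodec` class above, reading the floats back in the same fixed order and writing them onto the receiving avatar's bones:

```csharp
// Counterpart to Encode, added to the PoseCodec class from the
// earlier sketch. Reads 7 floats per bone in the same fixed order.
public static void Decode(byte[] data, Transform[] bones)
{
    using (var ms = new MemoryStream(data))
    using (var reader = new BinaryReader(ms))
    {
        foreach (var bone in bones)
        {
            // C# evaluates arguments left to right, so the read
            // order matches the write order (x, y, z / x, y, z, w).
            bone.localPosition = new Vector3(
                reader.ReadSingle(), reader.ReadSingle(), reader.ReadSingle());
            bone.localRotation = new Quaternion(
                reader.ReadSingle(), reader.ReadSingle(),
                reader.ReadSingle(), reader.ReadSingle());
        }
    }
}
```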
Makes sense! Thank you for your reply. I will check it out.