I am currently working on a multiplayer game. I built an interpolation system on top of my gameplay architecture and it works nicely. I have heard a lot about the strengths and weaknesses of Unity's networking system, and I want to try a custom approach with .NET sockets (to run some comparisons and train myself in networking). I am close to having something functional, but there is one feature I still need to implement: Network.time.
I would like to know how this value is synchronized. I ran a few stress tests on it and it's very accurate. Can you tell me how it's determined and synced across clients, so I can implement it with my UdpClient?
I currently use it to timestamp packets for my interpolation system.
There are a few ways to sync time, each serving a different purpose, and which one fits depends on your setup. Your options are:
Do not sync time at all; instead sync simulation frame numbers (well suited if all of your simulation runs exclusively in FixedUpdate)
Synchronize time once at the start of the connection handshake between the client and server
Continuously try to synchronize time between the client and the server as the game progresses
Now, option one is a bit specialized and often harder for people to grasp. Option two works, but I've found it lackluster when pings etc. change over time as you play. Option three is what most people mean when they say "synchronize time", and it's the one I'll describe here:
In the header for every packet sent from both the client and the server, attach the current simulation time
When you receive a packet, read the time from it and apply this integrator: localOffset = (localOffset * (1 - INTEGRATION_RATE)) + ((localRealTime - remoteTime) * INTEGRATION_RATE);
I find that an INTEGRATION_RATE value of 0.1 usually works well.
Now, to stop a few annoying things from happening, like time going backwards, you need to keep track of the previous remote time and only update localOffset when newRemoteTime > oldRemoteTime (i.e., ignore packets that arrive out of order)
Also, on the first packet you receive, the calculation is simply localOffset = localRealTime - remoteTime, as you have nothing to integrate with yet
Now, whenever you want your own local estimate of the server's time, you compute: localSimulationTime = localRealTime - localOffset (subtracting, since localOffset was defined above as local time minus remote time).
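To make that concrete, here's a rough C# sketch of the idea. The NetworkClock / OnRemoteTimeReceived names and the Stopwatch-based local clock are just my placeholders, not anything official; wire it up to however you read your UdpClient packets:

```csharp
using System.Diagnostics;

// Rough sketch of the clock-sync integrator described above.
// Assumes each incoming packet header carries the sender's simulation
// time, which you read yourself and pass into OnRemoteTimeReceived.
public class NetworkClock
{
    const double INTEGRATION_RATE = 0.1;

    readonly Stopwatch stopwatch = Stopwatch.StartNew();

    double localOffset;            // smoothed (localRealTime - remoteTime)
    double lastRemoteTime = double.MinValue;
    bool hasFirstSample;

    // Local real time in seconds since this clock was created.
    public double LocalRealTime => stopwatch.Elapsed.TotalSeconds;

    // Local estimate of the server's simulation time.
    public double LocalSimulationTime => LocalRealTime - localOffset;

    // Call with the simulation time read from every received packet header.
    public void OnRemoteTimeReceived(double remoteTime)
    {
        // Only accept times that move forward, so the clock never jumps back.
        if (remoteTime <= lastRemoteTime)
            return;
        lastRemoteTime = remoteTime;

        double sample = LocalRealTime - remoteTime;

        if (!hasFirstSample)
        {
            // First packet: nothing to integrate with yet.
            localOffset = sample;
            hasFirstSample = true;
        }
        else
        {
            // Exponential moving average toward the newest offset sample.
            localOffset = localOffset * (1 - INTEGRATION_RATE)
                        + sample * INTEGRATION_RATE;
        }
    }
}
```

You would then stamp outgoing packet headers with LocalSimulationTime (on the server that's just its own simulation time) and feed every received header time into OnRemoteTimeReceived.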
Since I already send the server time in every packet (to timestamp them for the interpolation system), using this algorithm won't require any more data than what I'm already sending.
Thank you for the crystal-clear explanation. That should be more than enough for me to implement it!