Oddities with Network.time and NetworkMessageInfo.timestamp

Hi there,

I’m currently running into some very weird issues with Network.time and NetworkMessageInfo.timestamp. I’m using these for extrapolation (old location + time difference * velocity = current location), and I noticed that objects sometimes ended up in very strange locations, so I added some extra logging.

From my log, I’m repeatedly getting things like:

timeDiff = 245198.257; from Network.time = 282625.603, info.timestamp = 37427.346

(this is right at the beginning of various RPCs)

I also get timeDiffs in somewhat more sane ranges, like 10 seconds (well, not really sane, but at least more sane). This happened with a current build that I used for a little “unofficial testing” with 35 clients (on 3 machines), but I guess it also happened under very little load before I had this logging in place. Also, this particular entry appeared a very short time after I had started up the server (less than an hour).

Not sure if this might be in any way related, but I’m currently using the beta Master Server.

Can anyone shed some light on this? How does Network.time and NetworkMessageInfo.timestamp work? What could be the reason for such failures?

Sunny regards,
Jashan

I am using the beta masterserver exclusively, and do not have this problem.

For example, to get a synchronized game timer between 2 clients I use this function:

@RPC
function SetGameTime( newTime : int, info : NetworkMessageInfo )
{
	playerCount = 2;
	// (Network.time - info.timestamp) is the time the message spent in transit,
	// so this reconstructs the moment the game started on the sender:
	gameStartTime = Network.time - (Network.time - info.timestamp) - newTime;
	UpdateGameTime();
}

And that adjusts for the time it takes the message to get across the network, so the players are as closely in sync as possible.

I know your game uses slow motion; could Time.timeScale have anything to do with it? And just to double-check: are you sure you are always using Network.time and not Time.time somewhere?

Time.timeScale might be a good candidate. The way I’ve implemented this in my game is that the server always runs on “constant time”: Time.timeScale on the server stays at 1.0, and all motion is “downsampled” through an abstraction of time that simply multiplies the frame’s timeDelta with a separate timeScale factor. On the clients, on the other hand, I’m using Time.timeScale so that all the particle animations etc. are also affected.
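The abstraction is roughly like this (a simplified sketch; the actual class in my project is more involved, and the names here are made up for this post):

using UnityEngine;

public static class GameTime {
    // game-specific slow-motion factor; 1.0 means normal speed
    public static float gameTimeScale = 1.0f;

    public static float DeltaTime {
        get {
            if (Network.isServer) {
                // server: Time.timeScale stays at 1.0, slow motion is applied
                // by scaling the frame delta here instead
                return Time.deltaTime * gameTimeScale;
            }
            // clients: Time.timeScale is set to gameTimeScale elsewhere, so
            // Time.deltaTime is already scaled (particles etc. follow along)
            return Time.deltaTime;
        }
    }
}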

However, that odd behavior occurs when neither the clients nor the server had bullet time activated (after a “fresh start”), and it does not occur (or at least not more often) when I use bullet time heavily. So that can’t be the cause of the problem. If it were, it would also mean that Network.time depends on Time.time, which is very unlikely.

I’m using Network.time (and only Network.time) together with NetworkMessageInfo.timestamp inside my RPCs to offset the objects by the movement that occurred while the message was “on its way”. This basically follows this pattern (simplified code):

public void SomeRPC(Vector3 positionOnSend, NetworkMessageInfo info) {
    // time the message spent in transit, on the synchronized network clock
    double timeDelta = Network.time - info.timestamp;
    // extrapolate from where the object was when the message was sent
    // (velocity is the object's speed in units per second)
    transform.position = positionOnSend + transform.forward * velocity * (float)timeDelta;
}

To check for some very weird errors, I’ve replaced every instance of Network.time - info.timestamp with a method that does the same calculation but also checks whether the delta is above a threshold (I’ve set this threshold to one second, since even with heavy lag a message should not take longer than one second from one machine to the other).
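Roughly, that check looks like this (a simplified sketch living inside a MonoBehaviour; the actual method name and logging in my project differ):

// one second: the longest I expect a message to reasonably take
private const double maxExpectedDelta = 1.0;

private double CheckedTimeDelta(NetworkMessageInfo info) {
    double timeDelta = Network.time - info.timestamp;
    if (timeDelta > maxExpectedDelta) {
        Debug.LogWarning("timeDiff = " + timeDelta
            + "; from Network.time = " + Network.time
            + ", info.timestamp = " + info.timestamp);
    }
    return timeDelta;
}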

Most of the time, this works perfectly smoothly (even when I use bullet time a lot, which does add a little complexity to those simple calculations, but not much: the network timeDelta is simply multiplied with the appropriate timeScale to get the time the message took “relative to bullet time”).
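In other words, the extrapolation under bullet time is essentially this (simplified; currentTimeScale stands in for whatever slow-motion factor is active, 1.0 meaning normal speed, and CheckedTimeDelta is the sketch above):

double timeDelta = CheckedTimeDelta(info);          // wall-clock transit time
double scaledDelta = timeDelta * currentTimeScale;  // transit time "relative to bullet time"
transform.position = positionOnSend + transform.forward * velocity * (float)scaledDelta;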

Sometimes I get values above one second that I could explain with “well, the message got stuck somewhere in a queue”.

However, sometimes I’m getting timeDeltas of around 240,000 seconds, which is much longer than the server or any of the clients has even been up. Once this has started happening, I seem to get a lot of those entries.