NetworkManager: Joining servers causes sync message errors

Here’s a problem I’ve been facing for a little while now, and I would really appreciate any help you can provide, as I’m really scratching my head here.

My game has two modes: a standalone server (we’ll call this dedicated) and the client.

Dedicated Servers are booted by the following:

  1. On the first scene, check for a dedicated-server command line parameter.
  2. If step 1 is true, continue to step 3. Otherwise continue to the title screen.
  3. Load the server start scene, which runs sanity checks, reads the configuration file, and boots the server via NetworkManager.StartHost(), with the configuration file defining the IP, port, etc.
  4. After the host is started, the server switches to the initial scene by calling ServerChangeScene().
  5. When the initial scene starts, other scripts start up as well to begin the game round, etc.
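For clarity, here is a minimal sketch of that boot flow. The scene names, the `-dedicated` flag, and the `ServerConfig` helper are placeholders I made up for illustration, not the poster's actual code:

```csharp
// Sketch of the dedicated-server boot flow described above (UNet, Unity 2017-era).
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.SceneManagement;

public class BootStrap : MonoBehaviour
{
    void Start()
    {
        // Step 1: check for a dedicated-server command line parameter.
        bool dedicated = System.Array.IndexOf(
            System.Environment.GetCommandLineArgs(), "-dedicated") >= 0;

        // Step 2: branch to the server loader or the title screen.
        SceneManager.LoadScene(dedicated ? "ServerStart" : "TitleScreen");
    }
}

public class ServerStart : MonoBehaviour
{
    void Start()
    {
        // Step 3: sanity checks, read the config file, boot the server.
        var config = ServerConfig.LoadFromFile();   // hypothetical helper
        var manager = NetworkManager.singleton;
        manager.networkAddress = config.ip;
        manager.networkPort = config.port;
        manager.StartHost();

        // Step 4: switch to the initial gameplay scene.
        manager.ServerChangeScene(config.initialScene);
    }
}
```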

Clients are booted by the following, for testing purposes:

  • Enter an IP address and port into a UI textbox and click “Connect”. These values are cached in a variable store (think dvars from Quake).
  • The scene changes to a client loader scene that contains a NetworkManager. The NetworkManager reads the cached configuration and calls StartClient() to connect to the server and join the game.
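The client path, sketched the same way. The `ConnectionCache` class and scene name are placeholders for whatever mechanism actually carries the UI input across the scene change:

```csharp
// Sketch of the client connect flow: cache the UI input, load the loader
// scene, and start the client there. Names are illustrative assumptions.
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.SceneManagement;

public static class ConnectionCache
{
    public static string Address;
    public static int Port;
}

public class ConnectUI : MonoBehaviour
{
    public UnityEngine.UI.InputField addressField;
    public UnityEngine.UI.InputField portField;

    // Hooked up to the "Connect" button's OnClick.
    public void OnConnectClicked()
    {
        ConnectionCache.Address = addressField.text;
        ConnectionCache.Port = int.Parse(portField.text);
        SceneManager.LoadScene("ClientLoader");
    }
}

public class ClientLoader : MonoBehaviour
{
    void Start()
    {
        var manager = NetworkManager.singleton;
        manager.networkAddress = ConnectionCache.Address;
        manager.networkPort = ConnectionCache.Port;
        manager.StartClient();   // the sync message errors appear after this
    }
}
```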

However, there is an issue here: upon calling StartClient(), the NetworkManager seems to try to load the NetworkIdentity objects that are in the server’s scene before actually switching to the server’s level. This causes a flood of “Failed to get sync message for object with NetId 841…” errors (841 is just an example), and while the scene seems to load its geometry, everything else, including the network scripts, is broken.

The documentation for NetworkManager.ServerChangeScene() says:

This causes the server to switch scenes and sets the networkSceneName.

Clients that connect to this server will automatically switch to this scene. This is called automatically if onlineScene or offlineScene are set, but it can be called from user code to switch scenes again while the game is in progress. This automatically sets clients to be not-ready. The clients must call NetworkClient.Ready() again to participate in the new scene.

My question is: if I call ServerChangeScene() on the server and it changes the level correctly, do “fresh” clients that connect to that server afterwards automatically load the same scene as well? Or does ServerChangeScene() only send a message to the already-connected clients telling them to change the scene?

One approach that did give me some luck is loading the level first, then using StartClient() to boot up the NetworkManager. It then synchronized correctly, since the network objects were already present in the scene, and it allowed me to spawn without problems. However, this is risky: if the client joins a match in the dying moments of intermission and the server has already begun changing the level, the client has just loaded the (now wrong) level.
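A sketch of that workaround, assuming the client somehow knows the server’s current scene ahead of time (“InitialScene” is a placeholder, and that assumption is exactly the race condition described above):

```csharp
// Sketch of the workaround: load the server's expected level locally first,
// then start the client, so the scene's NetworkIdentity objects already
// exist when the sync messages arrive.
using UnityEngine;
using UnityEngine.Networking;
using UnityEngine.SceneManagement;

public class WorkaroundLoader : MonoBehaviour
{
    void Start()
    {
        // Survive the scene change so the callback below still fires.
        DontDestroyOnLoad(gameObject);

        // Load the level we *expect* the server to be running...
        SceneManager.sceneLoaded += OnSceneLoaded;
        SceneManager.LoadScene("InitialScene", LoadSceneMode.Single);
    }

    void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        SceneManager.sceneLoaded -= OnSceneLoaded;

        // ...and only connect once the scene NetworkIdentities are present.
        NetworkManager.singleton.StartClient();
    }
}
```

If the server changed scenes between the client loading the level and StartClient() connecting, the client is sitting in the wrong scene, which is why this only mostly works.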

I appreciate your time to help me out.

Yes, that’s a HLAPI bug. I fixed it in HLAPI Pro: https://forum.unity3d.com/threads/unet-hlapi-pro-taking-unet-to-the-next-level.425437/

I just took a look at your drop-in replacement, @mischa2k . I will be evaluating it later today, as this might just be the magic ticket that will put my networking code back on track after being stuck for a while.


Alright, after some experiments yesterday I was still able to get the UNet sync message error on Unity 2017.2. However, this was probably because my hacked-up code was trying to do too many things at once and caught fire.

Did you report it to Unity?

Not yet - I was putting your DLL in the wrong place.