Master server usage causes severe slowdown

I was experimenting with Unity networking engine and came across a very odd problem. It is: in approximately 10 seconds after MasterServer.RegisterHost() call, Unity 3.5.7 starts consuming CPU like crazy, and my framerate drops drastically. If I look at the profiler, it states that Overhead is > 80%. Simply removing MasterServer.RegisterHost() from my code solves the problem, so that clearly seems like a bug to me, as there is no logical connection between massive framerate drop and master server communication…

Sample code:

using UnityEngine;

public class MasterServerBug : MonoBehaviour {
    private const string GameType = "SomeTestGame";
    private int _remotePort = 25000;

    void OnGUI() {
        // Check whether we are connected to the server or not
        if (Network.peerType == NetworkPeerType.Disconnected) {
            // Not connected: offer to start a server
            if (GUI.Button(new Rect(10, 10, 100, 30), "Start Server")) {
                // Initialize the server and register it at the master server
                Network.InitializeServer(4, _remotePort, false);
                MasterServer.RegisterHost(GameType, "My server", "test");
            }
        } else {
            if (GUI.Button(new Rect(10, 10, 100, 50), "Disconnect")) {
                // Unregister from the master server and shut the server down
                MasterServer.UnregisterHost();
                Network.Disconnect(200);
            }

            if (GUI.Button(new Rect(120, 10, 150, 50), "Unregister server")) {
                // Only unregister the game from the master server
                MasterServer.UnregisterHost();
            }
        }
    }
}

Just attach the script to the Main Camera in an empty project.
As you can see, the code is dead simple. The steps to reproduce the situation are:

  1. Start server
  2. Wait for 10 seconds or so
  3. Notice a drastic framerate drop out of nowhere (the FPS counter sketch below makes this easy to see without the profiler)
  4. Click the “Unregister server” button and notice that the framerate is OK again
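
If you'd rather not open the profiler at all, a rough on-screen FPS counter like this one also shows the drop; this is just my own little helper, not part of the repro, and the class name is made up:

using UnityEngine;

// Simple smoothed FPS display; drop it on any object in the repro scene.
public class FpsDisplay : MonoBehaviour {
    private float _smoothedDelta;

    void Update() {
        // Exponentially smooth the frame time so the number stays readable
        _smoothedDelta = Mathf.Lerp(_smoothedDelta, Time.deltaTime, 0.1f);
    }

    void OnGUI() {
        float fps = _smoothedDelta > 0f ? 1f / _smoothedDelta : 0f;
        GUI.Label(new Rect(10, 70, 200, 30), "FPS: " + fps.ToString("F1"));
    }
}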

I’ve also attached a screenshot of the profiler taken while reproducing these steps. You can clearly see the huge Overhead appear, and how everything goes back to normal after clicking the “Unregister server” button.

[Attached screenshot: bug.png — profiler showing the Overhead spike]

Bump. Can anyone confirm this? I can’t believe nobody is using the master server…

You’re calling it endlessly, sometimes many times per frame.

That’s not true. I’ve even tried the following barebones code (it’s the only code in the project, attached to the Main Camera):

using UnityEngine;

public class Test : MonoBehaviour {

    void Start() {
        Network.InitializeServer(4, 25000, false);
        MasterServer.RegisterHost("SomeTestGAME", "My server", "test");
    }
}

It’s obviously called only once (and I can see that in the Console). The result is exactly the same: a ~10x framerate drop after about 10 seconds. Simply commenting out the MasterServer.RegisterHost() call fixes it… It just makes no sense. I’ve tested this on a few different machines and the result was the same.

Sorry for the bump, but has anyone found a solution to this? I’m running 4.1 now and have the same issue. The game starts at approximately 60 fps and then, after ~10 seconds, drops to about 40 fps.

I have done the same as above and tried commenting out the RegisterHost call: I don’t suffer the fps drop, but when I un-comment it, the framerate drops again.

Edit: I should add that I’m definitely only calling it once in my code.
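
(For anyone running the same comparison, an inspector flag saves commenting and un-commenting the call each time; this is only a sketch, and the class and field names are my own:)

using UnityEngine;

public class RegisterHostToggle : MonoBehaviour {
    // Tick this in the inspector to include the RegisterHost call in a test run
    public bool registerAtMasterServer = true;

    void Start() {
        Network.InitializeServer(4, 25000, false);
        if (registerAtMasterServer) {
            MasterServer.RegisterHost("SomeTestGAME", "My server", "test");
        }
    }
}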

No, and it seems like the Unity devs don’t care about it… There is also a similar thread, so it’s not an isolated case.
I don’t know how to draw more attention to this issue… It’s an obvious bug, it has been reported multiple times, and there’s still not a single answer from the dev team.

That is a shame, but at least I’m thankful to have come across your post, because otherwise I would still be clueless about what was causing it. Ah well, we can only hope that it gets sorted eventually. I quite like Unity’s networking features and would rather not have to start looking for alternatives, as I have enough to contend with at the moment.

Thanks

Hello there, I’m having the same issue: the FPS drops by half (from 1000 fps to 450 fps) 10 seconds after the MasterServer.RegisterHost() function is called. I also tried registering on my own master server built from the sources provided, but the fps drop still occurs. Have you figured it out yet, ZimM?
This looks like quite an important problem.
Also note that trying to register without an internet connection doesn’t drop the FPS.
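
In case it helps anyone else test against their own instance, pointing Unity at a self-hosted master server looks roughly like this; the addresses and ports here are just placeholders for wherever the MasterServer/Facilitator sources are running:

using UnityEngine;

public class CustomMasterServerConfig : MonoBehaviour {
    void Awake() {
        // Placeholders: point these at the machine running the MasterServer sources
        MasterServer.ipAddress = "192.168.1.10";
        MasterServer.port = 23466;
        // The facilitator usually runs alongside it and handles NAT punchthrough
        Network.natFacilitatorIP = "192.168.1.10";
        Network.natFacilitatorPort = 50005;
    }
}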

Kushulain

No, I haven’t found any solution… I’ve actually abandoned using MasterServer for this reason :(

I noticed the framerate drop is much smaller at a “normal” rate (60 fps), since the overhead appears to be a fixed amount of time per frame: about 1.5 ms on my computer, and about 3 ms on yours, it seems. At 1000 fps it drops down to 450, but at 50 fps it would only drop to about 47. So it’s not such a big deal actually; it only adds a small, fixed amount of time.
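
To put rough numbers on that (assuming the cost really is a fixed slice of time added to every frame; the helper name is just mine):

public static class OverheadMath {
    // With ~1.2-1.5 ms of fixed overhead per frame:
    //   1000 fps (1 ms/frame)  -> 1 + ~1.2-1.5 ms  -> roughly 400-450 fps
    //     50 fps (20 ms/frame) -> 20 + ~1.2-1.5 ms -> roughly 46-47 fps
    public static float FpsWithOverhead(float baseFps, float overheadMs) {
        // Convert to frame time, add the fixed overhead, convert back to fps
        return 1000f / (1000f / baseFps + overheadMs);
    }
}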

Furthermore, since there aren’t many objects in my scene, my app is CPU-limited rather than GPU-limited, which is why the gap looks even deeper. Later, once the scene is filled with more graphical objects, I expect the drop to be fairly insignificant.

But still, it eats performance, and more of it than I expected.

1.5 ms out of nowhere is a lot. On mobile devices that’d probably be much longer, and I can’t just spend mobile resources for nothing. That’s why I believe this bug is actually quite severe.

I ran into this problem too, but I didn’t know that the overhead was caused by the master server stuff.
After I removed it, the overhead (3-6 ms) disappeared. Fortunately I don’t need this feature, but others might.