How do I calculate…

Just saw the pricing for the Multiplayer Services.

Would like to understand the best way of calculating:

  1. Messages/Player/Second
  2. Size/Message (Bytes)

The page has a calculator built in?

I know, but how do I know the number of messages and their size?

I believe they intend to add that kind of debugging to UNet at some point, so you will be able to query how much data is sent. Until then, you already know what you are sending and how frequently from the code you write… you are in charge of it. But I guess that’s pretty tricky to keep track of by hand. Is this what you meant?
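If you wanted to track it yourself in the meantime, here is a minimal sketch of that bookkeeping (Python purely for illustration; UNet itself is C#, and the class here is hypothetical, but the idea translates directly to wrapping your own send calls):

```python
import time

class BandwidthTally:
    """Hypothetical per-second tally of outgoing messages and bytes.
    Call record_send() with every payload your own code sends."""

    def __init__(self):
        self.window_start = time.monotonic()
        self.bytes_this_second = 0
        self.msgs_this_second = 0

    def record_send(self, payload: bytes) -> None:
        now = time.monotonic()
        if now - self.window_start >= 1.0:
            # Report the completed one-second window, then reset.
            print(f"{self.msgs_this_second} msgs/sec, "
                  f"{self.bytes_this_second} bytes/sec")
            self.window_start = now
            self.bytes_this_second = 0
            self.msgs_this_second = 0
        self.bytes_this_second += len(payload)
        self.msgs_this_second += 1
```

That gives you exactly the two numbers the calculator asks for: messages per player per second, and bytes per message (total bytes divided by message count).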

Yes, thanks.

I think you can use Process Explorer (Sysinternals | Microsoft Learn) to check the network sent and received bytes per second for your build.

There’s a network tab to check delta received and sent.

However, using this, I see that my sent rate floats around 800 bytes per second, and my received rate around 1.2 KB per second, for a build. If I calculate my sent and received traffic from what I’ve written, as suggested above, it’s vastly lower.

I’m getting these numbers with just a host and a client, testing over UNet relay servers. All I have sending and receiving from my own code at the moment is some custom movement-syncing code and 2 SyncVars.

The movement code should only be sending 90 bytes per second for inputs, and receiving 200 bytes per second for resulting positions.

So I’m guessing the rest of that data is something UNet sends and receives by default.
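To put a rough number on that gap (Python just as a calculator, using my figures from above):

```python
# What my own code should be sending vs. what Process Explorer reports.
expected_sent = 90    # bytes/sec of input messages from my movement code
observed_sent = 800   # bytes/sec reported by Process Explorer

# The difference is presumably UNet/relay overhead: packet headers,
# acks, connection keep-alives, and the relay protocol itself.
overhead = observed_sent - expected_sent
print(f"~{overhead} bytes/sec ({overhead / observed_sent:.0%}) unaccounted for")
```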

Being rather new to networking, is this also the packet size (MTU)? I’ve read you want to keep each packet under 1500 bytes, and if these per-second figures were per-packet, I’d already be extremely close.
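Here’s how I’m currently trying to keep the units straight (example numbers made up; someone correct me if this is wrong):

```python
# Bytes per second is throughput; the ~1500-byte figure is a per-packet
# limit (a typical Ethernet MTU), not a per-second budget.
packets_per_second = 10   # e.g. one state update per network tick (made up)
bytes_per_packet = 80     # payload plus headers, well under ~1500 (made up)
print(f"throughput: {packets_per_second * bytes_per_packet} bytes/sec, "
      f"packet size: {bytes_per_packet} bytes vs. the ~1500-byte MTU")
```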


Here is my analysis from another thread here in the forum. Perhaps this might help ya.


So I’ve also been trying to wrap my head around the operating costs of using the Unity Multiplayer service, and here’s what it boils down to (note that, according to the Relay Server Bandwidth Explanation thread here on Unity Discussions, bandwidth seems to be calculated only from outgoing traffic at the relay server).

  1. My game has a maximum of 8 players per game. We’ll assume the worst case that all games are always full.
  2. In the worst case, each client sends movement data at a rate of 20 msgs/sec with a payload size of approximately 32 bytes per packet (sizeof(Vector3) + sizeof(float) + overhead padding).
  3. The server streams all players’ movement data to each other client.
  4. This means the outgoing traffic from the server to each individual client is approximately 8 players * 20 msgs/sec * 32 bytes = 5,120 bytes/sec.
  5. This means the total outgoing traffic from the server is approximately 5,120 bytes/sec * 8 clients ≈ 40 KB/sec. Now, from my testing I’ve seen it’s significantly lower due to optimizations, but we’ll just go with this.
  6. 40 KB/sec is 0.00004096 GB/sec, which works out to $0.073728 per 8-player game hour, i.e. $0.009216 per player per hour. Double-checking these numbers against the price calculator here (Game Development Software: Build a Multiplayer Game | Unity) lines up fairly well. The script after this list reruns the same arithmetic.
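Here’s that worst-case arithmetic as a quick script (the $0.50/GB rate is my assumption, back-calculated from the $0.073728 figure; check the pricing page for the actual current rate):

```python
# Worst-case bandwidth and cost model from the list above.
players = 8
msgs_per_sec = 20     # movement updates sent by each client
bytes_per_msg = 32    # Vector3 + float + overhead padding

# The relay bills outgoing traffic only; every client is streamed
# every player's movement data.
out_per_client = players * msgs_per_sec * bytes_per_msg  # 5,120 bytes/sec
out_total = out_per_client * players                     # 40,960 bytes/sec

gb_per_hour = out_total * 3600 / 1e9                     # ~0.147 GB/hour
price_per_gb = 0.50   # ASSUMED: back-calculated, not quoted from Unity
cost_per_game_hour = gb_per_hour * price_per_gb          # ~$0.0737
cost_per_player_hour = cost_per_game_hour / players      # ~$0.0092
print(f"${cost_per_game_hour:.6f} per game-hour, "
      f"${cost_per_player_hour:.6f} per player-hour")
```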

So, all this being said, and after some thought, the metric I’ve been looking at most closely to evaluate the financial viability of using UNET is basically the number of hours a player has to play before I start losing money on them. At $0.009216 per player-hour, they get 108 hours of game time before they cost me $1. If I charge $4.99 for my game, minus a 33% cut, they can play about 360 hours before I lose money on them. $1.99 for my game? About 145 hours of game time. I have a feeling that’s quite a bit more time, on average, than players will spend on a single game, especially an indie like mine.
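The break-even sums as a script (prices and the 33% cut are just my example figures):

```python
cost_per_player_hour = 0.009216   # from the bandwidth model above

for price in (4.99, 1.99):
    net = price * (1 - 0.33)            # revenue left after a 33% store cut
    hours = net / cost_per_player_hour  # playtime before the player costs more
    print(f"${price:.2f} game: break-even at ~{hours:.0f} player-hours")
```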

For the sake of argument, let’s say I’ve made a mistake in my arithmetic (please, please call me out on it, I’d adore you) and I’m off by two orders of magnitude, i.e. they have 1 hour of game time before they cost me $1. If I charge $4.99 for my game, minus a 33% cut, they can play for 3.3 hours before it’s unprofitable for me… it’s bad, but I do wonder how many folks buy a cheap game like that, play it for an hour or two, and never pick it up again.

So anyway, I was kind of freaking out about this pricing yesterday, but after going through this exercise I’m starting to think perhaps it’s not completely unviable - or perhaps I’m just really bad at math? Anyone mind double-checking me here?


Calculators are fun! Anybody wanna work on my game with me?


@GrymmyD
All bandwidth is indeed calculated outgoing only.

I think your summation is generally correct for most games. 40 KB/sec may even be high for most games. In fact, all the games I’ve worked on in the past have used under 4 KB/sec per player for the core game data. The reason is that you need to provide an acceptable experience to all players regardless of their connection to the internet. A lot of the world is still on mediocre internet connections, whether desktop or mobile.

4 KB/sec is a pretty safe bet across a very large range of connections, guaranteeing your game is playable by the largest number of people.
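For comparison, 4 KB/sec per player fed through the earlier cost model (same assumed $0.50/GB rate as in the sketch above):

```python
budget_bytes_per_sec = 4096   # the per-player figure suggested above
price_per_gb = 0.50           # ASSUMED rate, carried over from the earlier sketch
cost_per_player_hour = budget_bytes_per_sec * 3600 / 1e9 * price_per_gb
print(f"~${cost_per_player_hour:.4f} per player-hour")  # ~$0.0074
```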

Thanks for verifying that.

Yeah, so I’m not that far off from reasonable, I guess, seeing as the 40 KB/sec number is just my worst-case total outgoing send rate from the server (40 KB/sec / 8 players = 5 KB/sec served to each client). Plus, like I said, these are my extreme, unoptimized worst-case scenarios of full servers where every player entity is in motion and turning quickly.