Hello!
Firstly, I’m quite new to the LLAPI, so apologies if I’m missing something obvious.
I’m hosting a game locally and connecting just one client. Currently I’m not sending any data, so I figured this would be a good time to find out what the ‘baseline’ bytes per second is. I calculate this rather simply:
// Total outgoing bytes the LLAPI reports for this host so far.
int TotalBytes()
{
    byte error;
    return NetworkTransport.GetOutgoingFullBytesCountForHost(socketID, out error);
}

// Samples the counter once per second and stores the delta.
IEnumerator ByteSecondCounter()
{
    int lastTotalBytes = TotalBytes();
    while (true)
    {
        yield return new WaitForSecondsRealtime(1);
        int totalBytes = TotalBytes(); // read once so the delta and the stored value agree
        bytesPerSecond = totalBytes - lastTotalBytes;
        lastTotalBytes = totalBytes;
    }
}
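For context, the host and client setup is nothing unusual, roughly along these lines (the port, address, and max-connection values are just placeholders for what I’m actually using, and socketID / connectionID / unreliableChannel are fields on the same MonoBehaviour as the counter coroutine):

// These snippets assume "using UnityEngine.Networking;" and the fields
//   int socketID;
//   int connectionID;
//   byte unreliableChannel;

void StartHost()
{
    NetworkTransport.Init();

    ConnectionConfig config = new ConnectionConfig();
    unreliableChannel = config.AddChannel(QosType.Unreliable);

    HostTopology topology = new HostTopology(config, 1); // room for the single client
    socketID = NetworkTransport.AddHost(topology, 8888); // placeholder port
}

void StartClient()
{
    NetworkTransport.Init();

    ConnectionConfig config = new ConnectionConfig();
    unreliableChannel = config.AddChannel(QosType.Unreliable);

    HostTopology topology = new HostTopology(config, 1);
    socketID = NetworkTransport.AddHost(topology); // client uses an ephemeral local port

    byte error;
    connectionID = NetworkTransport.Connect(socketID, "127.0.0.1", 8888, 0, out error);
}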
Currently what I’m seeing is:
Host: 54 B/s
Client: 150 B/s
Using Wireshark, I can see that twice every second the host and client send each other 27 bytes of data. I assume this is some sort of connection polling. That accounts for the 54 B/s I see on the host, but not the additional 96 B/s (an extra 48 bytes per message) on the client. Am I missing something?
I should mention I tried every variant of GetOutgoingFullBytesCount and saw no difference.
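For reference, this is roughly how I compared the variants (connectionID being the connection id for the single client):

void LogOutgoingCounters()
{
    byte error;
    int total = NetworkTransport.GetOutgoingFullBytesCount();
    int forHost = NetworkTransport.GetOutgoingFullBytesCountForHost(socketID, out error);
    int forConnection = NetworkTransport.GetOutgoingFullBytesCountForConnection(socketID, connectionID, out error);

    // With a single host and a single connection, all three report the same totals.
    Debug.Log("total: " + total + ", host: " + forHost + ", connection: " + forConnection);
}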
What concerns me is that this isn’t just baseline traffic but an overhead on every message the client sends:
When sending a simple 18-byte test message over an unreliable channel, I see the host jump to 72 B/s (which is what I’d expect), whereas the client jumps to 216 B/s (48 additional bytes per message, consistent with the baseline overhead).
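For completeness, the test send is just a plain Send on the unreliable channel, roughly like this (the buffer contents are arbitrary):

void SendTestMessage()
{
    byte[] buffer = new byte[18]; // 18-byte payload; contents don't matter for this test
    byte error;
    NetworkTransport.Send(socketID, connectionID, unreliableChannel, buffer, buffer.Length, out error);
}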
So where are these additional 48 bytes coming from?
Thanks for any help,
Myles