So, my current client/server approach uses sockets, with C# 4.0 on the server side. I build a packet that starts with a one-byte packet type (player movement, player gets item, etc.).
What follows is the serialized information for the actual data/parameters of that packet type. I serialize and deserialize the information myself, using BinaryWriter and BinaryReader.
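To make that concrete, here's a rough sketch of the kind of serialization I mean. The enum names and the 5-byte pickup layout (1 type byte + 4-byte item id) are just illustrative, not my actual protocol:

```csharp
using System.IO;

// Hypothetical packet types; the real game has its own set.
enum PacketType : byte
{
    PlayerMovement = 1,
    ServerPickupItem = 2,
}

static class PacketCodec
{
    // Serialize a pickup request: one type byte followed by the item id.
    public static byte[] WritePickup(int itemId)
    {
        using (var ms = new MemoryStream())
        using (var w = new BinaryWriter(ms))
        {
            w.Write((byte)PacketType.ServerPickupItem); // 1 byte
            w.Write(itemId);                            // 4 bytes, little-endian
            return ms.ToArray();                        // 5 bytes total
        }
    }

    // Deserialize: read the type byte first, then the payload for that type.
    public static int ReadPickupItemId(byte[] buffer)
    {
        using (var ms = new MemoryStream(buffer))
        using (var r = new BinaryReader(ms))
        {
            var type = (PacketType)r.ReadByte();
            if (type != PacketType.ServerPickupItem)
                throw new InvalidDataException("unexpected packet type");
            return r.ReadInt32();
        }
    }
}
```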
My initial effort has consisted of something like this:
Client → Server: Server_PickupItem, Id XXXX
Server does the processing on its end: verifies the player actually CAN pick the item up, transfers ownership, etc.
Server → Client: Client_PickUpItem, Id xxxx
Client does its thing: the visual change of the player picking up the item.
My question:
This works fine…so far. My mind was wandering toward optimizations. The actual pickup-item payload may be small, maybe 5 bytes, and sending each one out immediately feels wasteful: that's 5 bytes each way PLUS the TCP/IP overhead (around 40 bytes of TCP and IPv4 headers per segment before any payload). Why not queue up the requests for a player and only send them out every X milliseconds, reducing bandwidth? So every 100 milliseconds, I'd send out everything queued up as one packet. For a small number of players, I don't see it making much difference.
For a LOT of players, I think it could be huge.
I realized this sounded familiar…this is apparently what Nagle's algorithm does.
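For reference, .NET exposes this directly: `TcpClient.NoDelay` (and the underlying `Socket.NoDelay`) turns Nagle's algorithm off per socket, so by default the stack is already coalescing my small writes. A sketch:

```csharp
using System.Net.Sockets;

static class NagleToggle
{
    // Creates a client with Nagle's algorithm disabled, so each Send goes
    // out immediately instead of being coalesced by the TCP stack.
    // Leaving NoDelay at its default (false) keeps Nagle's batching.
    public static TcpClient CreateNoDelayClient()
    {
        var client = new TcpClient();
        client.NoDelay = true;
        return client;
    }
}
```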
So, my question is…would I just be duplicating what Nagle's algorithm already does by doing this on my end? Each player would have a dedicated queue of packet data on the server. If it sounds like a good idea, any suggestions on implementation?
I was thinking of running this on a separate thread that continually checks each player's queued data to see if it's time to pump it out. I guess one downside would be that it adds up to 100 milliseconds of extra latency.
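Roughly what I'm picturing, as a sketch (the class and names are made up, and a real version would need length-prefixed framing so the client can split the batch back into individual packets):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical per-player outgoing queue. A background thread would call
// Flush() on every player's queue every ~100 ms and hand the result to
// one Socket.Send call, instead of one Send per packet.
class PlayerSendQueue
{
    private readonly List<byte[]> _pending = new List<byte[]>();
    private readonly object _lock = new object();

    // Called from game logic whenever a packet is ready for this player.
    public void Enqueue(byte[] packet)
    {
        lock (_lock) _pending.Add(packet);
    }

    // Coalesces everything queued since the last flush into one buffer;
    // returns null when there is nothing to send.
    public byte[] Flush()
    {
        List<byte[]> batch;
        lock (_lock)
        {
            if (_pending.Count == 0) return null;
            batch = new List<byte[]>(_pending);
            _pending.Clear();
        }
        int total = 0;
        foreach (var p in batch) total += p.Length;
        var buffer = new byte[total];
        int offset = 0;
        foreach (var p in batch)
        {
            Buffer.BlockCopy(p, 0, buffer, offset, p.Length);
            offset += p.Length;
        }
        return buffer;
    }
}
```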
Any thoughts or suggestions on this would be very welcome.