Synchronization of sounds and effects for entities in multiplayer

Hi everyone!

We’re developing a game with two modes: single-player and multiplayer. To keep it brief, here’s a quick description: the player(s) spawn in an arena where enemies (monsters) start spawning around them. Each player controls a hero with abilities to defeat these enemies. The gameplay is enhanced with various sound effects and VFX. At any given moment, there are about 50–100 enemies in the scene alongside the hero.

Currently, sounds and effects are implemented as components added to entities. For instance, if the damage system needs to play a sound, it writes the sound ID and other necessary data into a ghost-synced component of the entity. A client-side system then reads this data and plays or stops the sounds accordingly.
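Roughly, the component looks something like this (a simplified illustration assuming Netcode for Entities; the actual names and fields in our project differ):

```csharp
using Unity.Entities;
using Unity.NetCode;

// Simplified illustration of the ghost-synced component: the server
// writes a sound ID, the ghost system replicates it, and a client-side
// system reads it and starts/stops playback.
public struct SoundRequest : IComponentData
{
    [GhostField] public int SoundId;   // which clip to play (0 = none)
    [GhostField] public byte Sequence; // bumped per trigger so repeated sounds replicate
}
```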

Problems arise, for example, when trying to play a sound for an enemy’s death. I can’t store the sound on the enemy entity itself, as the entity will be destroyed, taking all its associated data with it. I also can’t “extend” the life of the entity just to ensure the client-side sound system has time to process it, because other systems will also process the entity during the update cycle, potentially leading to numerous bugs.

My question:
How can I properly organize “state” storage and ensure accurate data transmission between the client and server for effects or sounds? I’m looking for a general architectural solution that has been successfully applied to similar problems.

Potential solutions I’ve considered:

  1. Creating clone entities that “follow” the main entities throughout their lifecycle and are destroyed at the appropriate time. (I dislike this option because it involves numerous structural changes and increases the total number of entities significantly.)
  2. Using RPC commands to handle effects and sounds. (However, the documentation suggests RPCs aren’t intended for this purpose, and triggering 30–50 RPC commands per second doesn’t seem like a good idea.)

A small list of requirements for the system:

  • It should handle sounds and effects triggered at entity creation, during the entity’s lifecycle, and immediately after the entity is destroyed.
  • It must support “coordinate synchronization”: effects or 3D sounds should appear at the entity’s position.
  • It should work correctly in both single-player and multiplayer modes.

Any advice or examples of architecture that effectively solve these challenges would be greatly appreciated. Thanks in advance!

I hope that technically it’s just one mode. :wink:

Your singleplayer should be a hosted game, with just the host playing, and the transport configured to use address 127.0.0.1:0 to ensure no outside connections are accepted. Then you only need to maintain a single codebase since everything that works networked in a client-hosted game will by definition work the same way in a host-only game.
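A rough sketch of that bootstrap, assuming Netcode for Entities 1.x (NetworkStreamDriver, NetworkEndpoint; older Transport versions spell the type NetworkEndPoint):

```csharp
using Unity.Entities;
using Unity.NetCode;
using Unity.Networking.Transport;

// Sketch: host a "single-player" session that only listens on loopback,
// so no outside connection can ever be accepted. A fixed port is used
// for simplicity; with port 0 (OS-assigned, as suggested above) you'd
// have to query the bound port back from the driver before connecting.
public static class LocalHostBootstrap
{
    public static void StartLocalSession(World serverWorld, World clientWorld, ushort port = 7979)
    {
        var endpoint = NetworkEndpoint.LoopbackIpv4.WithPort(port);

        using var serverQuery = serverWorld.EntityManager
            .CreateEntityQuery(typeof(NetworkStreamDriver));
        serverQuery.GetSingletonRW<NetworkStreamDriver>().ValueRW.Listen(endpoint);

        using var clientQuery = clientWorld.EntityManager
            .CreateEntityQuery(typeof(NetworkStreamDriver));
        clientQuery.GetSingletonRW<NetworkStreamDriver>().ValueRW
            .Connect(clientWorld.EntityManager, endpoint);
    }
}
```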

This is a common misconception. An actor’s “death” doesn’t actually end its life.
Death != Destroy() :wink:

Well, it normally would in the real world, which is why so many just don’t see this, but … your actor should have a life beyond death. It merely changes state. The state progression goes like this: spawn - alive - death - corpse - decay - destroy / return to pool.

Consider a regular FPS game enemy. You shoot an actor until its hitpoints fall to or below zero, then:

  • the enemy enters its death state
    • this may involve playing a death animation or turning the avatar into a ragdoll (disabling the Animator)
    • the enemy stops performing its “alive” logic (e.g. it stops shooting)
  • the corpse is then lying on the floor for some time
  • then the corpse starts to decay; most commonly it simply sinks through the floor for a few seconds or fades to full transparency
  • after the decay time has elapsed, the enemy is either destroyed or returned to the pool, with its state reset as needed

All throughout this time, the GameObject or Entity will not be destroyed or removed from the simulation!

You only need to engineer GameObject components so that they can be enabled and disabled. On the Entities side, you’d add or remove an Entity’s Alive/Dead tag component, or, quite commonly, simply remove the “Hitpoints” component from the Entity, since an Entity without hitpoints has by definition to be considered dead.
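Here is a minimal sketch of the tag-component variant (all names, Hitpoints, Alive, Decay, are placeholders; assuming Unity Entities 1.x), implementing the alive - death - decay - destroy progression from above:

```csharp
using Unity.Collections;
using Unity.Entities;

// Illustrative components; the names are placeholders, not a prescribed API.
public struct Hitpoints : IComponentData { public float Value; }
public struct Alive : IComponentData { }                            // tag: runs "alive" logic
public struct Decay : IComponentData { public float SecondsLeft; }  // corpse timer

// When hitpoints reach zero, swap the Alive tag for a Decay timer.
// Every "alive logic" system that queries WithAll<Alive>() now skips
// this entity automatically because its archetype changed - no other
// code has to check a "dead" flag.
public partial struct DeathSystem : ISystem
{
    public void OnUpdate(ref SystemState state)
    {
        var ecb = new EntityCommandBuffer(Allocator.Temp);
        foreach (var (hp, entity) in SystemAPI.Query<RefRO<Hitpoints>>()
                     .WithAll<Alive>().WithEntityAccess())
        {
            if (hp.ValueRO.Value <= 0f)
            {
                ecb.RemoveComponent<Alive>(entity);
                ecb.AddComponent(entity, new Decay { SecondsLeft = 5f }); // corpse lifetime
            }
        }
        ecb.Playback(state.EntityManager);
        ecb.Dispose();
    }
}

// Only after the decay time has elapsed is the entity actually
// destroyed (or returned to a pool with its state reset).
public partial struct DecaySystem : ISystem
{
    public void OnUpdate(ref SystemState state)
    {
        var ecb = new EntityCommandBuffer(Allocator.Temp);
        var dt = SystemAPI.Time.DeltaTime;
        foreach (var (decay, entity) in SystemAPI.Query<RefRW<Decay>>().WithEntityAccess())
        {
            decay.ValueRW.SecondsLeft -= dt;
            if (decay.ValueRW.SecondsLeft <= 0f)
                ecb.DestroyEntity(entity); // or return it to the pool instead
        }
        ecb.Playback(state.EntityManager);
        ecb.Dispose();
    }
}
```

In a networked game you’d run the destruction on the server and let the ghost system replicate the despawn, so clients get the whole corpse/decay window to play their effects.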

The “loss of data” you mention is another reason to keep your enemy alive, albeit dead. Sounds weird, I know, but you’ll get it in no time. :upside_down_face:
The numerous bugs you expect are something you’ll have to assess and, if necessary, refactor the code for.
Transferring components to another Entity would also work, but I only recommend this if you absolutely want to avoid refactoring existing code (e.g. an upcoming release, if not already released). It does, however, introduce extra housekeeping effort.

The only alternative I would consider, at least for the audio, is an event system of sorts.

The pattern goes like this (a code sketch follows the list):

  • server determines that actor A must die
  • this change of state is synchronized to clients
  • when a client processes the death of actor A, it raises an “ActorDied” event - I suppose this will happen in the GameObject world.
  • the actor is referenced in the event and should still be valid at this point - the audio system can now pick up the event, get the sound(s) from the actor as well as the actor position, and queue both for playback
  • Actor gets destroyed but the data necessary to play sound is already copied into the sound system
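Sketched in Entities terms (DeathSound and SoundEvent are names I made up; it assumes a singleton entity carrying the SoundEvent buffer exists, and it reuses the Decay state from the earlier sketch):

```csharp
using Unity.Collections;
using Unity.Entities;
using Unity.Mathematics;
using Unity.Transforms;

// Hypothetical per-enemy component holding the clip to play on death.
public struct DeathSound : IComponentData { public int SoundId; }

// Queued event: everything the audio system needs, decoupled from the entity.
public struct SoundEvent : IBufferElementData
{
    public int SoundId;
    public float3 Position;
}

// Client-side: when the replicated death state is processed, copy the
// sound ID and position into a singleton event buffer, then strip the
// DeathSound component so the event fires exactly once. From here on,
// the entity can be destroyed at any time - the audio data has already
// been copied out of it.
public partial struct DeathSoundEventSystem : ISystem
{
    public void OnUpdate(ref SystemState state)
    {
        var events = SystemAPI.GetSingletonBuffer<SoundEvent>();
        var ecb = new EntityCommandBuffer(Allocator.Temp);
        foreach (var (sound, transform, entity) in SystemAPI
                     .Query<RefRO<DeathSound>, RefRO<LocalTransform>>()
                     .WithAll<Decay>()  // the entity just entered its dead/decay state
                     .WithEntityAccess())
        {
            events.Add(new SoundEvent
            {
                SoundId = sound.ValueRO.SoundId,
                Position = transform.ValueRO.Position,
            });
            ecb.RemoveComponent<DeathSound>(entity); // one-shot
        }
        ecb.Playback(state.EntityManager);
        ecb.Dispose();
    }
}
```

A managed audio system (in the GameObject world) can then drain this buffer every frame and play the clips at the stored positions, long after the source entity is gone.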

Thank you for your response, CodeSmile!

I had a similar idea at some point, but I think I dismissed it because I thought it would require modifying too many systems and potentially “break” them.

Now, thanks to you, I realize that this solution is actually quite logical and straightforward. By properly disabling the key components of the entity (without deleting the entity itself), the code in other systems should remain unaffected (“dead” entities will no longer match the archetypes that those systems’ jobs query).
