On the client, scene objects seem to have about a 50% chance of working versus just staying disabled. Anyone know why?
EDIT: docs
https://docs.unity3d.com/Manual/UNetSceneObjects.html
EDIT2:
Does anyone know how scene object netIds are generated? How do the client and server agree on which game objects to synchronise?
Short answer - the server initializes the netIds and sends them to the clients, where they are applied to the matching objects in the scene.
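A minimal sketch of the server side (assuming the old UNet HLAPI and a made-up port number): once the server is listening and the scene is loaded, NetworkServer.SpawnObjects() is what enables the scene objects that have a NetworkIdentity, assigns them netIds and announces them to clients. The built-in NetworkManager already does this for you in its scene-change handling; the snippet only makes the flow explicit.

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Sketch only: NetworkManager normally handles this automatically.
// Shown here to make the netId flow explicit for a manually started server.
public class ManualServerStart : MonoBehaviour
{
    void Start()
    {
        // Hypothetical port; use whatever your project actually listens on.
        if (NetworkServer.Listen(7777))
        {
            // Enables every scene object that has a NetworkIdentity,
            // assigns each one a netId and spawns it for connected clients.
            NetworkServer.SpawnObjects();
        }
    }
}
```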
Tips:
0. Make sure all NetworkIdentities are enabled before the scene loads. NetworkIdentities manage being enabled/disabled on their own (there's a small editor check for this sketched after this list).
If one is disabled at scene load, it simply doesn't exist as far as networking is concerned. Because, you know, networking is hard, doing checks is hard and design is a waste of time.
- Make sure server and client have the same number of NetworkIdentity components in the scene;
- Make sure server and client scripts have the same number of:
  - SyncVars
  - Cmds
  - ClientRpcs
  - SyncLists
- Make sure server and client use SyncVar structs with the same layout. Otherwise deserialization of the whole scene payload will fail (see the illustrative NetworkBehaviour after this list).
- Lastly - sometimes UNetWeaver fails due to Editor serialization and creates multiple instances of classes that contain SyncVars… and then rule #2 is screwed. What to do in that case? A full reimport / Library folder deletion plus a rebuild usually helps. If not, you've probably missed something.
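Regarding tip #0, here's a rough editor helper (my own sketch, not part of UNet) that lists scene NetworkIdentities sitting on inactive GameObjects, so you can catch them before building:

```csharp
#if UNITY_EDITOR
using UnityEditor;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical editor helper: finds scene objects whose NetworkIdentity
// lives on an inactive GameObject. Those never get registered as scene
// objects, so for networking purposes they simply don't exist.
public static class FindDisabledNetworkIdentities
{
    [MenuItem("Tools/Find Disabled NetworkIdentities")]
    static void Find()
    {
        // FindObjectsOfTypeAll also returns inactive objects, unlike FindObjectsOfType.
        foreach (var identity in Resources.FindObjectsOfTypeAll<NetworkIdentity>())
        {
            // Skip prefabs and other assets; only report objects in the open scene.
            if (EditorUtility.IsPersistent(identity.gameObject))
                continue;

            if (!identity.gameObject.activeInHierarchy)
                Debug.LogWarning("Disabled scene NetworkIdentity: " + identity.name, identity.gameObject);
        }
    }
}
#endif
```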
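And to illustrate the "same number of SyncVars/Cmds/ClientRpcs/SyncLists" rules: the script below (names are made up) has to be compiled identically into both the server and the client build. The HLAPI writes scene state positionally, so if one build has an extra SyncVar, or a SyncVar struct with a different field layout, the other side reads the payload at the wrong offsets and deserialization of the whole scene state falls over.

```csharp
using UnityEngine.Networking;

// Structs used as SyncVars must have the same field layout on both builds.
// Adding or removing a field on only one side shifts every byte after it.
public struct PlayerStats
{
    public int health;
    public int armor;
}

public class ExampleSync : NetworkBehaviour
{
    [SyncVar] public int score;
    [SyncVar] public PlayerStats stats;

    // SyncLists count too: both builds must declare the same ones.
    public SyncListInt inventory = new SyncListInt();

    [Command]
    public void CmdSetScore(int value)
    {
        score = value;   // runs on the server
    }

    [ClientRpc]
    public void RpcFlash()
    {
        // runs on all clients
    }
}
```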
Have fun.
But how does the client decide which game object gets which netId? Is it hierarchy position or a UUID?
EDIT:
I forgot to mention that it works 100% of the time when the server and client are on the same machine; it only fails when running against a remote server.
As far as I remember, it's the assetId and sceneId that determine it.
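If you want to check that yourself, a quick diagnostic (my own throwaway script, not a UNet API) is to dump the sceneId and assetId of every NetworkIdentity in both the server and the client build and diff the output; when the matching breaks, the two builds usually show a different object count, or the same object carrying a different sceneId on each side.

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Throwaway diagnostic: log every active NetworkIdentity with its sceneId
// and assetId. Run it in both builds and compare the output line by line.
public class DumpNetworkIds : MonoBehaviour
{
    void Start()
    {
        foreach (var identity in FindObjectsOfType<NetworkIdentity>())
        {
            Debug.Log(identity.name
                + " sceneId=" + identity.sceneId.Value
                + " assetId=" + identity.assetId);
        }
    }
}
```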