[Unet] Did not find target for sync message for 1. Delayed Spawn information

Sometimes in my client-server application I get a “Did not find target for sync message for 1” warning. I have a server with an OnlineSpectator (NetworkBehaviour) object, which is common and exists as a single instance on the server and all clients. Once a client connects, I want the SyncVar “time” to be synchronized. I am aware the warning appears because OnlineSpectator is disabled on the client while the SyncVar is trying to synchronize: the server sends the client the information to spawn OnlineSpectator, but it is delivered after a few sync messages.
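For reference, here is a minimal sketch of the kind of setup described (the class and field names match the post, but the body is my assumption, not the author's actual code):

```csharp
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch of the OnlineSpectator object described above.
// If the spawn message for this object reaches the client AFTER the
// first state update, the client logs
// "Did not find target for sync message for 1".
public class OnlineSpectator : NetworkBehaviour
{
    [SyncVar]
    public float time;

    [ServerCallback] // runs on the server only
    void Update()
    {
        // Changing a SyncVar on the server queues a state-update
        // message for all clients.
        time = Time.time;
    }
}
```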
Logs here:

I use the Reliable Fragmented channel (0) (because of big data messages) and the HLAPI. Can I ensure the correct order of these messages? It is unacceptable to sync SyncVars and SyncLists before being sure that OnlineSpectator exists on the client.

I will be grateful for your help! :slight_smile:

What I discovered is that the DefaultReliable channel (in the Channels class) is 0 and is used for state updates and spawning.

Channels class:
The id of the default reliable channel used by the UNet HLAPI. This channel is used for state updates and spawning.

public const int DefaultReliable = 0;

Because of this, to get ordered messages we have to set channel 0 to Reliable Sequenced. The spawn message then always arrives before the sync messages and the warning no longer appears.
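A sketch of that channel setup (assuming a custom ConnectionConfig rather than the NetworkManager Inspector; channel ids are assigned in the order AddChannel is called):

```csharp
using UnityEngine.Networking;

// Hedged sketch, not the author's code: make channel 0 ReliableSequenced
// so spawn and state-update messages stay ordered, and move big-data
// traffic to a separate ReliableFragmented channel.
public static class ChannelSetup
{
    public static ConnectionConfig Build()
    {
        var config = new ConnectionConfig();
        byte stateChannel = config.AddChannel(QosType.ReliableSequenced);  // becomes channel 0
        byte bulkChannel  = config.AddChannel(QosType.ReliableFragmented); // becomes channel 1
        return config;
    }
}
```

If you use the NetworkManager instead, the same reordering can be done in the Inspector by ticking Advanced Configuration and editing the QoS channel list there.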

But this creates another problem. I was using channel 0 as Reliable Fragmented due to the big data in the message. I want to send this message on connect, using only the connection id. I use NetworkServer.SendToClient, but it always uses channel 0. Can I bypass this somehow?

Messages ignore the [NetworkSettings(channel = 1)] attribute :frowning:
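(For what it's worth, [NetworkSettings] applies to NetworkBehaviour scripts, not to messages sent directly, which would explain why it is ignored here.) One possible workaround, sketched under the assumption that the fragmented channel is channel 1 (the message type and class name are placeholders, not from the thread): NetworkConnection.SendByChannel takes an explicit channel id, unlike NetworkServer.SendToClient.

```csharp
using UnityEngine.Networking;

// Hedged workaround sketch: look up the connection by id and send the
// big message on a specific channel instead of the default channel 0.
public static class BigDataSender
{
    // Placeholder custom message id, offset past the built-in ones.
    public const short BigDataMsg = MsgType.Highest + 1;

    public static void SendToClient(int connectionId, MessageBase msg, int fragmentedChannel)
    {
        NetworkConnection conn = NetworkServer.connections[connectionId];
        if (conn != null)
            conn.SendByChannel(BigDataMsg, msg, fragmentedChannel);
    }
}
```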

I experienced the same problem. It seems to be caused by several different errors; however, I solved it by setting the Inspector to Debug mode and checking the NetworkIdentity of the prefab the server was spawning.

It had its Scene ID set to 1 where every other prefab had it set to 0. I changed it to 0 and everything worked fine. Not sure if this is the case for everyone who is experiencing this problem, though.