Custom serialization: how much is too much?

Hi,

So I have read arguments on both sides of this.

I use, and want to keep using, ScriptableObject for many things (shared static data, custom enums, delegate objects). But after some development I realize:

  • I am serializing dictionaries because they are so useful in the editor, but I have no idea what performance hit I am taking at runtime.

  • I want to serialize interfaces too, as it’s quite clean to use interfaces as API contracts: imagine a UI script that takes an arbitrary list of items and displays them (see the sketch below). Instead of saying “This UI can take a list of people”, we say “This UI can take any list whose items offer an API to return icons and names”.
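
A rough sketch of that contract idea (hypothetical names; note that Unity cannot serialize an interface-typed field out of the box, which is exactly the tension here):

        using System.Collections.Generic;
        using UnityEngine;

        // The "contract": anything that can provide an icon and a name qualifies.
        public interface IListItem
        {
            Sprite Icon { get; }
            string DisplayName { get; }
        }

        public class ItemListUI : MonoBehaviour
        {
            // An interface-typed list can't be a serialized inspector field without
            // extra machinery, so here it is handed over from code instead.
            public void Display(IReadOnlyList<IListItem> items)
            {
                foreach (var item in items)
                {
                    // ...instantiate an entry prefab and fill in icon/name here...
                    Debug.Log($"{item.DisplayName} ({(item.Icon != null ? item.Icon.name : "no icon")})");
                }
            }
        }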

The problem being: while designers get a powerful editor due to custom serialization, it seems players will take a performance hit.
- But in what ways?
- Anything besides OnAfterDeserialize()?
- Is serializing a dictionary (in a ScriptableObject asset) OK?
- Is serializing interfaces (for MonoBehaviour fields and delegate objects) going too far?

There are many different approaches to serialization, and I’d be glad to discuss ones I’ve encountered if you’d like.

But one common approach to solving the perf problem is to move your serialization work to a separate thread.

The serialization may take longer to complete (usually by just a few frames), and you may not be able to use Unity’s built-in serializer, but it’s a solid and battle-tested way to solve that problem.
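
For example, a minimal sketch with hypothetical names, assuming the save data is plain C# objects and that a thread-safe serializer such as Json.NET is available (Unity’s JsonUtility and anything touching UnityEngine.Object should stay on the main thread):

        using System.IO;
        using System.Threading.Tasks;
        using Newtonsoft.Json;

        [System.Serializable]
        public class SaveData            // plain data only; safe to touch off the main thread
        {
            public int level;
            public float[] position;
        }

        public static class AsyncSaver
        {
            public static Task SaveAsync(SaveData data, string path)
            {
                // Serialize and write on a worker thread so a large payload
                // doesn't stall a frame on the main thread.
                return Task.Run(() =>
                {
                    string json = JsonConvert.SerializeObject(data);
                    File.WriteAllText(path, json);
                });
            }
        }

        // Usage (main thread): resolve the path first, then fire off the task.
        // var task = AsyncSaver.SaveAsync(data, Path.Combine(Application.persistentDataPath, "save.json"));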

Please let me know if you’d like more info. 🙂

Thx! I see it’s a compromise between performance, modularity and designer-friendliness.

But may I ask: does all of the runtime performance penalty take place inside OnAfterDeserialize()?

AKA: can I be certain that, if I keep the processing in OnAfterDeserialize() lean and fast, runtime performance won’t be much of an issue?

Take a simple example: a serializable dictionary is often just serialized as an array of keys and an array of values, and on deserialize we loop over the arrays and add the entries back to the dictionary.

Since we expect these dictionaries to be small (average case < 20 entries, worst case < 1000), we did not bother to move that work to another thread.
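
For reference, a minimal sketch of that keys/values pattern (hypothetical names; older Unity versions only serialize a concrete, non-generic subclass of something like this):

        using System;
        using System.Collections.Generic;
        using UnityEngine;

        [Serializable]
        public class SerializableDictionary<TKey, TValue> : ISerializationCallbackReceiver
        {
            [SerializeField] private List<TKey> _keys = new List<TKey>();
            [SerializeField] private List<TValue> _values = new List<TValue>();

            private Dictionary<TKey, TValue> _dict = new Dictionary<TKey, TValue>();

            public Dictionary<TKey, TValue> Dict => _dict;

            public void OnBeforeSerialize()
            {
                // Flatten the dictionary into two parallel lists Unity can serialize.
                _keys.Clear();
                _values.Clear();
                foreach (var pair in _dict)
                {
                    _keys.Add(pair.Key);
                    _values.Add(pair.Value);
                }
            }

            public void OnAfterDeserialize()
            {
                // Rebuild the runtime dictionary; cost is O(n) in the number of entries.
                _dict = new Dictionary<TKey, TValue>(_keys.Count);
                for (int i = 0; i < _keys.Count; i++)
                    _dict[_keys[i]] = _values[i];
            }
        }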

But doing the same serialization to “any interface” is much more complex. I don’t think the gain in modularity justifies the performance penalty.

So my current thoughts:

  • I have looked at VFW and used Odin, and both of them leave me with a bad taste about complex custom serialization, even with their optimizations.

  • I think that while custom serialization is very useful, self-restraint is also important. In the “interface” case I mentioned, maybe a “duplicate UI MonoBehaviour” is better than some elaborate setup to “make serialization support inheritance”.

Another trick I have tried is:

        // Editor: keep the field serialized (and read-only in the inspector) so it can be inspected.
        #if UNITY_EDITOR
        [SerializeField]
        [ReadOnlyField]
        #else
        // Player builds: skip serialization of this field entirely.
        [NonSerialized]
        #endif
        private EventRegistryDictionary _registry = new EventRegistryDictionary();

EventRegistryDictionary is a serializable dictionary, ReadOnlyField is a custom attribute that prevents editing the serialized field, and the class containing it is a ScriptableObject (saved as an asset).

In order to debug a dictionary data structure at design time, we want to serialize it for inspection.

But at runtime, we don’t want serialization at all for performance reasons.

You cannot be certain of this, no. Even if you were ignoring OnAfterDeserialize() and only serializing/deserializing value types, serialization itself can be slow. The speed will also depend on the number of values you’re serializing. And the I/O part (actually storing your serialized data on disk) can be very slow, depending on your user’s drive speed (I’m assuming you’re not serializing for networking purposes).

Best to keep your OnAfterDeserialize() code as performant as possible, and then also do perf testing on the actual usage in game.
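
One way to do that perf testing (a sketch with hypothetical names, assuming Unity 2019.3+ where Unity.Profiling.ProfilerMarker is available) is to wrap the rebuild work in a marker so its cost shows up in the Profiler on target hardware:

        using Unity.Profiling;
        using UnityEngine;

        public class SomeSerializedAsset : ScriptableObject, ISerializationCallbackReceiver
        {
            static readonly ProfilerMarker s_Deserialize =
                new ProfilerMarker("SomeSerializedAsset.OnAfterDeserialize");

            public void OnBeforeSerialize() { }

            public void OnAfterDeserialize()
            {
                // The marker makes the rebuild cost visible in the Profiler window,
                // so you measure on real hardware instead of guessing.
                using (s_Deserialize.Auto())
                {
                    RebuildRuntimeStructures();
                }
            }

            private void RebuildRuntimeStructures()
            {
                // e.g. turn serialized key/value lists back into a Dictionary
            }
        }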