[Asset] Saving system for DOTS

It’s time to officially announce Saving system for DOTS. (It’s also on sale right now :slight_smile: )

I had already been using it in my own project for a while (for NZSpellCasting), and as that asset grew bigger and bigger, I figured I needed to split it out. I already had very specific requirements (mainly UnsafeLists and a custom serializer/deserializer), and I then spent nearly two months adding a well-structured editor UI and source generators, polishing as much as possible to make the tedious process of saving data as easy as it can be. A big shout-out here to @thelebaron, who was brave enough to take the first dive, helped me find a few nasty bugs, and requested some great UI features.

Saving data is a very peculiar thing, and I was hesitant to even release the asset at all. As time has passed since the initial release, I’m confident in its quality and that this asset can save others a lot of time instead of forcing them to roll their own system.

To make this post longer and more informative about the asset itself, here’s a copy/paste from the Asset Store page. So here goes :smiley:

Saving system for DOTS enables you to easily save and load all your unmanaged data with a UI-driven workflow.

How?

Create a Save Format scriptable object, add the types you want to save in a handcrafted UI, and let the source generator do the heavy lifting of emitting optimized serialization and deserialization code.

Then mark subscene entities or prefabs with a Savable Authoring component and you are done. Saved data is efficiently laid out in a binary format and compressed with LZ4.

Features

:heavy_check_mark: Create save formats in the editor UI
:heavy_check_mark: Save and load data defined in save formats
:heavy_check_mark: Automatically keep track of subscenes
:heavy_check_mark: Ignore fields in structs
:heavy_check_mark: Create migration plans with an intuitive node based approach when the save format has changed
:heavy_check_mark: Prefab migration when prefabs have changed
:heavy_check_mark: Included Savegame viewer for easy debugging
:heavy_check_mark: Full source code included

Links: Documentation | Discord | E-Mail

Automatically keep track of subscenes

  • Automatically load previously open subscenes
  • Instantiated prefabs (from spawners, etc…)
  • Destroyed objects (trees, iron nodes, etc…)

Save and load data defined in save formats

  • global (non-subscene) and subscene entity data
  • Entity struct data with IComponentData
  • Entity buffers with IBufferElementData
  • Component enabled states with IEnableableComponentData
  • Arbitrary primitive struct data
  • Nested support for UnsafeList
  • Custom serializer/deserializer for more complex tasks

This is slick! I just finished writing a first pass on our saving system and am also heavily leveraging source generators. :slight_smile:

Thanks John!

Ah, you’re also mad enough to use source generators. :smiley: But no wonder, knowing what else you’ve already made.
Creating source from ScriptableObject data and triggering the SG on changes is not straightforward at all right now. Is that what you’re also doing?
I wonder how you’ve solved it, because I need to write the SO into a JSON file (with a .cs extension) that is referenced in a csc.rsp file so the source generator picks it up. On changes I also need to add a random string (I use a random GUID) to trick Unity into reloading the SG.
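For the curious, the editor-side half of that trick can be sketched roughly like this. This is a minimal sketch of the idea, not the asset's actual code; the marker strings, `BuildFileContents`, and the file layout are all invented for illustration:

```csharp
using System;
using System.IO;

static class SchemaFileWriter
{
    // Writes the serialized save-format data into a .cs file as comments.
    // The file is listed in csc.rsp as an additional file so the source
    // generator can read it; the random GUID changes the file contents on
    // every write, which forces Unity to re-run the generator.
    public static string BuildFileContents(string json)
    {
        return
            "// SAVE_FORMAT_JSON_BEGIN\n" +
            "// " + json.Replace("\n", "\n// ") + "\n" +
            "// SAVE_FORMAT_JSON_END\n" +
            "// regen-token: " + Guid.NewGuid() + "\n";
    }

    public static void WriteSchemaFile(string path, string json)
    {
        File.WriteAllText(path, BuildFileContents(json));
    }
}
```

Since the token line differs on every write, Unity sees the file as changed even when the schema itself did not.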

I really hope this gets better in the future and with CoreCLR. It’s so unbelievably hacky, but it works reliably.

This is madness :see_no_evil: How’s the performance? I suppose the source generator output is cached and shouldn’t impact IDE/compilation performance?

Really glad to see this asset released, as it fills an important hole in the ecosystem. There are already some saving systems out there, but this one looks the most feature-complete and convenient to use. I’ll give it a try over the next couple of weeks.

I have tested with 100k LocalTransform/LocalToWorld components, which took around 2.5 ms for serialization, if I remember correctly. I had to make some tough decisions between prioritizing performance or stability, and I chose stability.

The source generator has some low-hanging fruit in packing more data into one memcpy, but it turned out this is much harder to solve reliably when tricky edge cases are involved, so I had to postpone it for a later update.

The saving system’s performance really profits from the great memory layout, so in the end, writing to disk will be the most expensive call.

Regarding SG performance: I’ve not witnessed a single case where it slowed down the IDE or compilation, and it’s only triggered when data related to its assembly changes. If that doesn’t happen, it’s all cached.

Thanks, that’s what I wanted to achieve with it. Saving is a really dry and boring part of game dev, but outside of any prototype it’s totally essential and inevitable. And even in prototypes, it gets so tiring not being able to save something. ^^ I mainly started developing it for my ability system because I got so sick of dragging the same spell over and over again to my hotbar. I won’t get back the time spent dragging spells or developing the saving system, but I hope I can save someone else a lot of time. :smiley:


In my case, the save schema is authored via attributes within the C# code, so there’s no Scriptable Objects involved. The idea is that you annotate your components with attributes like:

public struct MoneyComponent {
    [Save]
    public int Coins;

    // Other non-saved fields..
    public float TimeLastGotMoney;
}

and the source generator collects all of these into a set of schema definitions, as well as a set of ECS systems that pull the correct data and wrap it up in the schema, ready for export to JSON.
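The collection step can be illustrated reflection-style for brevity. To be clear, this is not the poster's actual implementation (that is a compile-time source generator walking syntax/symbols); `SaveAttribute` and `SchemaCollector` are hypothetical names:

```csharp
using System;
using System.Linq;
using System.Reflection;

[AttributeUsage(AttributeTargets.Field)]
public class SaveAttribute : Attribute { }

public struct MoneyComponent
{
    [Save] public int Coins;
    public float TimeLastGotMoney; // not saved
}

static class SchemaCollector
{
    // Returns the names of all fields marked [Save] on a component type.
    // A real source generator performs the same walk at compile time over
    // syntax trees and symbols instead of using runtime reflection.
    public static string[] SavedFields(Type componentType) =>
        componentType
            .GetFields(BindingFlags.Public | BindingFlags.Instance)
            .Where(f => f.GetCustomAttribute<SaveAttribute>() != null)
            .Select(f => f.Name)
            .ToArray();
}
```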

The reason I’m doing it this way is that it means I can save off a copy of the C# generated schema definitions for each version of the save data, so that it’s easy to load and represent differing versions in code (and write migrations down the road). I haven’t written that part yet, though.
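The versioning idea, in miniature. These types and the `Migrate` helper are invented for illustration and aren't from either system; the point is that freezing each version's generated schema as its own type makes forward migration a plain field-by-field mapping:

```csharp
// Frozen snapshot of the generated schema as it looked in save version 1.
public struct MoneySchemaV1
{
    public int Coins;
}

// Current schema: a field was added since V1.
public struct MoneySchemaV2
{
    public int Coins;
    public int Gems;
}

static class Migrations
{
    // Loading an old save means deserializing into the frozen V1 type,
    // then mapping it forward with explicit defaults for new fields.
    public static MoneySchemaV2 Migrate(MoneySchemaV1 old) =>
        new MoneySchemaV2 { Coins = old.Coins, Gems = 0 };
}
```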

Ah right. Were it not an asset, I’d also have stuck to this approach; a lot less surrounding tooling involved. However, since it is one, I had to deal with the problem of what happens when the component you want to save lives in a package or code that isn’t yours, like LocalTransform, or even worse, a package where the source code isn’t available. There wasn’t really a solution besides making it all data-driven, and a nice side effect was that, since you already have to build a save format with all the components you want to save, versioning and migration became more streamlined in the process.

Ah, that makes sense. In our case, the final save schema is still user-defined. It looks a little like:

public struct SaveData {
    public GeneratedSaveSchema Data;
    
    // Can manually embed inaccessible data here.
    public float3 PlayerPosition;
}

So that gives an escape hatch for external data, or data that simply needs a more complex mapping than the generators provide. For example, player-placed buildings are saved out this way so it can be stored in a more suitable format.

But the generated schema handles 90% of the basic state where you just want to quickly save some data without thinking too much.

The intermediate data makes sense. I also found that tying data 1:1 too tightly can end in nasty problems.

Out of interest, what does a “more suitable format” look like? Still an array, right?

I (maybe) had a similar problem, because the data I use for hotbars and abilities relies heavily on nested lists. My old MonoBehaviour version also used List, so that was grueling to port; I stuck with the approach and used NativeLists. The data itself wasn’t stored in entity chunks at first, as it’s mainly UI data where only the system that initiated an ability needs to read from it, so it could be stored in the system itself as private data. But that also meant intermediate data between it and the UI that was writing, so that wasn’t great at all.

To make this cleaner for saving, I introduced the concept of global entity objects: basically, for when you want to save a singleton that also has (nested) array data. It integrated quite nicely, and data that was quite hard to store turned into something really easy. It’s also great for settings and similar data.

The downside is that I can only support UnsafeList and not NativeList, as I can’t deal with the shifting offsets of the safety fields. Maybe one day :smiley:

The player-placed buildings are individual entities in the game; they aren’t in a single list that can be saved off, and there’s also the challenge that they are identified by the tile they are placed on (i.e. there’s no GUID identity for each building).

So there’s a collection step that happens to turn all these entities into a save structure that looks like:

"Tiles": {
    "Counter": {
      "TileGuid": "58a69b5a01f644a4e9306edd2d500aae",
      "Instances": [
        {
          "Position":[8,4],
          "Data": null
        },
        {
          "Position":[7,4],
          "Data": null
        },
        {
          "Position":[6,4],
          "Data": null
        },
        {
          "Position":[0,1],
          "Data": null
        }
      ]
    },
    "Trashcan": {
      "TileGuid": "9cac62dc8230345028193392710a8679",
      "Instances": [
        {
          "Position":[2,4],
          "Data": null
        }
      ]
    },
    "Plate Bin": {
      "TileGuid": "1e60231d39600491e85ff0564f595fc0",
      "Instances": [
        {
          "Position":[3,-1],
          "Data": null
        }
      ]
    },
    "Tomato Bin": {
      "TileGuid": "24871b3f3ae5c46c3b2a110c1708c08f",
      "Instances": [
        {
          "Position":[0,4],
          "Data": null
        }
      ]
    }
  },

The save format stores the buildings (i.e. ‘Tiles’) grouped into arrays by type. For example, there is the “Counter” building type, with a TileGuid (the GUID of the prefab to spawn when instantiating) and then a list of instance positions at which to spawn that tile type.

We store it this way so that the save data is easy to parse for debugging, modding, etc. It’s nice to be able to say “ah, there are 5 counters, a trashcan at this position”, and so on, rather than wading through a big list of unnamed GUIDs. The grouping by type lets you fold the types you don’t care about in a JSON editor like VSCode. You’ll also notice that we don’t need a unique GUID for each instance, because it’s implicit in the array under each type.

Some buildings have their own save data per-entity as well, which can be stored under the Data field. This is where the generated schema definitions can be used. If buildings have any components with associated save data schemas, they will be stuffed into that Data field automatically.

One downside is that the generated schema system doesn’t support DynamicBuffers. So anything more complex like an inventory is still manually mapped into the save data.

How does your Global Entity Objects system store arrays? It looks like it’s saving only individual components? Maybe I don’t understand the tutorial, haha. :sweat_smile:

Oh, the tutorial just shows how to create such a global entity object from code.

The data itself is defined in the save format SO and looks like this in the editor:

The workflow is just adding the InterfaceSaveGame struct and marking what you want to save. Well, more like unticking what you don’t want to save, as fields are saved by default.

The struct looks like this:

public unsafe struct InterfaceSaveGame : IComponentData, ISavableObject
{
    private UnsafeList<SavedHotbarModel>* hotbars;
    private UnsafeList<ClassSpecificHotbarButtons>* classSpecificHotbarButtons;
    private UnsafeList<HotbarStanceData>* hotbarStanceData;

    // …
}

The interface system sets it up with:

var updateMethod = BurstCompiler.CompileFunctionPointer<OnUpdateSaveState>(SaveInterface);

var register = SystemAPI.GetSingleton<RegisterSaveObjectsSingleton>();
var (entity, tmpQuery) = register.CreateAndRegisterGlobalEntity<InterfaceSaveGame>(ref state, "interface", updateMethod);

SaveFileSystem.CreateIndividualLoadRequest(ref state, entity, "interface");

It creates and registers the entity with the singleton component data, loads from the “interface” save file, and also registers an update call. The update call is needed because InterfaceSaveGame is only used as intermediate data; the UI and the game itself don’t use or write to it, very similar to your intermediate data format.

The update method is called before every save and copies some data, essentially updating the (outdated) InterfaceSaveGame to its newest data, which is then serialized and saved.

[BurstCompile]
public static void SaveInterface(ref SystemState state)
{
    var systemData = state.EntityManager.GetSingleton<Singleton>();
    var interfaceSaveGame = state.EntityManager.GetSingleton<InterfaceSaveGame>();
    var hotbarSingleton = state.EntityManager.GetSingleton<HotbarDataSingleton>();

    ref var list = ref interfaceSaveGame.GetOrCreateClassSpecificButtons(systemData.CurrentClassType);

    list.Clear();

    foreach (var hotbar in hotbarSingleton.Hotbars)
    {
        for (var i = 0; i < hotbar.Model.HotbarButtons.Length; i++)
        {
            var hotbarButton = hotbar.Model.HotbarButtons[i];
            list.Add(new SavedHotbarButtonModel()
            {
                CharacterId = hotbarButton.Model.CharacterId,
                HotbarId = hotbar.Model.Model.HotbarId,
                SpellId = hotbarButton.Model.SpellId,
                IndexInHotbar = i
            });
        }
    }
}

Cool! I haven’t yet messed with Burst-compiled function pointers. It looks like a clever way to get some form of indirection.

And the fact that you’ve gotten source to generate reliably off a ScriptableObject is wild. Actually, how are you reading the SO file within the source generator? There are no Unity libraries there, so are you parsing it manually?

I outlined it in my previous post:

I need to write the SO into a json file (with .cs extension) that is referenced in a csc.rsp file so the source generator picks it up. On changes I also need to add a random string (I use a random GUID) to trick Unity into reloading the SG.

What I didn’t mention: the JSON data is written into the .cs file as a comment. The SG then parses the comment, extracts the JSON, and deserializes the data. I’m using the Newtonsoft.Json library. It was a total pain to add. :smiley:
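On the generator side, pulling the JSON back out of the comment is essentially string surgery. A minimal sketch under assumed conventions; the marker lines and `ExtractJson` are made up for illustration, not the asset's actual format:

```csharp
using System;

static class SchemaCommentParser
{
    // Extracts the JSON payload embedded as line comments between two
    // marker lines in a .cs additional file, stripping the leading "// ".
    public static string ExtractJson(string fileText)
    {
        const string begin = "// SAVE_FORMAT_JSON_BEGIN";
        const string end = "// SAVE_FORMAT_JSON_END";

        int start = fileText.IndexOf(begin, StringComparison.Ordinal);
        int stop = fileText.IndexOf(end, StringComparison.Ordinal);
        if (start < 0 || stop < 0 || stop < start)
            throw new FormatException("schema markers not found");

        string body = fileText.Substring(start + begin.Length, stop - start - begin.Length);
        var sb = new System.Text.StringBuilder();
        foreach (var line in body.Split('\n'))
        {
            var trimmed = line.TrimStart();
            if (trimmed.StartsWith("// ", StringComparison.Ordinal))
                sb.AppendLine(trimmed.Substring(3)); // drop the comment prefix
        }
        return sb.ToString().TrimEnd();
    }
}
```

The recovered string can then be fed to any JSON deserializer (Newtonsoft.Json in the poster's case) inside the generator.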

I guess I was more wondering how you parse the YAML ScriptableObject into JSON. Ah, or do you do that outside the source generator?

Yeah, the JSON is written out from the editor. I was planning to directly read the source SO asset, but the csc.rsp additional-file mechanism only supports .cs files reliably right now. That’s why I mentioned the roundabout way of putting the JSON data in comments in a .cs file. You can’t even add a .json file; it only works in rare cases otherwise. ^^


Hi.
I’m interested in this asset, but I’m afraid to purchase it at full price. I already bought two other similar assets that proved to be a straight Shift+Delete. I don’t understand why Unity doesn’t allow refunds, given that they don’t have any kind of requirement for assets to be kept updated…

Anyway, can you tell me when the asset will next be on sale, so I can see if it’s worth waiting for?

Thank you

I can’t tell; that’s for the Asset Store team to decide. Considering there was a sale a few weeks ago, I doubt there will be another one in the near future.

You mentioned two other (similar) assets. Are you sure those were specifically for DOTS? I can only find one myself.

Quite the coincidence, and good news: I just got an invitation for another 50% sale running from around January 31st to February 14th.

Feel free to add TPC for DOTS as well. I picked this one up in the last sale. One less thing I have to re-implement using DOTS.