Handling change in Class variables with saving ScriptableObject

I’ve been experimenting with saving ScriptableObjects using binary serialization. Here’s the code that loads from file:

using System.IO;

public static T ReadFromBinaryFile<T>(string filePath)
{
    using (Stream stream = File.Open(filePath, FileMode.Open))
    {
        var binaryFormatter = new System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
        return (T)binaryFormatter.Deserialize(stream);
    }
}

Then using this class (short version):

using UnityEngine;
public class Item : ScriptableObject 
{
    [SerializeField]
    int value;
}

My apprehension about using this for things like my item database is that I don’t see how I could add a variable to my classes and still be able to load the old data and save it with the new fields. When I change the class to add, for example, a float durability;, it no longer matches the saved file, and so far that has simply wiped my database. Is there a standard way to solve this?

So far I’ve tried adding a version number to each save and comparing the different class versions, but then I end up writing extensive ‘conversion scripts’ that copy the data from old class versions into new ones, and it also means I need to keep copies of my old classes around, etc. I’d really prefer not to.

If you are using ScriptableObjects to define your items, then you should simply save them as assets in your project, and your database should be another ScriptableObject asset that serializes references to them. (You can also use AssetDatabase.AddObjectToAsset() to make the items children of the database asset, if that makes them easier to manage.)
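That parenting step might look something like the editor-only sketch below (the ItemDatabaseEditorUtil class and method names are illustrative, and it assumes the Item and ItemDatabase classes shown further down):

```csharp
// Editor-only sketch: creates a new Item and stores it inside the
// database's .asset file rather than as a separate asset.
using UnityEditor;
using UnityEngine;

public static class ItemDatabaseEditorUtil
{
    public static Item AddItemToDatabase(ItemDatabase database, string itemName)
    {
        var item = ScriptableObject.CreateInstance<Item>();
        item.name = itemName;

        // Parent the item under the database asset so it travels with it.
        AssetDatabase.AddObjectToAsset(item, database);
        AssetDatabase.SaveAssets();
        return item;
    }
}
```

You’d still need to register the new item in the database’s list yourself (e.g. via a serialized list or an editor method), since that part depends on how you expose the list.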

It would look something like this (note you can of course just make your serialized fields public if you want to go that route and trust all consumers of your class):

// in Item.cs
using UnityEngine;

[CreateAssetMenu]
public class Item : ScriptableObject {
    public int Value { get { return m_Value; } set { m_Value = value; } }
    [SerializeField]
    private int m_Value;

    public float Weight { get { return m_Weight; } set { m_Weight = value; } }
    [SerializeField]
    private float m_Weight;

    // and so on...
}

// in ItemDatabase.cs
using System.Collections.Generic;
using UnityEngine;

[CreateAssetMenu]
public class ItemDatabase : ScriptableObject {
    [SerializeField]
    private List<Item> m_Items = new List<Item>();

    public int Count { get { return m_Items.Count; } }
    public Item this[int index] { get { return m_Items[index]; } }

    // and so on...
}

The whole point of ScriptableObjects is basically to allow you to serialize and save persistent, arbitrary data in your project, so the extra step of dumping the data into another format—whether binary or JSON—is unnecessary. Another benefit of this approach is that you can select multiple ScriptableObjects at once in the Editor and edit them at the same time. As I noted in my comment above, adding a new field then simply becomes a matter of applying a default value in the field initializer, and you can then quickly edit several items together when you need to apply different values to them.

JsonUtility is just a serializer that works nicely for converting a dataset into a big string; you then use a file writer to send that string to a file. Your data structure (the item database) should be entirely separate from that process. File IO and object manipulation shouldn’t live in the same class in all but the tiniest projects.

If you use JsonUtility, then growing the structure over time isn’t even a thing to worry about. The API does all the heavy lifting for you, and you can just focus on your data structure doing what you want it to do.

This is the entire data structure I use for a full application that is saved to file by the user. It allows drawing lines with various colours and thicknesses, making user notes on the drawing, placing any number of 3D objects in 3D space, and a few other things. That all gets churned into JSON for saving, but it can be a 10,000-item data structure without any effort or slowdown in processing.

using System;
using System.Collections.Generic;
using UnityEngine;

[Serializable]
public class DataStructures
{
    [Serializable]
    public class DataSet
    {
        public List<Item> items = new List<Item>();
        //public List<Item> lines = new List<Item>();
        //public List<Item> nodeConnections = new List<Item>();

        public DataSet()
        {
        }
    }

    [Serializable]
    public class BasicStruct
    {
        public string id;
    }

    [Serializable]
    public class Item : BasicStruct
    {
        public string prefabReference;
        public string value;
        public Vector3 position;
        public Quaternion rotation;
        public Vector3 point1;
        public Vector3 point2;
        public Color color;
        public float width;
        public List<string> links = new List<string>();

        public Item()
        {
            links = new List<string>();
            position = Vector3.zero;
            rotation = Quaternion.Euler(Vector3.up);
        }
    }
}

And then I use this to write to a file:

string jsonFile = JsonUtility.ToJson(dataset);

using (StreamWriter writer = new StreamWriter(filepath, false))
{
    writer.Write(jsonFile);
}