Hey all,
I’m having a very peculiar issue with deserialization of MonoBehaviours/ScriptableObjects loaded from asset bundles.
I’m modding a game (i.e. I don’t have the game sources) by loading asset bundles that are built from a separate Unity project. The objects reference scripts that are compiled separately in Visual Studio into a dll, which is referenced by both the Unity project and the game. Generally everything works fine, unless a script has a serializable field of a type that is defined in the same dll.
For simplicity, let’s say I have a “Scripts.dll” with the following 2 classes in it:
using System;
using UnityEngine;

[CreateAssetMenu(menuName = "ScriptableObjectTest")]
public class ScriptableObjectTest : ScriptableObject, ISerializationCallbackReceiver
{
    public string Name;
    public int Value;
    public Vector3 Vector;
    public TestInternalClass InternalClass = new TestInternalClass();

    public void OnBeforeSerialize()
    {
        MonoBehaviour.print("OnBeforeSerialize: ScriptableObjectTest");
    }

    public void OnAfterDeserialize()
    {
        MonoBehaviour.print("OnAfterDeserialize: ScriptableObjectTest");
    }
}

[Serializable]
public class TestInternalClass : ISerializationCallbackReceiver
{
    public string Name;
    public int Value;
    public Vector3 Vector;

    public void OnBeforeSerialize()
    {
        MonoBehaviour.print("OnBeforeSerialize: TestInternalClass");
    }

    public void OnAfterDeserialize()
    {
        MonoBehaviour.print("OnAfterDeserialize: TestInternalClass");
    }
}
In the Unity project I add a ScriptableObjectTest asset and fill in some random data in the inspector.
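The bundle itself is then built from that project with a standard editor build script, roughly along these lines (the output folder, menu path and target platform here are just illustrative; as mentioned at the end, I’ve tried several build option combinations):

using UnityEditor;

public static class TestBundleBuilder
{
    // Illustrative only: the ScriptableObjectTest asset has its AssetBundle name
    // assigned in the inspector; this builds every named bundle into the given
    // output folder (which must already exist). Windows standalone is shown
    // purely as an example target.
    [MenuItem("Assets/Build Test AssetBundles")]
    public static void Build()
    {
        BuildPipeline.BuildAssetBundles(
            "AssetBundles",
            BuildAssetBundleOptions.None,
            BuildTarget.StandaloneWindows64);
    }
}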
Then, for simplicity, let’s say I have a startup scene in the game which has a single object with the following script:
using UnityEngine;

public class TestScript : MonoBehaviour
{
    private void Awake()
    {
        AssetBundle assetBundle = AssetBundle.LoadFromFile(@"path\to\the\asset\bundle");
        ScriptableObjectTest test = assetBundle.LoadAsset<ScriptableObjectTest>("ScriptableObjectTest");
        print(test.ToString());
    }
}
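(Since ToString() on a UnityEngine.Object only gives the object’s name and type, to actually check the field values in the log I dump them explicitly, with something along these lines right after the load:)

print(test.Name + " | " + test.Value + " | " + test.Vector);
print(test.InternalClass.Name + " | " + test.InternalClass.Value + " | " + test.InternalClass.Vector);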
If I run such a scene from the editor, or build a player and run the executable, I get the correct output in the log (all of the fields have the values I assigned in the inspector). It even works if I create a brand-new Unity project and load the asset from there.
But whenever I run it from the modded game, only the parent class and its immediate fields of primitive or built-in Unity types (string, int, Vector3) are deserialized correctly, while InternalClass always ends up with its default values. For testing purposes I added the ISerializationCallbackReceiver interface to both classes, and whenever the asset is loaded from this game, TestInternalClass never even gets a deserialization pass (i.e. its OnAfterDeserialize() method is never called).
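(For reference, a minimal diagnostic sketch of the kind of check that could show which loaded copy of Scripts.dll the types resolve from at runtime in each environment; the names are just the ones from the example above:)

// Diagnostic sketch: which assembly do the serialized types actually resolve to,
// and is more than one copy of "Scripts" loaded in the domain?
print(typeof(TestInternalClass).Assembly.FullName + " @ " + typeof(TestInternalClass).Assembly.Location);
foreach (System.Reflection.Assembly asm in System.AppDomain.CurrentDomain.GetAssemblies())
{
    if (asm.GetName().Name == "Scripts")
        print(asm.FullName + " @ " + asm.Location);
}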
I have tried matching the exact Unity version of the game (it runs on 5.5.4p1) and various AssetBundle build options, but nothing seems to fix it, and I’m out of ideas as to what might be causing this.