Hi,
another customer reported this issue to us two weeks ago. The fix is being worked on, but it is not trivial and will definitely not be backported to 5.3.
The reason this bug happens is as follows. Let's say you have a class like this:
class MyClass : MonoBehaviour
{
#if UNITY_EDITOR
    public int myInt1 = 16;
#endif
    public string myString2 = "Hello";
}
In the editor, the serialized data for this class would be:
0x10, 0x00, 0x00, 0x00, 0x06, 0x00, 0x00, 0x00, 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x00
However, for the player build, it should actually be:
0x06, 0x00, 0x00, 0x00, 0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x00
There is a bug in our asset bundle build pipeline: it looks at the wrong set of fields, so it serializes the wrong data. It uses the fields that exist when you run the game in play mode in the editor, rather than the fields that would exist in the built player.
At runtime, the player first reads the string length, so it reads 16, and then tries to read the string from the following 16 bytes. However, there aren't actually that many bytes left, so it crashes. The reason this doesn't happen on other platforms is that the .NET scripting backend (which is the default on UWP) depends on the data being correct and does no validation of whether it is deserializing the correct field.
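To make the mismatch concrete, here is a minimal Python sketch of a length-prefixed string read like the one described above (the format and names are illustrative, not Unity's actual deserializer):

```python
import struct

def read_string(data: bytes, offset: int = 0) -> str:
    # Read a 4-byte little-endian length prefix, then that many payload bytes.
    (length,) = struct.unpack_from("<i", data, offset)
    payload = data[offset + 4 : offset + 4 + length]
    if len(payload) < length:
        # Real deserializers without validation would read past the end here.
        raise ValueError(f"need {length} bytes, only {len(payload)} left")
    return payload.rstrip(b"\x00").decode("ascii")

# Data the player expects: just myString2.
player_data = bytes([0x06, 0x00, 0x00, 0x00,
                     0x48, 0x65, 0x6C, 0x6C, 0x6F, 0x00])
print(read_string(player_data))  # prints: Hello

# Data the buggy pipeline writes: myInt1 (16) followed by myString2.
# Expecting a string first, the reader treats 16 as a string length
# and runs off the end of the buffer.
editor_data = bytes([0x10, 0x00, 0x00, 0x00]) + player_data
try:
    read_string(editor_data)
except ValueError as e:
    print("crash:", e)
```

The sketch validates the length and raises; a backend that skips that check simply reads out of bounds, which is the crash you are seeing.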
Unfortunately, there is no easy way to find the culprit without debugging the crash. The best approach I can suggest is to open the scripts' Visual Studio solution, press Ctrl+Shift+F, search the whole solution for "#if UNITY_", and then go through all the instances it finds, removing the guard (or adding "|| ENABLE_DOTNET") wherever it surrounds serializable fields.
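If you prefer the command line, the same search can be done with grep (the Assets/ path and the MyClass.cs file below are just a demo setup, not your project):

```shell
# Demo setup: a script with an editor-only serialized field.
mkdir -p Assets
cat > Assets/MyClass.cs <<'EOF'
#if UNITY_EDITOR
public int myInt1 = 16;
#endif
EOF

# The actual search: list every UNITY_* preprocessor guard,
# with file name and line number, across all C# scripts.
grep -rn "#if UNITY_" --include="*.cs" Assets/
```

Each hit then needs to be checked by hand to see whether it guards a serializable field.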