I’m attempting to use SevenZip for LZMA compression in a Unity 5 game I’m working on. It works great so far, but when I compile the game for WebGL (HTML5), I keep getting a Data Error Exception at runtime. Initially I thought there might be something wrong with the data file itself, so I made a really small test project and reviewed the contents of the file, the arrangement of the bytes, and so on. That review showed the file itself is not at fault.
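For context, the decode path follows the stock LZMA SDK pattern, roughly like the sketch below. This is a simplified illustration rather than the exact code from the sample project; it assumes the standard SevenZip.Compression.LZMA.Decoder API and the usual header layout of 5 property bytes followed by an 8-byte uncompressed size.

```csharp
using System.IO;
using SevenZip.Compression.LZMA;

public static class LzmaUtil
{
    // Sketch of the standard LZMA SDK decode pattern:
    // [5-byte coder properties][8-byte little-endian uncompressed size][compressed payload]
    public static byte[] Decompress(byte[] input)
    {
        using (var inStream = new MemoryStream(input))
        using (var outStream = new MemoryStream())
        {
            var decoder = new Decoder();

            // Read the 5-byte coder properties written by the encoder.
            byte[] properties = new byte[5];
            inStream.Read(properties, 0, 5);
            decoder.SetDecoderProperties(properties);

            // Read the 8-byte little-endian uncompressed size.
            long outSize = 0;
            for (int i = 0; i < 8; i++)
                outSize |= (long)(byte)inStream.ReadByte() << (8 * i);

            long inSize = inStream.Length - inStream.Position;

            // The Data Error Exception is thrown from inside this call in the WebGL build.
            decoder.Code(inStream, outStream, inSize, outSize, null);
            return outStream.ToArray();
        }
    }
}
```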
When Unity creates a WebGL build, it cross-compiles just about everything to asm.js. I suspect there is something in the code that the converter did not expect, or that it emitted in the wrong logical order. I’ll eventually figure out what’s wrong and how to fix it, but I would absolutely love to get some extra eyes on this issue.
Attached is the sample project in question. If you run the project in the Unity editor (you may have to load the Test scene first) and look at the console output, the test results appear as expected; they should all pass. If you compile the project to WebGL and run that build in a browser, you’ll see the Data Error Exception printed when the bytes are supposed to be decompressed (directly after the debug output and right before the tests).
LZMA Compression Sample - Unity 5 Project
Alternatively, if I uncheck the box on the Main Camera object labeled “Test Compressed Resource File”, the uncompressed version of the file works without issue in all cases. The problem therefore seems to lie with the SevenZip decoder and its compatibility with Unity’s asm.js output.
NOTE: I’m using LogError so the console output can be seen in the HTML5 build without having to dig for it.