Content-Encoding: gzip, but console says otherwise

Hello Unity Forums, I tried uploading my gzipped WebGL build, and it gives me the errors below, even though I copied the code from the section “Server configuration for compressed WebGL builds without decompression fallback (IIS)” into my “/Build/web.config” file from this source: Unity - Manual: WebGL: Server configuration code samples.

This is despite the fact that the response headers for Build/Checkpoint.framework.js.gz include Content-Encoding: gzip and Content-Type: application/javascript.

The file is hosted here:
https://as-unitygame.azurewebsites.net/Build/Checkpoint.framework.js.gz

WEBGL_debug_renderer_info is deprecated in Firefox and will be removed. Please use RENDERER. Checkpoint.loader.js:1:9894
Uncaught SyntaxError: illegal character U+001F
Checkpoint.framework.js.gz:1
Unable to parse Build/Checkpoint.framework.js.gz! This can happen if build compression was enabled but web server hosting the content was misconfigured to not serve the file with HTTP Response Header "Content-Encoding: gzip" present. Check browser Console and Devtools Network tab to debug. Checkpoint.loader.js:1:162
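
For anyone debugging the same thing, fetching just the response headers outside the browser shows what the server is actually sending. A minimal Python sketch (standard library only, pointed at the build URL above):

```python
# Minimal sketch: send a HEAD request for the hosted .gz file and print
# the headers the server actually returns.
import urllib.request

URL = "https://as-unitygame.azurewebsites.net/Build/Checkpoint.framework.js.gz"

request = urllib.request.Request(URL, method="HEAD")
with urllib.request.urlopen(request) as response:
    print("Content-Encoding:", response.headers.get("Content-Encoding"))
    print("Content-Type:", response.headers.get("Content-Type"))
```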

Any input would be great because I can’t think of a solution.


Did you check the browser Console as the error message says?
Most likely it is what it says: gzip not enabled on the server.

Thank you for the input, but it says “misconfigured to not serve the file with HTTP Response Header ‘Content-Encoding: gzip’ present”, not that gzip doesn’t work at all. And wouldn’t there be errors for the data and wasm files too if so, since they’re also gzipped?

The server still needs to send that content encoding in the response header; until then, it isn’t fully configured to support gzip. :wink:

[Screenshot attachment: upload_2022-10-25_19-26-32.png]

But it does say the content-encoding is gzip?


I ended up resolving the issue. Because I was testing with and without gzip, I had made several builds, and so had files both with gzip (Build/Checkpoint.framework.js.gz) and without it (Build/Checkpoint.framework.js), which was causing a conflict. Once I deleted the one without .gz, it worked properly. So if anyone else is having this issue, make sure you keep only the files from the last build, whether it be Brotli, gzip, or uncompressed, because otherwise you will likely face issues.
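
If you want to sanity-check a Build folder for this kind of leftover, here’s a small Python sketch (the folder path is an assumption; adjust it to your own output):

```python
# Sketch: list files in a Unity WebGL Build/ folder that exist both
# compressed (.gz/.br) and uncompressed -- leftovers from mixed builds.
from pathlib import Path

build_dir = Path("Build")  # adjust to your own build output folder

for compressed in sorted(build_dir.glob("*.gz")) + sorted(build_dir.glob("*.br")):
    uncompressed = compressed.with_suffix("")  # e.g. foo.framework.js.gz -> foo.framework.js
    if uncompressed.exists():
        print(f"conflict: {compressed.name} and {uncompressed.name} are both present")
```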

Thank you @CodeSmile for your input, even though it didn’t end up fixing the issue haha. I’m sure it’ll be useful for someone else.


Wondering why Unity is even able to export builds as gzip-compressed if practically zero hosting servers support it by default? How are you supposed to get an Amazon S3 host to support gzip so you can use a build made with gzip compression? And why is none of this explained anywhere? Everyone just says to “build it uncompressed” without explaining why the gzip method doesn’t work.

In the Unity docs there’s a whole page about ensuring you configure your server correctly.

What you see on the forums is Unity devs with zero backend web experience building for WebGL and complaining. It is not representative of the devs who actually know what it means to export to web and have done their due diligence.

This is a Unity forum, not a web server configuration Stack Overflow support group. That’s why so little support is given on that front; it’s not the right place for it.

In a way I disagree that there shouldn’t be a web server config section… because Unity seems to want something nothing else does, and it doesn’t seem natural. Most servers sending a .gz file mark it as gzip, but no, Unity wants you to claim it’s JavaScript but in gzip format… why? It’s just making things complicated.

So, I have a site I put my playable games on, and almost every version of Unity seems to need something tweaked to make it work. So here’s me: I uploaded a 2D game I’m helping a friend with. A 3D one in the same version worked fine on my PC, but this 2D one, which can and should work just fine on my iPhone? Nope, it didn’t even work on the PC, even though it was also set to gzip compression like the rest. So I tried without compression; nope, because then it couldn’t get the wasm file, sync or async, even though it’s there. So back to gzip. I poked and poked and finally got it working, but it does make me wonder why Unity needs such annoyingly specific settings, why it doesn’t feel consistent, and, as @firestorm185 pointed out, why zero web servers out of the box seem able to run these builds without being reconfigured… and not everyone wants to become an Apache genius, etc.

As for my app, I got it working in Safari, but it doesn’t work in Chrome; it says something about recursion. I also don’t want to recompile the other stuff just so they’re all on the same version of Unity. Some of those games are 2-3 years old and don’t need touching; they may be crap, but they are what they are, and I don’t want to redo the whole site each time just because something else has changed…

@bugfinders Have you tried publishing your game to Unity Play using the WebGL Publisher asset from the Asset Store? That’s the most straightforward way to test whether an issue is due to web server config or just a bug within the app or a specific browser.

Um, nope. To be honest, I hadn’t considered using their services, as they often start incurring charges and the like; I hadn’t seen how to use it (I hadn’t looked, to be fair).

One of the primary performance challenges with WebGL projects is reducing the size of the generated build output, which can be large and must be downloaded over a network connection. Multiple studies directly attribute lack of user engagement to long download times, so it’s imperative that this is optimized, and file compression (e.g. gzip or Brotli) is one of the most effective methods for achieving it. I agree that it’s suboptimal in terms of developer experience, because it requires server configuration, which is annoying if you’re not familiar with such technologies.

I appreciate that it’s recognised as a hard choice, but the settings it wants at the moment seem a little unusual…

Our project’s heavy files are stored on a CDN, and the CDN won’t change any headers. If the file is compressed, it is sent with a gzip header, and in the case of a .js file it gets automatically blocked by the CORB policy. Normally Apache, Nginx, HAProxy, and other web servers handle this out of the box, without any problem with CORB or with reading .js files on the client.
Unity does not have to compress .js files. There are already working solutions that have been used for years, but Unity comes with its own reinvented wheel that doesn’t even work as it should. The only solution for now is to turn off compression.


Update.
One of our teammates contacted another company. Apparently that company has been dealing with this unwelcome behaviour since Unity 2020/2021. They did the same as we did: they switched off compression for all their projects.

I don’t know how you verify this kind of change against other teams’ setups, but if you don’t, I suppose you need to start testing it somehow, because several years of behaviour that the community has to handle by itself doesn’t look like something good.

All these years, until now, we have been using Unity 2019.4. We finally decided to migrate to a new LTS version, and we didn’t expect to be stuck on this kind of issue.


I don’t understand your challenges, @GreenTVlad_1 and @bugfinders. We use Azure, CDNs, storage blobs and dynamic servers. Full DevOps, automatic deployment, the whole thing. Integration in React, NextJS and the latest Unity LTS.

No issues whatsoever serving Brotli compression through all of it. Just get it done; find yourself someone who knows cloud if you want to deliver through cloud.

Sorry for being a bit harsh, but if you are expecting to get cloud/server config support, I don’t think this is the place. You may be lucky and find someone to hand-hold you, but it shouldn’t be your expectation.

No, my point is that it is not standard. If it were standard, it wouldn’t need to be configured specifically… if the JS is compressed and named blah.js.gz, it would normally just be served as a gzip file, but that’s not it: they need the server to announce it’s gzipped when the file name says it should be plain text…

As a result, it can be annoying to fit it in with other things. I’m more than capable, but for example, I made a 2D project, despite already having 2D and 3D projects on my site, uploaded it, and it moaned like hell when for no obvious reason it should have been fine. Yes, it’s working now, so clearly I sorted it, but it was an uphill struggle that shouldn’t have been needed.


My point is that there is NO reason to compress .js files, because web servers can do it out of the box. All the other files, OK, it’s a good idea. But what is the purpose of this approach? We can easily do .js compression (gzip, Brotli, even switching between them for different clients on the fly :smile:) on the web server’s side, and we get the same result without any crutches.


The reason why Unity does a seemingly nonstandard thing is that WebAssembly-based content is a unique hosting case compared to other types of web sites.

Traditionally, web sites would have small html/js/image files, which would be configured to utilize an on-demand gzip compressor cache on the web server. Web browsers are supposed to cache these.

Also traditionally, web sites might have large .zip/.tar.gz/etc. files that would be downloadable applications intended for the user to download and uncompress when they wish, e.g. a program installer or a large packaged bundle of documents. The intention is that web browsers should not cache or uncompress these, since the user would typically download such a file just once and want the file they get to stay compressed on disk with its original data bits (e.g. for SHA-1 checksum verification).

My understanding is that the above two use cases are what was referred to as the “industry standard” in an earlier post.

However, WebAssembly and Unity data content do not fall into either of the above two categories. But the scheme that Unity employs is just as much an industry standard: the precompressed configuration is exactly what was intended by the web specifications that provide the Content-Encoding and Content-Type headers.

Using a web server’s on-demand gzip compressor cache is known to be prohibitive, since these on-demand compressors need to be fast, so they only have the luxury of using fast streaming compression settings, whereas the offline compression performed by Unity at project build time can be of the “max compression” (level 9 quality setting) sort. Utilizing an on-demand compressor cache for data files larger than 10 MB would place an unreasonable CPU burden on web servers, which typically don’t have the fastest CPUs available. Also, on-demand compression comes with max-age and max-size settings that require tuning by the user, so leaning on such solutions is not automatic either.
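
To make that trade-off concrete, here is a small Python sketch comparing gzip’s fastest setting with the level-9 setting (the input path is hypothetical; point it at any large build file):

```python
# Sketch: compare gzip level 1 (what a fast on-demand compressor favours)
# with level 9 (what offline precompression at build time can afford).
import gzip
import time

with open("Build/Checkpoint.framework.js", "rb") as f:  # hypothetical path
    data = f.read()

for level in (1, 9):
    start = time.perf_counter()
    size = len(gzip.compress(data, compresslevel=level))
    elapsed = time.perf_counter() - start
    print(f"level {level}: {size:,} bytes in {elapsed:.3f} s")
```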

However, if you so wish, you can fully utilize the on-demand compression caching scheme by just building an Uncompressed build from Unity! That gives you the data in a format ready to be consumed by such an on-demand compressing server. There is nothing preventing this use case.

To get the best possible performance for users who do not want to rely on suboptimal on-demand compression caching, Unity provides the preferred option to precompress the content on disk.

This solution is different from the solution that web servers traditionally employ to serve static downloadable zip files (such as a Cinebench.zip, for example), since the intention is that the browser should transparently decompress the Unity data files, unlike with precompressed binary zips.

Also, unlike downloadable binary zips, the Unity data files are desired to be cached, since players often repeatedly visit a game site to play.

This is why the best practice for serving Unity content is to precompress it on disk ahead of time (for maximum compression factor and zero CPU overhead on the server), while declaring to the browser that the content should be transparently decompressed when served.

The rationale for including .js files in this scheme as well is that we can then ask web hosts to configure only one type of scheme. If we mixed precompressed content and on-demand cached content, we would be asking web admins to configure both mechanisms on their web servers: they would still need to set up Content-Encoding headers, and in addition they would have to make sure their on-demand compression caches are working as well as possible.

It is simpler to just expect the web admins to have to set up one or the other, and not always both.

Using Unity precompressed files? → set up Content-Encoding and Content-Type headers
Using uncompressed on-demand caching? → set up your on-demand caching subsystem
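
As an illustration of the first scheme, a minimal local test server can be sketched in Python. This is just a sketch for local testing, not production config, and the suffix-to-content-type mapping is an assumption based on typical Unity WebGL build output:

```python
# Minimal sketch of the "precompressed" scheme: serve Unity's *.gz build
# files as-is, but tell the browser to transparently decompress them.
from http.server import HTTPServer, SimpleHTTPRequestHandler

class UnityGzipHandler(SimpleHTTPRequestHandler):
    # Assumed mapping based on the files a Unity WebGL build produces.
    CONTENT_TYPES = {
        ".js.gz": "application/javascript",
        ".wasm.gz": "application/wasm",
        ".data.gz": "application/octet-stream",
    }

    def end_headers(self):
        if self.path.endswith(".gz"):
            # This header makes the browser decompress the payload
            # before handing it to the Unity loader.
            self.send_header("Content-Encoding", "gzip")
        super().end_headers()

    def guess_type(self, path):
        for suffix, content_type in self.CONTENT_TYPES.items():
            if path.endswith(suffix):
                return content_type
        return super().guess_type(path)

if __name__ == "__main__":
    # Run from the folder containing Build/ and open http://localhost:8000
    HTTPServer(("", 8000), UnityGzipHandler).serve_forever()
```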


The file is hosted here: https://as-unitygame.azurewebsites.net/Build/Checkpoint.framework.js.gz

Looks like the build is working now nicely. Great!

@KarimAbdelHamid How did you solve the issue and keep GZIP compression?

Despite the web server configuration code samples and the other claims people allude to above, Azure Storage doesn’t seem to provide any such configuration options for the headers. The closest is to manually add “gzip” to the content-encoding field of every .gz file, and then I had to apply this additional workaround:

I decompressed the build.framework.js.gz file, uploaded that, and modified the HTML code so it would load the uncompressed version instead, e.g. /build.framework.js.

Now my game loads and executes instead of halting completely at 90% with console errors about gzip headers.
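
Incidentally, the per-file header editing can be scripted rather than done by hand in the portal. A sketch using the azure-storage-blob v12 SDK (the connection string and container name are placeholders):

```python
# Sketch: set Content-Encoding/Content-Type on every *.gz blob in a
# container so Azure Blob Storage serves them the way the Unity loader
# expects. Requires the azure-storage-blob package; names are placeholders.
from azure.storage.blob import ContainerClient, ContentSettings

container = ContainerClient.from_connection_string(
    conn_str="<your-storage-connection-string>",  # placeholder
    container_name="build",                       # hypothetical container
)

CONTENT_TYPES = {
    ".js.gz": "application/javascript",
    ".wasm.gz": "application/wasm",
    ".data.gz": "application/octet-stream",
}

for blob in container.list_blobs():
    for suffix, content_type in CONTENT_TYPES.items():
        if blob.name.endswith(suffix):
            container.get_blob_client(blob).set_http_headers(
                ContentSettings(content_encoding="gzip",
                                content_type=content_type)
            )
            print(f"updated {blob.name}")
            break
```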

So, @jukka_j, I think the above questions are still valid: why compress the framework JS file, which only saves a whopping 360 KB, if it won’t work and we have to resort to manually altering the build files? (Compressing the other files of course makes sense; I’m getting a reduction of 45 MB.)*

*Using 2021.3.32

However, using Brotli, I was able to make it work, and the files are even smaller.