The reason Unity does a seemingly nonstandard thing here is that WebAssembly-based content is a unique hosting case compared to other types of web sites.
Traditionally, web sites would have small html/js/image files, which would be configured to go through an on-demand gzip compressor cache on the web server. Web browsers are expected to cache these.
Also traditionally, web sites might have large .zip/.tar.gz/etc. files that are downloadable applications, intended for the user to download and uncompress when they wish, e.g. a program installer or a large packaged bundle of documents. The intention is that web browsers should not cache or uncompress these, since the user would typically download such a file just once, and wants the file they get to remain compressed on disk with its original data bits (e.g. for SHA1 checksum verification).
My understanding is that these two use cases are what was referred to as the “industry standard” in the earlier post.
However, WebAssembly and Unity data content do not fall into either of the above two categories. The scheme that Unity employs is nevertheless just as much an industry standard: the precompressed configuration is exactly what was intended by the web specifications that define the Content-Encoding and Content-Type headers.
Using a web server’s on-demand gzip compressor cache is known to be prohibitive, since these on-demand compressors need to be fast and can therefore only afford fast streaming compression settings, whereas the offline compression Unity performs at project build time can use the “max compression” (level 9) quality setting. Running an on-demand compressor cache over data files larger than 10MB would place an unreasonable CPU burden on the web servers, which typically don’t have the fastest CPUs available. On-demand compression also comes with max-age and max-size settings that require tuning from users, so leaning on such a solution is not automatic either.
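To make the tradeoff concrete, here is a minimal sketch of the offline precompression step in Python; it runs once at build/deploy time rather than per request. The file path is a placeholder, and actual Unity build output names depend on the Unity version.

```python
import gzip
import shutil

# Placeholder path - actual Unity build output names depend on the Unity version.
SOURCE = "Build/game.data"

# Offline precompression: this runs once at build/deploy time, so it can afford the
# slowest, highest-ratio setting (gzip level 9). An on-demand server-side compressor
# has to run per request (or per cache fill), so it typically sticks to a fast,
# low-ratio streaming level instead.
with open(SOURCE, "rb") as src, gzip.open(SOURCE + ".gz", "wb", compresslevel=9) as dst:
    shutil.copyfileobj(src, dst)
```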
However, if you so wish, you can fully utilize the on-demand compression caching scheme simply by making an Uncompressed build from Unity! That gives you the data in a format that is ready to be consumed by such an on-demand compressing server. There is nothing preventing this use case.
To get the best possible performance for users who do not want to rely on suboptimal on-demand compression caching, Unity provides the preferred option of precompressing the content on disk.
This solution is different from the one web servers traditionally employ to serve static downloadable zip files (such as the Cinebench.zip example), since the intention is that the browser should transparently decompress the Unity data files, unlike with precompressed binary zips.
Also, unlike downloadable binary zips, the Unity data files should be cached, since players often repeatedly visit a game site to play.
This is why the best practice for serving Unity content includes precompressing it on disk ahead of time (for maximum compression factor and zero CPU overhead on the server), while declaring to the browser, via the Content-Encoding header, that it should uncompress the content when it is served.
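As an illustration of that scheme (not how any particular production server is implemented), here is a minimal sketch using Python's built-in http.server: precompressed .gz files are sent byte-for-byte from disk, with Content-Encoding: gzip and a Content-Type describing the uncompressed payload, so the browser transparently decompresses them. The file suffixes and MIME types below are assumptions; in a real deployment you would configure the equivalent headers in your web server (nginx, Apache, IIS, etc.).

```python
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

# Assumed suffix -> Content-Type mapping; actual Unity build file names vary by version.
CONTENT_TYPES = {
    ".wasm.gz": "application/wasm",
    ".js.gz": "application/javascript",
    ".data.gz": "application/octet-stream",
}

class PrecompressedHandler(SimpleHTTPRequestHandler):
    def guess_type(self, path):
        # Report the type of the *uncompressed* content, not "application/gzip".
        for suffix, mime in CONTENT_TYPES.items():
            if path.endswith(suffix):
                return mime
        return super().guess_type(path)

    def end_headers(self):
        if any(self.path.endswith(suffix) for suffix in CONTENT_TYPES):
            # The bytes on the wire are gzip, but the browser should transparently
            # decompress them before handing them to the page - unlike a .zip download.
            self.send_header("Content-Encoding", "gzip")
        super().end_headers()

if __name__ == "__main__":
    # Serves the current directory; zero compression work happens per request.
    ThreadingHTTPServer(("", 8000), PrecompressedHandler).serve_forever()
```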
The rationale for also including the .js files in this scheme is that web hosts then only have to configure one type of scheme. If we mixed precompressed content and on-demand cached content, we would have to ask web admins to configure both mechanisms on their web servers: they’d still need to set up Content-Encoding headers, and in addition they’d have to make sure their on-demand compression caches are working as well as possible.
It is simpler to expect web admins to set up one or the other, and not always both:
Using Unity precompressed files? → set up Content-Encoding and Content-Type headers (see the quick check after this list)
Using uncompressed on-demand caching? → set up your on-demand caching subsystem
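For the first of those two setups, a quick sanity check is to request one of the precompressed build files and inspect the headers the server sends back. A minimal sketch, with a placeholder URL and file name:

```python
# Verify the precompressed setup: request one of the .gz build files and check that
# the server declares it as gzip-encoded. URL and file name are placeholders -
# substitute your own host and build output names.
from urllib.request import Request, urlopen

req = Request("https://example.com/Build/game.wasm.gz",
              headers={"Accept-Encoding": "gzip"})
with urlopen(req) as resp:
    print("Content-Encoding:", resp.headers.get("Content-Encoding"))  # expect "gzip"
    print("Content-Type:", resp.headers.get("Content-Type"))          # e.g. "application/wasm"
```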