2 years and "Use Crunch Compression" still doesn't work on lightmap files

Hello,

Maybe some of you have more information about this, but two years ago I made multiple reports explaining that lightmap files can’t receive crunch compression: Crunch compression not working on Lightmap file (2020.3.31f1)

Prior to 2020.3, WebGL used DXT compression for lightmap files. For a 1.3 MB lightmap file:

  • Select the lightmap file and enable “Use Crunch Compression” on the Default tab: the file is now 58.2 KB with no big change in quality. Everything worked as expected.

In editor versions 2021.2 and newer, lightmap textures use BC6H compression. If you perform the same actions:

  • Select the lightmap file and enable “Use Crunch Compression” on the Default tab: the file size stays at 1.3 MB, so the option has no effect.
  • If you go to the WebGL tab, enable “Override for WebGL”, and select RGB Crunched DXT5|BC3, the file is now 58.2 KB, but the lightmap becomes dark and unusable.

This video shows the issue:

I want to upgrade to Unity 2023.x.x, but this is a big issue for me: I have to rebake all my lightmaps, and because crunch no longer works, my lightmaps are 4–6 MB larger in each scene.

Thank you and have a good day.

Could I get you to open a bug report for this? It looks like a regression, and if it is, our bisection service can pinpoint exactly when it occurred, which will make this much faster to fix.

I did, a long time ago, and it was not fixed: https://issuetracker.unity3d.com/issues/webgl-dxt5-compression-produces-lower-quality-lightmap-results-when-compared-to-2019-dot-4-lts

Another link from me: https://issuetracker.unity3d.com/issues/crunch-compression-is-not-working-when-used-on-a-lightmap-texture (tracked as IN-29539)

This topic is surprisingly complicated so I understand if you are confused and/or frustrated. But I believe it works as intended (although our UX is suboptimal so I fully understand why you’d think otherwise).

Why does Unity pick BC6?
My guess is that Lightmap Encoding is set to “High Quality” in your project, at least for the WebGL build target. You can find this setting in Project Settings → Player → Per Platform → Other Settings. It tells Unity that you want lightmaps to be encoded (which is orthogonal to texture format and texture compression) as HDR floating-point values, as opposed to RGBM, where each channel is constrained to the [0, 1] range. Since the lightmap values are HDR, Unity chooses a texture (compression) format that is compatible with HDR. This explains why it chooses BC6 in your case. If you change Lightmap Encoding to Normal or Low, you will probably see that Unity chooses a different texture format, one better suited to that case. The reason Unity doesn’t choose DXT5 is that it is an LDR format and hence isn’t compatible with lightmap values encoded as HDR.
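If it helps, this setting can also be inspected and changed from an editor script. Here is a minimal sketch using the standard PlayerSettings API (editor-only code; the menu paths are made up for illustration):

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: read and change the lightmap encoding used for WebGL.
public static class LightmapEncodingTools
{
    [MenuItem("Tools/Log WebGL Lightmap Encoding")]
    static void LogEncoding()
    {
        // High resolves to HDR lightmaps, which is why Unity picks BC6H.
        var quality = PlayerSettings.GetLightmapEncodingQualityForPlatform(BuildTargetGroup.WebGL);
        Debug.Log($"WebGL lightmap encoding: {quality}");
    }

    [MenuItem("Tools/Set WebGL Lightmap Encoding To Normal")]
    static void SetNormal()
    {
        // Normal resolves to RGBM (LDR), which makes DXT5/BC3 and crunch possible again.
        PlayerSettings.SetLightmapEncodingQualityForPlatform(
            BuildTargetGroup.WebGL, LightmapEncodingQualityForPlatform.Normal);
    }
}
```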

Then why is the behaviour different from older Unity versions?
For the WebGL target, it used to be the case that a “Lightmap Encoding” of “High Quality” would resolve to RGBM (which is an LDR encoding). This allowed Unity to choose LDR texture formats such as DXT5/BC3, and indeed it did. But at some point it was decided that a “Lightmap Encoding” of “High Quality” should resolve to HDR when using WebGL. I’m not sure about the exact reason for this, but I suppose it was because WebGL got proper HDR support, and in that case HDR is almost always the better choice. The change meant that projects which had a “Lightmap Encoding” of “High Quality” would now resolve to HDR, and in turn this meant that Unity started choosing another texture format.

Why can’t I enable crunch compression?
The “Use Crunch Compression” toggle does not mean that Unity will necessarily use crunch compression; it only means that it may. Also, crunch compression only works with some texture formats, and BC6 isn’t one of them (I don’t know whether this is true in general, but it is true in Unity). Therefore, in your case Unity is free to ignore the toggle. This is by design, but I understand why the UI/UX may be confusing. Perhaps the toggle should be called “Allow Crunch Compression” instead.
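For reference, the toggle corresponds to a property on the TextureImporter API, and setting it from a script is likewise only a request. A hedged sketch (the asset path parameter is a placeholder):

```csharp
using UnityEditor;

// Editor-only sketch: request crunch compression for a texture asset.
public static class CrunchRequestExample
{
    static void AllowCrunch(string assetPath) // e.g. a lightmap asset path (placeholder)
    {
        var importer = AssetImporter.GetAtPath(assetPath) as TextureImporter;
        if (importer == null)
            return;

        // This only *allows* crunch. If the resolved format (e.g. BC6H for
        // HDR lightmaps) doesn't support crunch, Unity silently ignores it.
        importer.crunchedCompression = true;
        importer.compressionQuality = 50; // 0-100 crunch quality/size trade-off
        importer.SaveAndReimport();
    }
}
```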

Also note that crunch only affects the disk size of the imported texture. It does not affect the size of the data loaded onto the GPU at runtime (the crunched data is de-crunched before being uploaded to the GPU).

How can I fix this?
If my guess above is correct, you should be able to just choose “Normal Quality” as your “Lightmap Encoding”. This tells Unity to use an RGBM encoding, which will allow Unity to pick DXT5/BC3 as before. That lets you use crunch, but be aware that this won’t reduce the size of the data loaded into GPU memory; it only affects the on-disk size of the built game. Note also that crunch is a lossy compression and may cause unwanted artifacts. Also be aware that RGBM only supports a narrow band of values (see the top of this page). This means that if you go beyond those limits (e.g. by increasing your lighting intensities) you may see surprising results.
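To see why RGBM clips bright values, here is a minimal encode/decode sketch in plain C#. The multiplier of 5 is an assumption for illustration; the exact constant Unity uses may differ:

```csharp
using System;

// RGBM sketch: HDR rgb is stored as LDR rgb plus a shared multiplier in alpha.
// Any component brighter than Range cannot be represented and gets clipped.
public static class Rgbm
{
    const float Range = 5f; // assumed multiplier, illustrative only

    public static (float r, float g, float b, float m) Encode(float r, float g, float b)
    {
        float maxComp = Math.Max(r, Math.Max(g, b));
        float m = Math.Clamp(maxComp / Range, 0f, 1f); // clips anything above Range
        if (m <= 0f) return (0f, 0f, 0f, 0f);
        m = MathF.Ceiling(m * 255f) / 255f;            // quantize like an 8-bit alpha
        float scale = m * Range;
        return (Math.Min(r / scale, 1f), Math.Min(g / scale, 1f), Math.Min(b / scale, 1f), m);
    }

    public static (float r, float g, float b) Decode(float r, float g, float b, float m)
        => (r * m * Range, g * m * Range, b * m * Range);
}

// Decode(Encode(10, 0, 0)) comes back as roughly (5, 0, 0): the extra intensity is lost.
```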

Therefore, to avoid surprises and artifacts, I generally recommend you use “High Quality” and let Unity choose the best format. But of course, this comes with the drawback of a larger disk size (the GPU memory size will be the same regardless of whether it is crunched or not).

Let me know if this fixes your issue.


If HDR is disabled in the render pipeline, URP in our case, shouldn’t that setting override the lightmap settings? Or is it only HDR in the sense that it uses additional lighting data to create a more precise map, without actually being encoded in HDR for display?

Especially considering HDR is not documented as supported on Web, according to the Unity documentation:
[Screenshot of the lightmap encoding platform support table from the Unity documentation]

Nevertheless, this was a great read, and indeed this is badly documented, especially for WebGL. Lightmaps are easily 10–20% of our build sizes, and being able to crunch-compress them, even with artifacts (partially mitigated with higher resolutions), saved 50% of the size.

One quick fix to the documentation would be to link the “Lightmap Encoding” setting on the player settings page (https://docs.unity3d.com/Manual/class-PlayerSettingsWebGL.html) to the well-documented https://docs.unity3d.com/Manual/Lightmaps-TechnicalInformation.html page.

Client loading times and download times are the bane of the Web Platform with Unity…


Thank you for all this information and support.

Setting “Lightmap Encoding” to “Normal Quality” in the project settings solved the issue.
It’s a bit sneaky, but now I’m able to crunch my lightmaps like before.

Have a good day.


It’s clear we need to update our documentation with better information about this. I’ll pass this on to our docs team.


Before I can answer that, can you please expand on what you mean by “If HDR is disabled in the rendering pipeline”?

Regarding the docs screenshot you posted: the docs are wrong, and I, coincidentally, updated that just a few days ago (I don’t know when this change will hit the official public docs, but hopefully soon). After my change, the WebGL row says “RGBM / RGBM / HDR”. cc @unityruba


So for HDR, using Unity 2022.3.22f1: we have it disabled in the quality settings, and there’s no HDR option in the player settings.
[Screenshot of the quality and player settings with HDR disabled]

I would thus assume there’s some level of stripping or optimization happening to ensure it is not taken into account.