This topic is surprisingly complicated so I understand if you are confused and/or frustrated. But I believe it works as intended (although our UX is suboptimal so I fully understand why you’d think otherwise).
Why does Unity pick BC6?
My guess is that Lightmap Encoding is set to “High Quality” in your project, at least for the WebGL build target. You can find this setting in Project Settings → Player → (per-platform tab) → Other Settings. This tells Unity that you want lightmap values to be encoded as HDR floating-point values, as opposed to RGBM, where each channel is constrained to the [0, 1] range (note that this encoding choice is orthogonal to texture format and texture compression). Since the lightmap values are HDR, Unity chooses a texture (compression) format that is compatible with HDR, which explains why it chooses BC6 in your case. If you change Lightmap Encoding to Normal or Low, you will probably see that Unity chooses a different texture format better suited to that encoding. The reason Unity doesn’t choose DXT3 is that it is an LDR format and hence isn’t compatible with lightmap values encoded as HDR.
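If you prefer to check this from script rather than digging through the UI, an editor snippet along these lines should do it (the exact PlayerSettings overloads have moved around a bit between Unity versions, so treat this as a sketch rather than something guaranteed to compile on your version):

```csharp
// Editor-only sketch (put it in an Editor folder): log, and optionally change,
// the Lightmap Encoding used for the WebGL build target.
using UnityEditor;
using UnityEngine;

public static class LightmapEncodingCheck
{
    [MenuItem("Tools/Log WebGL Lightmap Encoding")]
    static void LogWebGLLightmapEncoding()
    {
        LightmapEncodingQuality quality =
            PlayerSettings.GetLightmapEncodingQualityForPlatform(BuildTargetGroup.WebGL);
        Debug.Log($"WebGL Lightmap Encoding: {quality}");

        // Uncomment to switch to Normal (RGBM), which lets Unity pick LDR formats
        // such as DXT5/BC3 again:
        // PlayerSettings.SetLightmapEncodingQualityForPlatform(
        //     BuildTargetGroup.WebGL, LightmapEncodingQuality.Normal);
    }
}
```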
Then why is the behaviour different from older Unity versions?
For the WebGL target, it used to be the case that a “Lightmap Encoding” of “High Quality” would resolve to RGBM (which is an LDR encoding). This allowed Unity to choose LDR texture formats such as DXT5/BC3, and indeed it did. But at some point it was decided that a “Lightmap Encoding” of “High Quality” should resolve to HDR when targeting WebGL. I’m not sure about the exact reason for this, but I suppose it was because WebGL got proper HDR support, and in that case HDR is almost always the better choice. The change meant that projects which had a “Lightmap Encoding” of “High Quality” would now resolve to HDR, and in turn this meant that Unity started choosing a different texture format.
Why can’t I enable crunch compression?
The “Use Crunch Compression” toggle does not mean that Unity will necessarily use crunch compression; it only means that it may. Also, crunch compression only works with some texture formats (in Unity, only DXT1, DXT5, ETC and ETC2 have crunched variants), and BC6 isn’t one of them. I don’t know whether this is a limitation of crunch in general, but it is the case in Unity. Therefore, in your case Unity is free to ignore the toggle. This is by design, but I understand why the UI/UX may be confusing. Perhaps the toggle should be called “Allow Crunch Compression” instead.
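If you want to convince yourself that the toggle really is being ignored, you can compare the importer’s crunch setting with the format that was actually imported. Something along these lines (the asset path is just a placeholder for one of your own lightmaps, and the editor’s active build target needs to be WebGL for the reported format to be the WebGL one):

```csharp
// Editor-only sketch (put it in an Editor folder): compare the importer's crunch
// toggle with the format that was actually imported. The asset path is just a
// placeholder; point it at one of your own lightmaps. texture.format reflects the
// editor's active build target, so switch the editor to WebGL first.
using UnityEditor;
using UnityEngine;

public static class CrunchInspector
{
    [MenuItem("Tools/Inspect Lightmap Crunch")]
    static void Inspect()
    {
        const string path = "Assets/Scenes/MyScene/Lightmap-0_comp_light.exr"; // placeholder

        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        var texture = AssetDatabase.LoadAssetAtPath<Texture2D>(path);

        // This reads the Default settings; if you override per platform, use
        // importer.GetPlatformTextureSettings("WebGL") instead.
        Debug.Log($"Crunch requested: {importer.crunchedCompression}, " +
                  $"imported format: {texture.format}");
        // With High Quality (HDR) lightmap encoding you will typically see BC6H here
        // even when crunch is requested, because BC6H has no crunched variant.
    }
}
```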
Also note that crunch only affects the on-disk size of the imported texture. It does not affect the size of the data loaded onto the GPU at runtime (the crunched data is de-crunched before being uploaded to the GPU).
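You can see this for yourself by logging the runtime memory Unity reports for your lightmaps; the numbers should be the same whether crunch is on or off. A rough sketch:

```csharp
// Runtime sketch: log the runtime memory Unity reports for the scene's lightmaps.
// The numbers should not change when "Use Crunch Compression" is toggled, because
// the crunched data is decompressed on the CPU before it is uploaded to the GPU.
using UnityEngine;
using UnityEngine.Profiling;

public class LightmapMemoryLogger : MonoBehaviour
{
    void Start()
    {
        foreach (LightmapData data in LightmapSettings.lightmaps)
        {
            if (data.lightmapColor == null)
                continue;

            long bytes = Profiler.GetRuntimeMemorySizeLong(data.lightmapColor);
            Debug.Log($"{data.lightmapColor.name}: {bytes} bytes of runtime memory");
        }
    }
}
```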
How can I fix this?
If my guess above is correct, you should be able to just choose “Normal Quality” as your “Lightmap Encoding”. This tells Unity to use an RGBM encoding, which will allow it to pick DXT5/BC3 as before. That in turn lets you use crunch, but be aware that this won’t reduce the size of the data loaded into GPU memory; it only affects the size of the built game on disk. Note also that crunch is a lossy compression and may cause unwanted artifacts. Also be aware that RGBM only supports a narrow band of values (see the top of this page). This means that if you go beyond those limits (e.g. by increasing your lighting intensities) you may see surprising results.
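To make the range limitation concrete, here is a rough illustration of how RGBM encoding behaves. This is not Unity’s exact implementation; I’m assuming a multiplier of 5, which is what Unity uses for RGBM lightmaps in gamma space if I remember correctly:

```csharp
// Rough illustration of RGBM encoding, not Unity's exact implementation. With an
// assumed multiplier of 5, anything brighter than 5 is clamped on encode and lost.
using UnityEngine;

public static class RgbmDemo
{
    const float Multiplier = 5f; // assumed range ceiling

    public static Color Encode(Color hdr)
    {
        float maxChannel = Mathf.Max(Mathf.Max(hdr.r, hdr.g), Mathf.Max(hdr.b, 1e-6f));
        float m = Mathf.Clamp01(maxChannel / Multiplier); // values above Multiplier clamp here
        m = Mathf.Ceil(m * 255f) / 255f;                  // quantise like an 8-bit alpha channel
        float scale = m * Multiplier;

        // Clamping each channel mimics the [0, 1] storage range of an LDR texture.
        return new Color(Mathf.Clamp01(hdr.r / scale),
                         Mathf.Clamp01(hdr.g / scale),
                         Mathf.Clamp01(hdr.b / scale),
                         m);
    }

    public static Color Decode(Color rgbm)
    {
        float scale = rgbm.a * Multiplier;
        return new Color(rgbm.r * scale, rgbm.g * scale, rgbm.b * scale, 1f);
    }
}
```

For example, Decode(Encode(new Color(10f, 0f, 0f))) comes back as (5, 0, 0), which is exactly the kind of surprising result I mean.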
Therefore, to avoid surprises and artifacts, I generally recommend that you use “High Quality” and let Unity choose the best format. But of course, this comes with the drawback of a larger disk size (the GPU memory size will be the same regardless of whether the data is crunched or not).
Let me know if this fixes your issue.