Is there any way to use 16bpc/64bpp textures in Unity, other than splitting it into two 8bpc textures and combining them in a shader?
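For context, the CPU side of the split/combine workaround I mean looks roughly like this (a minimal sketch; the class and method names are mine):

```csharp
using UnityEngine;

public static class SixteenBitSplit
{
    // Split one 16-bit channel into two 8-bit textures (high and low bytes).
    // A shader can then recombine the samples as (hi * 256 + lo) / 65535.
    // Assumes hi and lo are readable RGBA32 textures sized to match src.
    public static void Split(ushort[] src, Texture2D hi, Texture2D lo)
    {
        var hiPixels = new Color32[src.Length];
        var loPixels = new Color32[src.Length];
        for (int i = 0; i < src.Length; i++)
        {
            byte h = (byte)(src[i] >> 8);   // high byte
            byte l = (byte)(src[i] & 0xFF); // low byte
            hiPixels[i] = new Color32(h, h, h, 255);
            loPixels[i] = new Color32(l, l, l, 255);
        }
        hi.SetPixels32(hiPixels);
        lo.SetPixels32(loPixels);
        hi.Apply();
        lo.Apply();
    }
}
```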
Unity supports 16 bpc floating-point textures via .hdr and .exr files. That’s a little different from the unsigned linear 16-bit format that something like PNG or TIFF supports.
However, there are some caveats. Unity will always apply a gamma curve to the data if you’re using a Gamma color space project. It also defaults to BC6H, which is the only compressed format available on PC that supports more than 8 bits per channel. BC6H only supports RGB color data, so if you need the alpha channel you have to manually change the format to an uncompressed one.
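If you need to do that for more than a handful of textures, it can be scripted with an AssetPostprocessor. Here’s a rough sketch (the “_16bit” name filter is my own convention, and RGBAHalf is just one possible uncompressed format):

```csharp
using UnityEditor;

// Editor script: force an uncompressed 16-bit float format on import so the
// alpha channel survives (BC6H would drop it).
public class SixteenBitAlphaPostprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        if (!assetPath.Contains("_16bit")) return;

        var importer = (TextureImporter)assetImporter;
        var settings = importer.GetPlatformTextureSettings("Standalone");
        settings.overridden = true;
        settings.format = TextureImporterFormat.RGBAHalf; // uncompressed, keeps alpha
        importer.SetPlatformTextureSettings(settings);
    }
}
```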
There’s been word of active work internally at Unity to support 16-bit PNGs properly, but so far I haven’t seen it show up in the alpha releases of 2020.2.
The 2020.2 alpha release notes include: “Graphics: Added support for importing 16bit per channel integer formats without quantization. This also exposed new TextureFormats R16G16, R16G16B16 & R16G16B16A16”.
Oh fantastic!! Just in time for me, hahah. I’m not sure if I can trust the current project to an alpha, but it’s good to know that it’s already working in the pipeline; maybe I’ll just settle for lower precision for now and switch over once it’s stable. Thanks!
Actually, I’ve installed the alpha (2020.2.0a15.1993) and I’m not seeing any of those formats appear as an option for a 16 bpc PNG that I’ve added to the project… Is there some setting I have to change on the project before I can use these new formats? (According to the release notes, this should have been in since 2020.2.0a12.)
I checked 2020.2.0a13 and it does appear to be in, though the format names in the Texture Importer have never really matched the internal format names. The mapping is as follows (a quick way to verify at runtime is sketched after the list):
R16_UNorm = R 16 bit
R16G16_UNorm = RG 32 bit
R16G16B16_UNorm = RGB 48 bit
R16G16B16A16_UNorm = RGBA 64 bit
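To confirm which internal format an import actually resolved to, logging it at runtime works. A minimal sketch (the component name is mine; assign a 16-bit import in the Inspector):

```csharp
using UnityEngine;

// Quick runtime check: log both the TextureFormat and the underlying
// GraphicsFormat of an imported texture to confirm the mapping above.
public class LogTextureFormat : MonoBehaviour
{
    public Texture2D texture;

    void Start()
    {
        Debug.Log($"{texture.name}: {texture.format} / {texture.graphicsFormat}");
        // e.g. an RGBA 64 bit import should log: RGBA64 / R16G16B16A16_UNorm
    }
}
```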
Compared to 2020.2.0a10, it is definitely not quantizing the image anymore. I tested 16 bit .png, .psd, and .tif, both in greyscale and RGB, and am seeing the expected results.
The results for .exr are still munged for non-color data, but that’s a separate problem.
Oh wow, your screenshot was the missing piece: it looks like they only appear under “override for PC…”, not under “Default” (even though my test project is ONLY for PC!)
Is that a bug that should be reported, or is that intended behaviour? (If so, is it documented anywhere?)
Not a bug. Working as intended. If you want a specific format, you need to use the override to select it. Otherwise it’ll default to a “best guess”, which for most people is going to be a DXT1/5 texture.
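If you’re curious what that “best guess” would be for a given texture, the importer can tell you. A small editor sketch (the menu path is my own):

```csharp
using UnityEditor;
using UnityEngine;

// Editor utility: log the automatic format Unity would pick for the
// selected texture on Standalone, to see whether an override is needed.
public static class ShowAutomaticFormat
{
    [MenuItem("Tools/Log Automatic Texture Format")]
    static void Log()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null)
        {
            Debug.Log("Select a texture asset first.");
            return;
        }
        Debug.Log($"{path}: automatic Standalone format = {importer.GetAutomaticFormat("Standalone")}");
    }
}
```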
I have a question about these new formats. Does this mean there’s a way to retrieve the full-precision values from the image, instead of going through Color, which loses a lot of information?
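For what it’s worth, Texture2D.GetPixelData (available since 2020.1) looks like the way to read these without going through Color. A minimal sketch, assuming an RGBA64 import with Read/Write enabled (the component name is mine):

```csharp
using Unity.Collections;
using UnityEngine;

// Sketch: GetPixelData gives a direct view of the raw texture data
// instead of converting it through Color.
public class ReadRaw16Bit : MonoBehaviour
{
    public Texture2D texture; // an RGBA64 (16 bpc) import, Read/Write enabled

    void Start()
    {
        // One ushort per channel, laid out R, G, B, A per pixel.
        NativeArray<ushort> data = texture.GetPixelData<ushort>(0);
        ushort firstRed = data[0];
        Debug.Log($"First red value: {firstRed} / 65535");
    }
}
```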