How to transcode an HDR supported texture format

Hi there.

I am attempting to decode a lightmap (RGBD PNG) into an RGB9e5Float texture or an RGBAHalf texture.

I am able to load my own data into an RGBA32 texture pretty easily with the Color32 struct…but I cannot find a corresponding struct or sample code for loading data into an RGB9e5Float or RGBAHalf texture.

I feel like I am missing something trivial, and the Unity docs aren’t making things any clearer for me.

Any help would be greatly appreciated.

All HDR images simply use values larger than 1.0, but how that range is encoded varies case by case, depending on the exact format. I don’t know the exact specifications, but Color32 for sure cannot hold HDR values because it’s limited to 8-bit channels.

Check out ColorUsageAttribute for use with normal floating-point Colors.

Hi orionsyndrome

Thanks for replying

I am aware that Color32 wouldn’t work for me, but it is the only struct available to me. I’d expect to find an RGB9e5Float struct and an RGBAHalf struct.

So I have an array of bytes that are encoded as RGBA32, i.e. I am able to call:
texture.LoadRawTextureData((IntPtr)data, (int)length); on a fresh RGBA32 texture.

The alpha value of each pixel is going to be used as a divisor to achieve larger HDR values. The logic looks something like this:

float3 DecodeRGBD(float4 rgbd)
{
    // alpha stores the divisor, so small alphas reconstruct large HDR values
    return rgbd.rgb * ((MaxRange / 255.0) / rgbd.a);
}

(Assuming MaxRange is 65025)
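On the C# side, the equivalent decode would look something like this (my sketch; the class name and the divide-by-zero guard are placeholders, not anything from Unity):

using UnityEngine;

static class RgbdDecoder
{
    const float MaxRange = 65025f;

    // Mirrors the HLSL above: alpha is the divisor, so small alphas yield large HDR values.
    public static Color DecodeRGBD(Color32 rgbd)
    {
        float a = Mathf.Max(rgbd.a / 255f, 1f / 255f); // guard against a zero alpha
        float scale = (MaxRange / 255f) / a;
        return new Color(rgbd.r / 255f * scale,
                         rgbd.g / 255f * scale,
                         rgbd.b / 255f * scale,
                         1f);
    }
}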

I do not want to do this calculation at the shader level; that would be inefficient. I want to pack this transformed data into an HDR-compatible (ideally compressed) texture format. I am having trouble achieving this in Unity.

Do you mean on the C# side or HLSL? Because there is also Color, which is the struct you actually want. Color32 is strictly a 32-bit integer (4 bytes), while Color is 4x 32-bit floats (16 bytes). You can’t possibly pack HDR information with Color32, and Unity likely expects you to avoid it, at least that’s my (educated) guess. Now I wish I had some solid experience with HDR from the engineering side of things, but alas, I haven’t done anything with it, if we don’t count gawking at the end product of the ToneMapper.

If I wasn’t clear: you simply don’t have enough information entropy to encode HDR with Color32, because it’s designed to encode 24-bit True Color + alpha, which is just 8 bits per channel, or 256 shades. You can divide all day long, but the losses are immense: with a max range of 2^9 you have only 128 values left to encode LDR; with a max range of 2^10, only 64, and so on. You progressively lose information until, at a max range of 2^16, your entire LDR fits into just one value, black or white. Maybe you can make an HDR Nintendo Game Boy with it, if you allow me to indulge myself with a bad joke.

C# side.

Ah, thanks for that information. I will try that struct with an RGBAFloat texture format.
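Something like this is what I have in mind (a sketch, assuming pixels already holds the decoded HDR colors, row-major, length == width * height):

using UnityEngine;

static class HdrUpload
{
    public static Texture2D Upload(Color[] pixels, int width, int height)
    {
        var tex = new Texture2D(width, height, TextureFormat.RGBAFloat, false, true); // no mips, linear
        tex.SetPixels(pixels); // Color is full float, so HDR values survive the upload
        tex.Apply(false, true); // push to the GPU and drop the readable CPU copy
        return tex;
    }
}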

I think my problem comes down to low level data manipulation in C#.
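E.g. for RGB9e5 specifically (9-bit mantissas sharing a 5-bit exponent, bias 15), I assume there is no helper struct and I’d have to pack the 32-bit words by hand before calling LoadRawTextureData. A rough, untested sketch based on the standard shared-exponent conversion rules:

using System;

static class Rgb9e5Packer
{
    const int MantissaBits = 9;
    const int ExpBias = 15;
    const float MaxValue = (511f / 512f) * 65536f; // largest representable value

    public static uint Pack(float r, float g, float b)
    {
        r = Math.Min(Math.Max(r, 0f), MaxValue);
        g = Math.Min(Math.Max(g, 0f), MaxValue);
        b = Math.Min(Math.Max(b, 0f), MaxValue);

        float maxc = Math.Max(r, Math.Max(g, b));
        if (maxc == 0f)
            return 0u;

        // shared exponent first, then one 9-bit mantissa per channel
        int exp = Math.Max(-ExpBias - 1, (int)Math.Floor(Math.Log(maxc, 2.0))) + 1 + ExpBias;
        double denom = Math.Pow(2.0, exp - ExpBias - MantissaBits);
        if ((int)Math.Floor(maxc / denom + 0.5) == 1 << MantissaBits)
        {
            exp++;        // mantissa rounded up to overflow; bump the exponent
            denom *= 2.0;
        }

        uint rm = (uint)Math.Floor(r / denom + 0.5);
        uint gm = (uint)Math.Floor(g / denom + 0.5);
        uint bm = (uint)Math.Floor(b / denom + 0.5);
        return rm | (gm << 9) | (bm << 18) | ((uint)exp << 27);
    }
}

Since each channel decodes back as mantissa * 2^(exp - 15 - 9), round-tripping a few known values should be an easy sanity check.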

Haha I too wish that gawking at tonemapper results gave me HDR powers!


Don’t forget that attribute as well. I think it matters.

I.e. (this was the actual example I found)

[ColorUsage(false, true)] public Color hdrColorWithoutAlpha = Color.white;

Thanks orionsyndrome

I am aware of this attribute and have used it. However, the problem I am dealing with doesn’t involve public input; it’s purely data.

I know what you mean, but frankly, I haven’t used it; maybe it grants additional powers in other contexts? It’s hard to tell.