Using a 16-bit texture? (for GPU terrain)

I’m rolling my own terrain implementation, but I’ve run into a problem. While Unity accepts an EXR, it won’t let me retain the 16-bit data, even as a compressed 8-bit RGBA. Is there a way to do it? Or should I come up with a Photoshop action that emulates EncodeFloatRGBA()?

Never mind, I made a quick console app which converts it for me, mimicking EncodeFloatRGBA - I’ll make sure it’s fine tonight and post it. It’s incredibly basic - it searches for a grayscale 16-bit PNG in the same directory that isn’t called output.png, and poops out an 8-bit, four-channel image called output.png.
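
The core of it is just a loop over the samples. A minimal sketch (hypothetical helper - assumes the 16-bit samples have already been read into a ushort[], and that the Float4, EncodeFloatRGBA and Float4.ToColor() shown further down are in scope; needs using System.Drawing; and using System.Drawing.Imaging;):

        private Bitmap EncodeHeightmap(ushort[] samples, int width, int height)
        {
            var output = new Bitmap(width, height, PixelFormat.Format32bppArgb);
            for (int y = 0; y < height; y++)
            {
                for (int x = 0; x < width; x++)
                {
                    // Divide by 65536 rather than 65535 so v stays strictly below 1,
                    // which the EncodeFloatRGBA packing needs.
                    float v = samples[y * width + x] / 65536f;
                    Float4 enc = EncodeFloatRGBA(v);
                    output.SetPixel(x, y, enc.ToColor());
                }
            }
            return output; // caller saves it with output.Save("output.png", ImageFormat.Png)
        }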

I’m having a bit of trouble with the C# interpretation. The result is that all channels except R are black. Looking at the code I can kinda see how it would come out that way - which is exactly why I don’t get it.

        // C# port of Unity's shader-side EncodeFloatRGBA: packs a float in [0, 1)
        // into four 8-bit channels. Note it only really works for v strictly below 1.
        private Float4 EncodeFloatRGBA(float v)
        {
            if(v < 0f || v > 1f)
                throw new Exception("Out of range");

            // Same constants as UnityCG.cginc; 16581375 = 255^3.
            var kEncodeMul = new Float4(1f, 255f, 65025f, 16581375f);
            const float kEncodeBit = 1f / 255f;
            var enc = kEncodeMul * v;

            // Keep only the fractional part of each scaled value...
            enc = Float4.Frac(enc);
            // ...then subtract the part carried by the next channel so the decode
            // dot product doesn't count it twice (the shader's enc -= enc.yzww * kEncodeBit).
            enc -= (new Float4(enc.G, enc.B, enc.A, enc.A) * kEncodeBit);

            return enc;
        }

Float4 is here: Pastebin.com (namespace HeightmapEncoder, using System and System.Drawing)
(the return value of 255 for the alpha in Float4.ToColor() was for testing)
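
For sanity checking, here is a decode counterpart - a sketch mirroring Unity’s shader-side DecodeFloatRGBA dot product, assuming the same Float4 type with R/G/B/A fields:

        private float DecodeFloatRGBA(Float4 enc)
        {
            // Mirrors dot(enc, float4(1, 1/255, 1/65025, 1/16581375)).
            return enc.R
                 + enc.G / 255f
                 + enc.B / 65025f
                 + enc.A / 16581375f;
        }

Pushing a value through EncodeFloatRGBA and back should reproduce it to well within 1/65535, which makes it easy to rule out the packing itself before blaming the image loading.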

edit: I think Windows’ Image.FromFile() is reading it in at 8 bit, which would explain why only the red channel gets any data.
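
If you want to confirm that, just print what GDI+ reports (the file name here is a placeholder, but the calls are stock System.Drawing):

        using (var img = Image.FromFile("heightmap16.png")) // hypothetical path
        {
            // A 16-bit grayscale PNG would need a 16bpp format here; anything
            // 8-bit-per-channel means the precision was already gone on load.
            Console.WriteLine(img.PixelFormat);
        }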

edit2: That was it - rolled a FreeImage solution and it looks more like it.
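
For reference, the FreeImage route looks roughly like this - a sketch against the FreeImageAPI (FreeImage.NET) wrapper, with the exact call names written from memory, so treat them as assumptions and check the wrapper docs; needs using FreeImageAPI; and using System.Runtime.InteropServices;:

        private ushort[] Load16BitSamples(string path, out int width, out int height)
        {
            FIBITMAP dib = FreeImage.LoadEx(path);
            if (FreeImage.GetImageType(dib) != FREE_IMAGE_TYPE.FIT_UINT16)
                throw new Exception("Expected a 16-bit grayscale image");

            width = (int)FreeImage.GetWidth(dib);
            height = (int)FreeImage.GetHeight(dib);
            var samples = new ushort[width * height];
            var row = new short[width];

            for (int y = 0; y < height; y++)
            {
                // FreeImage stores scanline 0 at the bottom of the image.
                IntPtr scan = FreeImage.GetScanLine(dib, y);
                Marshal.Copy(scan, row, 0, width);
                for (int x = 0; x < width; x++)
                    samples[(height - 1 - y) * width + x] = (ushort)row[x];
            }

            FreeImage.UnloadEx(ref dib);
            return samples;
        }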

Something’s off…

Original heightmap (16-bit grayscale PNG) and output (8-bit PNG):
http://www.dukecg.net/Frontier/Heightmap16to8.zip

Personally, if the data is 16 bit I’d use ARGB4444 (which is itself 16 bits per pixel, obviously) - or is it RGBA4444? - and of course with no compression, though you’d need to work out the new encoding/decoding.
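
For what it’s worth, a 4444-style packing could look something like this - not from the thread, just a sketch reusing the Float4 type above, splitting the 16-bit height into one 4-bit nibble per channel:

        private static Float4 EncodeUInt16ToNibbles(ushort h)
        {
            // One 4-bit nibble per channel, stored as 0..1 in 1/15 steps,
            // which an uncompressed ARGB4444 texture preserves exactly.
            return new Float4(
                ((h >> 12) & 0xF) / 15f,
                ((h >>  8) & 0xF) / 15f,
                ((h >>  4) & 0xF) / 15f,
                ( h        & 0xF) / 15f);
        }

        private static ushort DecodeNibblesToUInt16(Float4 enc)
        {
            // Round each channel back to its nibble and reassemble the 16-bit value.
            return (ushort)(((int)(enc.R * 15f + 0.5f) << 12) |
                            ((int)(enc.G * 15f + 0.5f) <<  8) |
                            ((int)(enc.B * 15f + 0.5f) <<  4) |
                             (int)(enc.A * 15f + 0.5f));
        }

The shader-side decode would be the same idea: scale each channel back to 0..15, dot with (4096, 256, 16, 1), and divide by 65535 to get the height back in 0..1.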

As to the code you posted, are you aware that you have GBAA instead of GBRA?

I don’t remember there being a GBRA format - isn’t it meant to be ARGB? Though I guess as long as you are consistent it won’t matter. You may have to account for endianness, though.

I’m pretty sure he’s talking about 16 bits per channel, not 16 bits per pixel :wink:

I was just following the shader code, which does the same thing with .yzww (y, z, w, w maps to G, B, A, A here).

Swapped things around a bit - used RBGA in the shader, so maybe something is messed up elsewhere. Whatever - it looks great!