Grayscale (uncompressed) texture type

I’m not 100% sure this falls in the ShaderLab category, but I would like to suggest the addition of an uncompressed grayscale texture type. (So besides the Compressed, 16 bit and Truecolor presets.)

This type could be used for light maps, occlusion maps or dirt maps, among others. With compression these might not look very good, but if they contain no color information they could be stored in just 8 bits per pixel uncompressed. I’m not sure all targets support this, but DirectX at least does (the L8 format in DDS), and OpenGL has GL_LUMINANCE, which is also mentioned to work in OpenGL ES. Truecolor is a logical fallback for any platform that doesn’t support it anyway.

Or maybe it’s already in there and I’m missing an option somewhere.

Texture type: Advanced, format: Alpha8. Is that what you want?

Not really. In that case the information is actually stored in the alpha and has to be read from the alpha channel in the shader. I guess that’s also why it’s located under the Truecolor group. It’s mainly intended for cookie textures I think.

I’m looking for a solution that actually stores the data in a single color channel and duplicates it into red, green and blue during sampling. That way the shader doesn’t need to know the difference from other texture types. As far as I can see these texture formats are available in both DirectX and OpenGL, but are just not supported in Unity.
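To illustrate the difference (a sketch; the `_MainTex` name is just for illustration, and the replication behavior is how GL_LUMINANCE-style formats are specified, not something Unity currently exposes): with a hardware luminance format the sampler expands the single channel into RGB itself, so existing shader code keeps working unchanged, while Alpha8 forces a format-specific swizzle.

```
// With a luminance (single-channel, hardware-expanded) texture, the
// sampled value appears replicated in r, g and b, so the usual lookup
// needs no change:
fixed3 c = tex2D(_MainTex, i.uv).rgb;

// With Alpha8 the data lives in the alpha channel instead, so the
// shader has to know about the format and swizzle explicitly:
fixed3 c2 = tex2D(_MainTex, i.uv).aaa;
```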

In your case, a greyscale value duplicated into each channel is a 24 bit texture. You can’t ask for grey in RGB without driver support. You can only choose from the formats Unity exposes in the TextureFormat enum.

TextureFormat.Alpha8 is your best bet; otherwise, if you want it to work on all platforms without a shader change, you will need to use 24 bit, or RGB565 if you don’t notice any quality loss.

As you know, Unity doesn’t support it, but it is a driver-supported feature. For Unity to support it, they’d need to change quite a bit. I suggest you submit a feature request. In Unity’s case I suspect they simply expected Alpha8 to cover it; for me, it does. Perhaps just tweak your shaders in the meantime?

It was kind of meant as a feature request; I also wanted to see if others would find it interesting. For now I’m just using 24 bit RGB textures in these cases, but it seems to me that supporting 8 bit grayscale textures could be quite useful. I would imagine it doesn’t take that much effort to add to Unity, though it is of course ten times more work than it seems at first glance.

What is it you need it for? I’m not sure why Alpha8 doesn’t satisfy your needs…

If you want to sample it to an RGB vector then you can simply use something like
fixed3 greyscaleRGB = tex2D(_MyAlpha8Tex, i.uv).aaa;

Or if you’re importing textures, you could write an asset importer script that automatically takes the data from R, G or B and puts it into the alpha channel of a new image that can be used as Alpha8.

Of course I could do all that, but that means I’d need a different shader for versions with grayscale input. For diffuse I probably won’t use it, but as I said I can imagine it comes in handy for a light map or a dirt map. That already means I need four different shaders that essentially do the same thing (LrgbDrgb, LrgbDa, LaDrgb and LaDa; I hope my naming scheme here is clear).

So, considering that most platforms support an uncompressed grayscale format, it seems more flexible to me to keep the shader the same and let the hardware do the grayscale-to-RGB expansion. That way it’s up to the artist to decide whether a texture needs color information, and nobody needs to worry about whether a shader actually exists that supports a grayscale-coded version of the specific map.

Why not have an RGB texture with the same value in each color channel? I think what you’re trying to do doesn’t really make sense. Somewhere, at some point, the data has to be expanded from one channel to three, so either you do it in the texture itself or in the shader after it reads the texture. There is no other possibility.

Hmm, until then, if you want to keep it as simple as possible for the user, you could write a custom material inspector and use keywords that get set by the inspector depending on the format of the map you plug in?

That should keep the shader code to a minimum, too.
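A minimal sketch of that keyword approach (the `_GRAYSCALE_MAP` keyword and the `_DirtMap` property are made-up names for illustration): the custom inspector would enable the keyword whenever it detects an Alpha8 map, and a single shader handles both layouts.

```
#pragma shader_feature _GRAYSCALE_MAP

sampler2D _DirtMap;

fixed3 SampleDirt(float2 uv)
{
#ifdef _GRAYSCALE_MAP
    // Alpha8 map: the data is stored in the alpha channel
    return tex2D(_DirtMap, uv).aaa;
#else
    // Regular RGB map
    return tex2D(_DirtMap, uv).rgb;
#endif
}
```

Since `shader_feature` variants are stripped when unused, this costs nothing at runtime for materials that only ever use one of the two layouts.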

Because an uncompressed grayscale texture is three times smaller (on disk, in memory and in GPU memory) than an uncompressed RGB texture: a 1024×1024 texture is 1 MB at 8 bits per pixel versus 3 MB at 24. Not wasting storage and memory makes sense to me. There is currently no way to have a grayscale texture in Unity besides storing it in the Alpha8 format, for example. My point is that DirectX and OpenGL both offer uncompressed grayscale textures, and I wonder why this option is not available in Unity.

I understand there are a hundred ways to get grayscale information into a shader in Unity. I’m just wondering why the most straightforward option is not available. I like the option to save memory when my texture is grayscale only.

Until then I’ll just store the grayscale data in an RGB texture, of course. At some point I might start using other tricks to save memory on grayscale images. I just don’t see why there is no hardware support for them from the start in Unity.

How about
fixed3 color = tex.a > 0 ? tex.aaa : tex.rgb;
This should support both, as long as your colored textures have an alpha channel equal to zero.

Hmm, yes, there is an INTENSITY8 format in OpenGL, for example, that could be used, but Unity doesn’t make it available. I guess it’s up to them, their reasons and the demand for it. Similarly, there has been a glTexSubImage2D() function in OpenGL since version 1.1, a decade ago, which allows you to upload only a part of a texture into an existing texture without having to upload the entire thing; that would make a lot of algorithms way faster, but Unity has never exposed it either.

I just found this, and will look for more; hopefully this mod or others are still around.

Can Unity add support for literal 16-bit images? Each pixel is a ushort or short. Would be nice… the whole 24 bit format isn’t relevant and is causing my conversions to end up as 8 bit data…

Edit to add: it’s also grayscale and not RGB. It needs to be that so that I can use the data setup in the Inspector.

Today’s Unity has support for vastly more formats than when this thread was originally written.

You can use R16_UInt if you want a 0 to 65535 integer, or R16_UNorm if you want a 0.0-1.0 value.
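Note that, unlike the old GL_LUMINANCE behavior discussed above, these single-channel formats do not replicate the value into green and blue; the data arrives in the red channel only, so the shader still has to swizzle it itself. A sketch (the `_HeightTex` name is just for illustration):

```
// R16_UNorm: the value arrives in the red channel only, normalized to
// 0.0-1.0; green and blue read as 0, so replicate .r explicitly when a
// grey RGB value is needed.
float3 grey = tex2D(_HeightTex, i.uv).rrr;
```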