Texture Formats

Hi there!

I was looking at this page: Unity - Scripting API: TextureFormat

and I'm wondering: which format is best? Which format is the lightest and has the highest performance?

I saw a lot of formats and I don't know what the difference is.

Anyone?
Thanks!!!

There is no single best format; it depends on what you need. Generally, the fewer bytes you can get away with, the better, because you use less bandwidth.

Compressed formats only work for regular textures, not for render textures. Compressed textures use much less VRAM, but there is some quality loss. Different platforms support different texture compression algorithms, so you need to check which ones are supported on your target platform. The compression formats also have different use cases; some, for example, are designed for normal maps.
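If you want to check at runtime which compressed formats the current device actually supports, SystemInfo can tell you. A minimal sketch, assuming a few common candidate formats (which ones matter depends on your targets):

```csharp
using UnityEngine;

public class FormatSupportCheck : MonoBehaviour
{
    void Start()
    {
        // Example candidates only; pick the formats relevant to your targets.
        TextureFormat[] candidates =
        {
            TextureFormat.DXT1,       // desktop, no alpha (BC1)
            TextureFormat.DXT5,       // desktop, with alpha (BC3)
            TextureFormat.ASTC_6x6,   // modern mobile
            TextureFormat.ETC2_RGBA8  // older Android
        };

        foreach (var format in candidates)
            Debug.Log($"{format} supported: {SystemInfo.SupportsTextureFormat(format)}");
    }
}
```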

Ask yourself how many channels you need (just R for monochromatic/masks, RGB for color, RGBA for color with alpha). Ask yourself whether you need alpha blending (8 bit alpha) or just alpha cutout (1 bit alpha).

Ask yourself whether you need values outside of a 0-1 range (for example for HDR colors). If yes, you are gonna need a floating point format. Do you need full precision or is half precision enough?

Or do you need to store integer values? In that case you’d need to pick an integer format.

Do you use a linear workflow? In that case you might need an sRGB format.

Or do you need a depth or stencil buffer? How many bits of precision do you need? You usually have the choice between 16-bit depth, 32-bit depth, and a combined depth-stencil format that uses 24 bits for depth and 8 bits for stencil.
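To make those questions concrete, here is roughly how each answer maps to a format when you create textures in code. A sketch only; the sizes and formats are arbitrary examples:

```csharp
using UnityEngine;

public class FormatChoices : MonoBehaviour
{
    void Start()
    {
        // One channel, 8 bits: enough for a monochromatic mask.
        var mask = new Texture2D(256, 256, TextureFormat.R8, false);

        // Four channels, 8 bits each, 0-1 range: color with 8-bit alpha.
        var color = new Texture2D(256, 256, TextureFormat.RGBA32, false);

        // Half-precision float channels: values outside 0-1, e.g. HDR colors.
        var hdr = new Texture2D(256, 256, TextureFormat.RGBAHalf, false);

        // Render texture; depth = 24 requests a combined depth-stencil buffer
        // (24-bit depth + 8-bit stencil) on most platforms.
        var rt = new RenderTexture(512, 512, 24, RenderTextureFormat.ARGBHalf);
    }
}
```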


Thank you very much for your answer!
I just want a colored texture with a sprite atlas and an alpha channel for a 2D mobile game.
Which format should I choose?

RGBA32 is a safe choice if you don’t know what you are doing :wink:


Thank you very much!!!

Yes! That's similar to my choice. I used ARGB16 bit instead,
because I thought 16 should be lighter than 32.

But actually, I still don't know…

Another thing you can do is click on one of the textures in your Project window and look at the texture format Unity automatically chose for it. If you play with different import settings, you can see that the format may change depending on which features the texture requires from the options you've selected.
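If you prefer to read this from code rather than the Inspector, a small editor script can log the format of whatever texture you have selected. A sketch; the menu path is made up:

```csharp
using UnityEditor;
using UnityEngine;

public static class LogSelectedTextureFormat
{
    // Hypothetical menu item: select a texture asset, then run this.
    [MenuItem("Tools/Log Selected Texture Format")]
    static void Log()
    {
        var texture = Selection.activeObject as Texture2D;
        if (texture == null) return;

        // The runtime format the asset was actually imported to.
        Debug.Log($"{texture.name}: {texture.format}");
    }
}
```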


Thanks!!

So I just set it to Automatic, right?
Unity should select it automatically.

I'm talking about this:

[screenshot of the texture import settings, with "Compressed DXT1" shown over the preview]

See in the image how it says "Compressed DXT1" over the texture preview? You can see that same import settings screen in your own project. If you turn off compression, for example, the format will change.

The idea is, if you want to know what format to use when coding, just find a texture in your project that has all of the same properties and see which type that Unity chose.
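In editor code you can also ask the importer directly what Automatic resolves to per platform. A sketch; the asset path is a placeholder you'd replace with one of your own textures:

```csharp
using UnityEditor;
using UnityEngine;

public static class AutomaticFormatQuery
{
    [MenuItem("Tools/Log Automatic Format")]
    static void Log()
    {
        // Placeholder path; point this at an actual texture in your project.
        var importer = AssetImporter.GetAtPath("Assets/Sprites/MySprite.png")
                       as TextureImporter;
        if (importer == null) return;

        // What "Automatic" would pick for each build target.
        Debug.Log($"Standalone: {importer.GetAutomaticFormat("Standalone")}");
        Debug.Log($"Android: {importer.GetAutomaticFormat("Android")}");
        Debug.Log($"iOS: {importer.GetAutomaticFormat("iPhone")}");
    }
}
```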


OK, I just want to set it as low as possible for the highest performance. My game only uses textures for sprites with an alpha channel.

ARGB16 is only 4 bits per channel, so red, green, blue and alpha can only have values of 0-15. RGBA32 gives you 8 bits per channel for values of 0-255.

While it could be an interesting challenge to work with a limited colour palette, the last GPU I know of that natively supported palettised textures was in the PS2. Colours get converted to floating point on the GPU anyway, and the GPU will most likely give you better performance, better quality and lower memory usage if you use a compressed format like DXT (on PC) or ASTC (on mobile).

Just select Automatic in the texture’s compression settings and Unity will use the right one for your platform. It will also automatically use a format that supports an alpha channel if you have it enabled on the texture.
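The same setting can be applied from an editor script, which helps if you have many textures to update. A sketch; the path is a placeholder:

```csharp
using UnityEditor;

public static class SetCompressionExample
{
    [MenuItem("Tools/Set Normal Quality Compression")]
    static void Apply()
    {
        // Placeholder path; adjust to one of your own assets.
        var importer = AssetImporter.GetAtPath("Assets/Sprites/MySprite.png")
                       as TextureImporter;
        if (importer == null) return;

        // "Compressed" = the Normal Quality option; Unity then chooses the
        // concrete platform format (DXT, ASTC, ETC2, ...) for you.
        importer.textureCompression = TextureImporterCompression.Compressed;
        importer.SaveAndReimport();
    }
}
```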


Thank you very much, I forgot about this, because I also use crunch-compressed textures, and the format is always set to Automatic.

If you want to use the most modern texture compression formats, you might want to disable crunch compression.
https://docs.unity3d.com/Manual/class-TextureImporterOverride.html

On PC the latest compression format is BC7, but only DXT5|BC3 and DXT1|BC1 support crunch compression. Therefore, if you use Automatic texture compression with crunch enabled it will use DXT5|BC3 instead of BC7. These older formats are still useful if you want to support older hardware though.

On iOS and Android there's the same issue: the latest format, ASTC, does not support crunch compression, so enabling crunch will make Unity fall back to ETC2 or ETC.
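If you decide to turn crunch off so Automatic is free to pick BC7 or ASTC, it's a single importer flag. A sketch; the path is a placeholder:

```csharp
using UnityEditor;

public static class DisableCrunchExample
{
    [MenuItem("Tools/Disable Crunch Compression")]
    static void Apply()
    {
        // Placeholder path; adjust to one of your own assets.
        var importer = AssetImporter.GetAtPath("Assets/Sprites/MySprite.png")
                       as TextureImporter;
        if (importer == null) return;

        // With crunch off, Automatic can choose BC7 (PC) or ASTC (mobile).
        importer.crunchedCompression = false;
        importer.SaveAndReimport();
    }
}
```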

There’s a lot of useful information on that page. BC7 takes up half as much memory as ARGB 16 while offering almost the same quality as ARGB 32. Also, the GPU always treats colours as 4 channel (RGB + alpha) so an RGB 24 texture takes up the same amount of VRAM as ARGB 32.
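For a feel for the numbers, here's the back-of-the-envelope math for a 1024×1024 texture without mipmaps, plus a runtime check. A sketch; note that Profiler.GetRuntimeMemorySizeLong includes mipmaps and padding, so it won't match the naive math exactly:

```csharp
using UnityEngine;
using UnityEngine.Profiling;

public class TextureMemoryMath : MonoBehaviour
{
    public Texture2D texture;

    void Start()
    {
        // 1024 x 1024, no mipmaps:
        //   ARGB32: 32 bits/pixel -> 1024 * 1024 * 4 bytes = 4 MB
        //   ARGB16: 16 bits/pixel -> 2 MB
        //   BC7:     8 bits/pixel -> 1 MB (half of ARGB16)
        //   RGB24 is padded to four channels on the GPU, so it costs 4 MB too.

        if (texture != null)
        {
            long bytes = Profiler.GetRuntimeMemorySizeLong(texture);
            Debug.Log($"{texture.name} ({texture.format}): {bytes / (1024f * 1024f):F2} MB");
        }
    }
}
```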


What would be the best TextureFormat for the Steam Deck? I’m specifically looking for something that gets loaded / unloaded into memory very quickly. Footprint size doesn’t matter as much.

The Steam Deck's APU is considered a desktop GPU rather than a mobile one, so you can find the recommended formats here.

Since you want the best performance, use the formats in this table in the column “Low quality (higher performance)” for “Windows, Linux, macOS”.

  • RGB → DXT1
  • RGBA → DXT5
  • HDR (RGBA Half) → BC6H

Select the format based on whether the texture has an alpha channel and if it needs to be HDR. You might want to use an HDR texture for the skybox since the sun is emitting light rather than just reflecting it.
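If you'd rather pin those formats explicitly than rely on Automatic, a per-platform override does it (the Steam Deck uses the Standalone settings). A sketch; the path is a placeholder, and DXT5 vs DXT1 depends on whether the texture has alpha, as in the list above:

```csharp
using UnityEditor;

public static class StandaloneFormatOverride
{
    [MenuItem("Tools/Force DXT5 On Standalone")]
    static void Apply()
    {
        // Placeholder path; adjust to one of your own assets.
        var importer = AssetImporter.GetAtPath("Assets/Textures/MyTexture.png")
                       as TextureImporter;
        if (importer == null) return;

        var settings = importer.GetPlatformTextureSettings("Standalone");
        settings.overridden = true;
        settings.format = TextureImporterFormat.DXT5; // DXT1 if no alpha
        importer.SetPlatformTextureSettings(settings);
        importer.SaveAndReimport();
    }
}
```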