ARGB32 ... why how?


I have a script that writes to a texture, and it works perfectly in editor mode, yet when I build the project the texture stops updating.
I believe this might come from the fact that the original texture is a PSD that's been overridden to ARGB32 in the import settings, so I'd like to simply create the texture in that format in the first place, without the need to override it.

But I’m struggling to find information on HOW or WHAT creates textures in that format.
Simply put: what’s the extension of an ARGB32 file?
I've tried using Photoshop and the DDS export tool, to no avail: the texture format is reported correctly, but the image properties can't be seen in the Inspector, like so…

Please help, I can’t find a reason why it works in editor mode and doesn’t work in executable mode.

I'm not sure why your script works in the editor but not when compiled as an executable.

That said, ARGB32 is not a file type (hence, no specific extension for it). It describes the "raw" data format of your image and stands for "alpha, red, green, blue, 32-bit": if you visualize your actual image data as a long array of bytes (which it really is), each pixel is represented by 4 bytes, where the first byte is the alpha value, the second is red, the third is green and the fourth is blue - hence the 32 bits per pixel.
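Not Unity code, but to make the layout concrete, here's a small Python sketch (the function names are just for illustration) of how a 2x1 image looks as a flat array of ARGB32 bytes:

```python
def pack_argb32(pixels):
    """Flatten a list of (a, r, g, b) tuples into raw ARGB32 bytes."""
    raw = bytearray()
    for a, r, g, b in pixels:
        raw.extend((a, r, g, b))  # one byte per channel, 32 bits per pixel
    return bytes(raw)

def unpack_argb32(raw):
    """Split raw ARGB32 bytes back into (a, r, g, b) tuples."""
    return [tuple(raw[i:i + 4]) for i in range(0, len(raw), 4)]

# A 2x1 image: one opaque red pixel, one half-transparent blue pixel.
raw = pack_argb32([(255, 255, 0, 0), (128, 0, 0, 255)])
print(len(raw))            # 8 bytes: 2 pixels x 4 bytes each
print(unpack_argb32(raw))  # [(255, 255, 0, 0), (128, 0, 0, 255)]
```

Any 32bpp format with alpha (RGBA, BGRA, …) works the same way; only the order of the four bytes per pixel differs.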

The image file itself contains more than the raw data: it has various headers and meta information, and the raw data itself is often compressed rather than stored as an actual linear array of bytes. ARGB32 refers to the format of the raw data you get after going through the file-type-specific decoding - luckily for us, Unity handles all the file-type specifics when we import assets or load them at runtime, and we are left with only the actual image data.
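To illustrate the headers-plus-compressed-data point, here's a Python sketch (pure standard library, not Unity code) that builds a minimal 1x1 PNG in memory and then digs the raw pixel bytes back out of it. Note that PNG happens to store channels in RGBA order; the ARGB ordering is just Unity's in-memory convention after import:

```python
import struct
import zlib

def chunk(tag, data):
    """A PNG chunk: 4-byte length, 4-byte tag, data, CRC over tag+data."""
    return (struct.pack(">I", len(data)) + tag + data
            + struct.pack(">I", zlib.crc32(tag + data)))

# One red, fully opaque pixel; each scanline starts with a filter byte.
raw_scanline = b"\x00" + bytes((255, 0, 0, 255))

png = (b"\x89PNG\r\n\x1a\n"  # 8-byte file signature, not pixel data
       # IHDR: width=1, height=1, 8 bits/channel, color type 6 (RGBA)
       + chunk(b"IHDR", struct.pack(">IIBBBBB", 1, 1, 8, 6, 0, 0, 0))
       # IDAT: the pixel data, deflate-compressed
       + chunk(b"IDAT", zlib.compress(raw_scanline))
       + chunk(b"IEND", b""))

# The file starts with a signature and headers, not raw pixels:
print(png[:8])

# The raw bytes only appear after locating IDAT and decompressing it:
i = png.index(b"IDAT")
length = struct.unpack(">I", png[i - 4:i])[0]
pixels = zlib.decompress(png[i + 4:i + 4 + length])
print(list(pixels))  # filter byte followed by R, G, B, A
```

This is exactly the kind of file-specific plumbing Unity's importer does for you before handing back a plain pixel array.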

As for which file types can produce ARGB32 data - pretty much anything that encodes images at 32 bpp color depth with an alpha channel; PSD and PNG come to mind.

Also, make sure you're accessing the data of the texture using GetPixels32/SetPixels32; these methods use the Color32 struct, which stores each color as four bytes. Another thing that comes to mind: check whether you've selected any options that compress textures in the build settings - I know these features are available when targeting iOS and Android, but there might be a similar setting for standalone builds as well.