Everything is fine within the Editor.
On an iOS build, however, I get this exception:
ArgumentException: Texture2D.GetPixels: texture uses an unsupported format. (Texture 'TheNameOfMySprite')
What could be the problem?
Details:
Using Unity 2021.3.0f1.
The Sprite is loaded from an AssetBundle.
The source file is a .png; pngcheck reports (256x256, 32-bit RGB+alpha, non-interlaced, 95.2%).
(The image actually contains only grayscale values, but that shouldn't matter for this problem.)
Import Settings:
Sprite Mode: Multiple
(There are two slices that overlap: one is the full image, another is a small subrect from within the image)
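For reference, this is roughly the setup that throws (a minimal sketch; the bundle path and variable names are placeholders, not my actual code):
// Minimal repro sketch (assumes using UnityEngine; bundle path is a placeholder).
var bundle = AssetBundle.LoadFromFile(System.IO.Path.Combine(Application.streamingAssetsPath, "mybundle"));
var sprite = bundle.LoadAsset<Sprite>("TheNameOfMySprite");
// Works in the Editor, throws ArgumentException on the iOS build:
Color[] pixels = sprite.texture.GetPixels();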
Long shot, but I've had inexplicable issues with png files in various engines/platforms, and more often than not it was fixed by opening and saving the png in another program, or by converting to some other lossless format and back. I believe last time it was png files created by Inkscape that consistently came out non-standard/unsupported.
Thanks - I tried re-exporting the png with GIMP; that didn't change anything so far.
Maybe I'll try something like a bmp, but I suspect it's something else.
The input file format should have little effect on the final exported build or on operations with textures once they're in memory. As long as Unity can read them in the Editor, it should convert them during the build to the GPU-native format for the corresponding platform. Read this for more details: Unity - Manual: Texture formats
If you get problems because something isn't supported, one of the first things you should do is read the docs for the function you are using: Unity - Scripting API: Texture2D.GetPixel. They clearly explain the basic requirements and the cases where it is not supported. And the case where it says it doesn't work involves one of the rare settings that can be set for each platform individually, so my guess is that's your problem.
For this function to succeed, Texture.isReadable must be true and the data must not be Crunch compressed.
Regarding the first part, the doc for Texture.isReadable states: "To toggle this, use the Read/Write Enabled setting."
As stated, I do have Read/Write Enabled checked in the import settings, so I considered this not to be the cause of the problem. (However, I haven't yet checked isReadable at runtime on the mobile device and will do so.)
Regarding the second part, I'm going to try GetPixels32 now to find out whether Crunch compression is the problem.
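For anyone following along, a quick runtime log on the device should confirm both conditions (a minimal sketch, assuming a reference to the loaded sprite):
// Logs the two things GetPixels cares about: readability and the actual
// runtime format. Crunched formats show up as TextureFormat values ending
// in "Crunched" (e.g. ETC2_RGBA8Crunched).
var tex = sprite.texture;
Debug.Log($"{tex.name}: isReadable={tex.isReadable}, format={tex.format}");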
However, I'm afraid I simply don't understand the Crunch compression situation. In Player Settings, the texture compression format is set to ASTC for iOS. From what I read, I don't think this is the issue, or it would be resolved by setting it to PVRTC.
I thought that by setting Format: RGBA 32 bit in the import settings, the texture would not be compressed. Is that incorrect?
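One way to see which format actually applies per platform is to read the importer override in an Editor script (a sketch; "iPhone" is the platform string Unity uses for iOS, and the asset path here is a placeholder):
// Editor-only sketch (assumes using UnityEditor and UnityEngine).
var importer = (TextureImporter)AssetImporter.GetAtPath("Assets/Sprites/TheNameOfMySprite.png");
TextureImporterPlatformSettings ios = importer.GetPlatformTextureSettings("iPhone");
// If overridden is true, the iOS build uses ios.format rather than the Default tab's format.
Debug.Log($"overridden={ios.overridden}, format={ios.format}");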
So you know, Texture2D isn’t a 2D feature and isn’t something the 2D team have implemented or maintain; it’s obviously the standard texture for anything in Unity.
In that regard, I’ll move your post to the General Graphics forum.
While I still don't fully understand why my texture apparently gets compressed on iOS, or how to control that, GetPixels32 did solve the problem.
I had to implement a crop routine though, since the (x, y, width, height) overload only exists for GetPixels. It's nothing special, but I'll share it:
using UnityEngine;

public static class Texture2DExtensions {
    // GetPixels32 has no (x, y, width, height) overload, so read the whole
    // texture and crop the requested rect manually.
    public static Color32[] GetPixels32(this Texture2D tex,
                                        int x,
                                        int y,
                                        int width,
                                        int height) {
        var colors = tex.GetPixels32();
        // Clamp the requested rect to the texture bounds.
        if (x < 0) x = 0;
        if (y < 0) y = 0;
        if (width > tex.width - x) width = tex.width - x;
        if (height > tex.height - y) height = tex.height - y;
        // The rect covers the whole texture: return the array as-is.
        if (x == 0 && y == 0 && tex.width == width && tex.height == height) {
            return colors;
        }
        // Copy row by row; GetPixels32 returns rows starting at the bottom-left.
        var result = new Color32[width * height];
        for (int yi = 0; yi < height; yi++) {
            int yOffsetResult = yi * width;
            int yOffset = (y + yi) * tex.width;
            for (int xi = 0; xi < width; xi++) {
                result[yOffsetResult + xi] = colors[yOffset + x + xi];
            }
        }
        return result;
    }
}
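In case it helps, usage with one of the sprite slices looks like this (mySprite is a placeholder for a loaded slice):
// Crop the slice's rect out of the shared texture.
Rect r = mySprite.textureRect;
Color32[] slice = mySprite.texture.GetPixels32(
    (int)r.x, (int)r.y, (int)r.width, (int)r.height);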