Why is NativeArray always shorter than the number of pixels in the texture?

I am trying to serialize a 512x512 texture into bytes for saving to a file using GetRawTextureData, because this approach does not generate garbage. But I am getting an error:

IndexOutOfRangeException: Index 87381 is out of range of '87381' Length.
Unity.Collections.NativeArray`1[T].FailOutOfRangeError (System.Int32 index) (at <2fae0a4cbcec42c9acc616494aa88f69>:0)
Unity.Collections.NativeArray`1[T].CheckElementReadAccess (System.Int32 index) (at <2fae0a4cbcec42c9acc616494aa88f69>:0)
Unity.Collections.NativeArray`1[T].get_Item (System.Int32 index) (at <2fae0a4cbcec42c9acc616494aa88f69>:0)

I don't understand why the native array is shorter than the number of pixels in the texture. Why is the length 87381 when my texture has 512x512 = 262144 pixels? How do I get this to work?

    NativeArray<Color> data;
    byte[] byteArray = new byte[262144];
    byte layer = 0;

    void Save() // does not work
    {
        data = tex.GetRawTextureData<Color>();
        int index = 0;

        for (int y = 0; y < tex.height; y++)
        {
            for (int x = 0; x < tex.width; x++)
            {
                if (data[index].r == 1) // <-- ERROR IS THROWN HERE
                {
                    layer = 0;
                }
                else if (data[index].g == 1)
                {
                    layer = 1;
                }
                else if (data[index].b == 1)
                {
                    layer = 2;
                }
                else if (data[index].a == 1)
                {
                    layer = 3;
                }

                byteArray[index] = layer;
                index++;
            }
        }

        File.WriteAllBytes(Application.persistentDataPath + "/byteArray_" + i.ToString() + ".byte", byteArray);
    }

Short answer

byte[] byteArray = null;
var rawDataPtr = texture.GetRawTextureData<byte>();

// method #1: simplest, but allocates a new managed array on every call
byteArray = rawDataPtr.ToArray();

// method #2: reuses the existing buffer when its size already matches, so no per-call garbage
if( byteArray==null || byteArray.Length!=rawDataPtr.Length ) byteArray = new byte[ rawDataPtr.Length ];
rawDataPtr.CopyTo( byteArray );

Longer answer

“RawTextureData” in GetRawTextureData means this data is what your GPU receives.

(Spoiler alert) It's rarely Color[], i.e. 16 bytes per pixel.

Depending on its TextureFormat, this may well be just a compressed byte[] - the polar opposite of the easy-to-understand, uncompressed Color[] you expect it to be - which breaks the otherwise predictable relation between array length and the number of pixels.
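
As a quick sanity check, a minimal sketch along these lines (assuming the tex field and usings from the question) logs the actual format and compares the raw byte count against the pixel count:

void InspectRawData ()
{
    // standard Unity API: Texture2D.format, Texture2D.GetRawTextureData<T>()
    NativeArray<byte> raw = tex.GetRawTextureData<byte>();
    int pixels = tex.width * tex.height;
    Debug.Log( $"format: {tex.format} , raw bytes: {raw.Length} , pixels: {pixels}" );
    // note: the raw buffer also contains every mipmap level, not just mip 0
}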

Valid <T> in GetRawTextureData<T>, for uncompressed textures, can be chosen based on the texture.format property:

  • TextureFormat.Alpha8 - <byte>
  • TextureFormat.R8 - <byte>
  • TextureFormat.R16 - <ushort>,<byte2>
  • TextureFormat.RHalf - <half>,<byte2>
  • TextureFormat.RFloat - <float>,<byte4>
  • TextureFormat.RGB24 - <byte3>
  • TextureFormat.RGBA32 - <Color32>,<byte4>
  • TextureFormat.RGBAHalf - <half4>,<byte2x4>
  • TextureFormat.RGBAFloat - <Color>,<Vector4>,<float4>,<byte4x4>

so that the number of pixels and the array length match perfectly.
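
Put together, here is a sketch of how the Save() method from the question could look, under the assumption that tex.format is TextureFormat.RGBA32 (check it first; a different format needs a different element type from the list above). Note that Color32 channels are bytes, so a "full" channel is 255 rather than 1.0f:

void Save () // sketch, assumes tex.format == TextureFormat.RGBA32 and the fields/usings from the question
{
    NativeArray<Color32> data = tex.GetRawTextureData<Color32>();// exactly 1 element per pixel (mip 0 comes first if mipmaps exist)
    byte[] byteArray = new byte[ tex.width * tex.height ];
    int index = 0;
    for( int y=0 ; y<tex.height ; y++ )
    for( int x=0 ; x<tex.width ; x++ )
    {
        Color32 pixel = data[index];
        byte layer = 0;
        if( pixel.r==255 ) layer = 0;
        else if( pixel.g==255 ) layer = 1;
        else if( pixel.b==255 ) layer = 2;
        else if( pixel.a==255 ) layer = 3;
        byteArray[index++] = layer;
    }
    // placeholder file name; the question builds its own name here
    File.WriteAllBytes( Application.persistentDataPath + "/byteArray.byte" , byteArray );
}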

Bestiary:

public struct byte2 { public byte x, y; }
public struct byte3 { public byte x, y, z; }
public struct byte4 { public byte x, y, z, w; }
public struct byte2x4 { public byte2 c0, c1, c2, c3; }
public struct byte4x4 { public byte4 c0, c1, c2, c3; }
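
And a hedged usage sketch for one of them, assuming an uncompressed RGB24 texture without mipmaps so that the element count equals the pixel count:

// byte3 matches RGB24's 3 bytes per pixel, so the lengths line up
NativeArray<byte3> rgb = tex.GetRawTextureData<byte3>();
Debug.Log( $"pixels: {tex.width * tex.height} , elements: {rgb.Length}" );
byte3 first = rgb[0];
Debug.Log( $"first pixel: r={first.x} g={first.y} b={first.z}" );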