Alpha channel and Texture2D.GetPixels()

Ok. Here’s the deal:
I’ve got a nice little texture, all with transparent parts and everything.
It displays correctly when set to a transparent shader.

I've got a texture atlas with 16x16 transparent textures. When I put them on models, transparency shows correctly. When I generate models dynamically, all is well. But when I generate models from the alpha channel, the script doesn't read the alpha channel properly: alpha seems to be 1.0f all the time.
Yes, the texture can be read (I did set the read/write flag and the advanced texture type), and what it reads is garbage.
I did some debug output after changing that area of the texture to white.

Color[] cols = tex.GetPixels((int)start.x, (int)start.y, (int)size.x, (int)size.y);
for (int x = 0; x < cols.Length; x++) {
    Debug.Log(cols[x]);
}

That outputs

RGBA(0.710, 0.682, 0.612, 1.000)

I guess this makes it certain that GetPixels doesn't work properly here.
Any ideas why?
No texture format I tried changed anything.
EDIT: It did.

  • 16-bit formats don't work with GetPixels.

I’m kind of lost now.
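A quick sanity check, assuming tex is the Texture2D the script reads, is to log what format the texture was actually imported as before calling GetPixels:

// Log the imported format; a compressed or 16-bit format here would explain odd GetPixels results.
Debug.Log("format: " + tex.format + ", size: " + tex.width + "x" + tex.height);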

Ah yes, 16-bit indeed doesn't work with GetPixels. Oops 8)
Post a bug report (feature request) and drop me the case number; I'll see if I can squeeze it into an upcoming version.
For now, since you're trying to create something from it, maybe you can just convert your textures to 32 bits?
EDIT: actually, 16-bit textures should print an assert in the console. Are you sure it's not 24 bits?
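If re-importing at 32 bits isn't convenient, one possible runtime fallback (just a sketch, not the re-import suggested above) is to blit the atlas into a temporary RenderTexture and read it back as an uncompressed ARGB32 copy before calling GetPixels:

// Sketch: copy a (possibly compressed) texture into an uncompressed ARGB32 texture
// so GetPixels returns plain 8-bit-per-channel values. "source" is assumed to be the atlas.
Texture2D ToUncompressedCopy(Texture2D source)
{
    RenderTexture rt = RenderTexture.GetTemporary(source.width, source.height, 0);
    Graphics.Blit(source, rt);

    RenderTexture previous = RenderTexture.active;
    RenderTexture.active = rt;

    Texture2D copy = new Texture2D(source.width, source.height, TextureFormat.ARGB32, false);
    copy.ReadPixels(new Rect(0, 0, source.width, source.height), 0, 0);
    copy.Apply();

    RenderTexture.active = previous;
    RenderTexture.ReleaseTemporary(rt);
    return copy;
}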

No, nonono… It doesn't work properly.
Changing the texture format breaks the script.
But that's about it. Every other format doesn't work properly either:
still invalid values in the pixels.
EDIT: To clarify:

  1. All my textures are natively 32-bit, 4-channel RGBA (one byte per channel), in either PNG or TGA.
  2. In both cases the resulting Color[] is not valid, although the length seems to be right.
  3. A 16-bit texture makes GetPixels throw an assert.

What's the current texture format, and as what was it exported, and from where? (If PNG and Photoshop, make sure to use SuperPNG; the built-in exporter is known for doing more wrong than right.)
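One way to check that from the Unity side is an editor-only dump of the import settings; the menu name and asset path below are hypothetical, so point the path at the actual atlas:

// Editor-only sketch: print what the importer recorded for the texture.
// The asset path is hypothetical; replace it with the real one.
using UnityEngine;
using UnityEditor;

public static class TextureImportInfo
{
    [MenuItem("Tools/Dump Texture Import Info")]
    static void Dump()
    {
        string path = "Assets/Textures/atlas.png";
        TextureImporter importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null)
        {
            Debug.LogError("No TextureImporter found at " + path);
            return;
        }
        Debug.Log("readable: " + importer.isReadable +
                  ", source has alpha: " + importer.DoesSourceTextureHaveAlpha());
    }
}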

Ok. The whole process from the beginning.

  1. I got the texture from another source, as a .png: 4 channels, RGBA, 32 bits per pixel. No idea what editor it came from.
  2. I put it into the Unity project and make a texture out of it.
  3. I put it into a (transparent) material on the example mesh; it displays correctly (with correct alpha).
  4. I want to dynamically create objects based on the alpha values in the texture, so I make the texture available for reading (advanced texture type, read/write flag, format set to Automatic Compressed).
  5. I get the texture by assigning it to a Texture2D variable in the script.
  6. Inside the script I read the whole texture using the code I posted above.
  7. The models don't seem to detect the alpha channel.
  8. I think maybe the texture is flawed somehow.
  9. I open up GIMP and try editing it and saving as .PNG and .TIF.
  10. The Color[] is still wrong.
  11. I display the color values of the pixels (see above).
  12. I cut out a small part of the texture and use a small part of that.
  13. If the texture is small, whether non-square (21x19) or square (16x16), it displays correctly, with correct values. The colors in the array are correct.
  14. I create a texture with some alpha in it. When it's small, all works fine. When it's bigger (no clear pattern, but say 128x128 and up), the Color[] contains garbage. It happens when the texture is big (I work on a 256x256 texture), and it doesn't matter how small a part you cut out with GetPixels: whether it's 1x1, 16x16 or 128x128, it's still garbage. But when cutting from a smaller texture, everything seems to work (a stripped-down sketch of the calls is below).
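For reference, a stripped-down sketch of the sub-rect read, plus a manual-indexing variant that reads the whole texture once and slices a cell out of the flat array; tex is the readable atlas, and cellX/cellY and the 16-pixel cell size are just placeholders for this example:

// The sub-rect read: grab one 16x16 cell straight out of the atlas.
Color[] cell = tex.GetPixels(cellX * 16, cellY * 16, 16, 16);

// Manual-indexing variant: read the whole texture once and slice the cell out.
// GetPixels() lays pixels out left to right, bottom to top (row after row).
Color[] all = tex.GetPixels();
Color[] manualCell = new Color[16 * 16];
for (int y = 0; y < 16; y++)
{
    for (int x = 0; x < 16; x++)
    {
        manualCell[y * 16 + x] = all[(cellY * 16 + y) * tex.width + cellX * 16 + x];
    }
}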

That sounds really strange. Can you try making the texture uncompressed and recheck?

Tried all texture types that support alpha.
Actually, I made a test project to see if it works. And it does.
Now where did I put that gun…

But seriously: the same call on the same texture giving two totally different results? Weeeeird.

I made a test scene and everything worked flawlessly.

Actually, I've painted the cell I'm reading the alpha values from red… and the values are coming from the bottom of the texture!
Now that’s bizarre. Especially since I used the very same procedure on a smaller image and it worked.
Oh well… will have to flip the coords then.

EDIT: but why would the array contain garbage data if the texture was white… no idea.
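For the coordinate flip mentioned above: GetPixels treats (0, 0) as the bottom-left corner of the texture, so coordinates taken from an image editor (top-left origin) need their y flipped first. A minimal sketch, using the same start/size values as the code earlier in the thread:

// GetPixels counts y from the bottom of the texture; image editors count from the top.
// Convert the top-left-origin y to Unity's bottom-left origin before reading the block.
int flippedY = tex.height - (int)start.y - (int)size.y;
Color[] cols = tex.GetPixels((int)start.x, flippedY, (int)size.x, (int)size.y);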

I can confirm that reading the alpha value gives inaccurate results:
For example, in Photoshop it's 12 and in Unity it's 20.
RGB is correct.

Any help here?
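A side note on comparing against Photoshop: GetPixels returns floats in the 0..1 range, so byte values from the picker don't line up directly; GetPixels32 gives the raw 0..255 bytes, which is easier to compare. A minimal sketch, assuming x and y are already flipped into Unity's bottom-left coordinate space:

// Read the raw byte values for one pixel and compare them with the Photoshop picker.
Color32[] pixels = tex.GetPixels32();
Color32 p = pixels[y * tex.width + x];
Debug.Log("r: " + p.r + " g: " + p.g + " b: " + p.b + " a: " + p.a);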