I’m doing runtime compositing of several layers of smaller textures into one big texture. (It’s for character customization).
I have the whole system running in the editor (made the textures readable, made the main texture uncompressed so SetPixels() works, etc.), but as soon as I deploy the app to my device, I get the following error for every texture I’m trying to read:
UnityException: Texture ‘XXXXYYYYZZZZ’ is not readable, the texture memory can not be accessed from scripts. You can make the texture readable in the Texture Import Settings.
at UnityEngine.Texture2D.GetPixels32 () [0x00000] in <filename unknown>:0
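
For context, the compositing step is basically doing this (a trimmed-down sketch, not my exact code, and LayerCompositor is just a placeholder name):

using UnityEngine;

public static class LayerCompositor
{
    // Alpha-blends one clothing layer over the base texture.
    // Assumes both textures are the same size, readable, and uncompressed.
    public static void BlitLayer(Texture2D target, Texture2D layer)
    {
        Color32[] dst = target.GetPixels32(); // this is the call that throws on device
        Color32[] src = layer.GetPixels32();

        for (int i = 0; i < dst.Length; i++)
        {
            float a = src[i].a / 255f;
            dst[i].r = (byte)(src[i].r * a + dst[i].r * (1f - a));
            dst[i].g = (byte)(src[i].g * a + dst[i].g * (1f - a));
            dst[i].b = (byte)(src[i].b * a + dst[i].b * (1f - a));
        }

        target.SetPixels32(dst);
        target.Apply(); // upload the modified pixels back to the GPU
    }
}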
The settings are fine in the game (and like I said, it works in the editor), but it fails spectacularly on iOS. Attached is a screencap of the import settings for one of my textures.
Has anyone else seen this?
[EDIT: I’m using GetPixels32()]
[EDIT2: Like it says in my call stack, derp me]
Using GetPixels() doesn’t seem to make a difference.
Also, I’m not using compressed textures because I thought that GetPixels and GetPixels32 both only worked with uncompressed textures. I turned on “RGBA PVRTC 4 bits” and I got this error (which wasn’t entirely unexpected):
“Unsupported texture format - needs to be ARGB32, RGBA32, BGRA32, RGB24, Alpha8 or DXT
UnityEngine.Texture2D:GetPixels()”
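
While poking at this I also added a quick guard before the GetPixels32() call, mirroring the formats that error message lists (sketch only; IsGetPixelsFriendly is just a name I made up):

// Returns true for the uncompressed formats the error message says GetPixels() accepts.
// The error also lists DXT, but I skip it since SetPixels() won't work on compressed textures anyway.
bool IsGetPixelsFriendly(Texture2D tex)
{
    switch (tex.format)
    {
        case TextureFormat.ARGB32:
        case TextureFormat.RGBA32:
        case TextureFormat.BGRA32:
        case TextureFormat.RGB24:
        case TextureFormat.Alpha8:
            return true;
        default:
            return false; // compressed formats like PVRTC land here
    }
}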
Ah interesting - I wasn’t aware of the texture format restriction when using PVRTC! Thank you for that info.
Okay… it does sound like a settings problem, even though the screencap you’ve posted shows your import settings are correct. I’m wondering… are you creating another Texture2D in memory and doing reads/gets on that image?
That’s a good point: I’m loading a Texture2D (which is my base naked character) from an asset bundle and doing an alpha-blended blit directly into that texture. Lemme try working on a copy and see if that makes a difference. Thanks again!
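
Here’s roughly the copy approach I’m going to try (just a sketch; MakeWritableCopy is a name I made up):

// Copies the asset-bundle texture into a fresh uncompressed texture
// so the blits never touch the original.
Texture2D MakeWritableCopy(Texture2D source)
{
    var copy = new Texture2D(source.width, source.height, TextureFormat.RGBA32, false);
    copy.SetPixels32(source.GetPixels32()); // the source still has to be readable for this
    copy.Apply();
    return copy;
}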