Download and use PVRTC texture? (iOS)

I’d like to download a texture compressed via PVRTC (in the .pvr container format, or any other format for that matter) and use it at runtime.

From the WWW class documentation: “The data must be an image in JPG or PNG format”.

Is it possible to instantiate a Texture2D or Texture object from file data in-memory (similar to AssetBundle.CreateFromMemory)?

Yes, it is possible to instantiate a Texture2D from a byte array by using Texture2D.LoadImage().
That’s the short answer.

Though don’t get your hopes up just yet, because there’s a HUGE ‘but’:
This is barely of any use. And here’s why: For some reason, LoadImage only accepts byte arrays that contain JPEG and PNG formatted data. It does not accept ANY other data, none whatsoever.
And here comes the fun part: if the texture object you load this byte array into was created with a compressed texture format, Unity will compress your texture while loading. (At least on desktop; I never tried this on a mobile device, though I guess it should work.) Unity decides which compression to use based on the image format: hand it a byte array containing a JPEG and it compresses the texture without alpha; give it a PNG and it makes one with alpha. In case you wonder: no, there’s NO way to make a texture without alpha from a PNG that was encoded in grayscale or RGB color space. It will ALWAYS waste precious texture memory on an alpha channel, whether you need it or not. That’s just how Unity rolls, you know?
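For completeness, here is a minimal sketch of the short answer above. The URL and the initial texture format are placeholders, and it uses the old WWW API from the question (UnityWebRequest works the same way); LoadImage only accepts JPEG/PNG bytes, as discussed:

```csharp
using UnityEngine;
using System.Collections;

public class LoadImageExample : MonoBehaviour
{
    IEnumerator Start()
    {
        // Hypothetical URL — replace with your own.
        WWW www = new WWW("http://example.com/texture.png");
        yield return www;

        // The size here is irrelevant; LoadImage resizes the texture to fit.
        Texture2D tex = new Texture2D(2, 2);
        tex.LoadImage(www.bytes); // JPEG or PNG byte arrays only
        GetComponent<Renderer>().material.mainTexture = tex;
    }
}
```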

And if you think this whole automatic compression thing is a nice, comfortable feature to have: it isn’t. Depending on the machine your application runs on, it can take a while. Just a few tenths of a second per texture, but that adds up to quite a lot of time depending on how many textures you want to load this way.

And beware of bugs: if you think “Oh what the hell, let’s use PNGs anyway, I can afford to waste a few kB of texture memory even though I won’t need an alpha channel!”, you’re out of luck. There’s a nice little bug that fucks up texture compression when you use a PNG file that has no alpha. That is, if you don’t mind having a bunch of textures that are all pink-ish.

Someone at Unity Tech thought: “Hey, let’s introduce a function that enables our beloved customers to load textures directly from a byte array, they’ll LOVE that!” and I’m actually quite grateful for that. But as so often lately, they managed to completely fuck up a great and handy idea. Why not just write a function that gives us a way to write textures directly to memory?! No, they NEEDED to make it fancy with automated texture compression and the like. I just want to write my DXT- and PVRTC-compressed textures to memory, the way every other realtime engine does it. That way I could finally start encoding textures with external texture converters like nVidia’s texture tools for DXT or PVRTexTool for PVRTC. Especially the latter creates textures that are a LOT cleaner (meaning fewer compression artifacts) than the converter Unity uses internally.

Sorry for the ranting, but I guess I had to vent this one for a while now.

This is a copy-paste of my answer to the same kind of question, sorry :)

I have managed to solve this for Android. I think the same method can be applied to iOS, as the core of the code relies on the standard OpenGL ES API.

I’ve uploaded a functional demo of my solution to this problem:

There are two projects:

  1. DynamicCompressedTextureTest - Unity project
  2. CompressedTextureLoader - Eclipse/Android project which contains the sources for “Assets/Plugins/Android/fi.jar”

One thing I’d like to ask of you: when one (or some) of you ports this to iOS (I haven’t yet taken the time), please send me a copy of the code (or post it here)!

The idea:

  • Let Unity allocate and do the initial uploading of the compressed texture, the “TexturePlaceholder”
  • When new textures are needed, the placeholder is duplicated (GameObject.Instantiate()) and thus Unity allocates and manages the new copy of the compressed placeholder texture
  • Then, the Android code downloads new compressed texture data in the background and, once it’s done, simply overwrites the Unity-managed texture by uploading the downloaded data to the Unity-assigned texture ID.
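The Unity side of the steps above could look roughly like this. This is a sketch only: the plugin class name `fi.TextureLoader` and its `uploadCompressedTexture` method are hypothetical stand-ins for whatever the actual fi.jar exposes, and the Java side is assumed to do the download and the `glCompressedTexSubImage2D` upload:

```csharp
using UnityEngine;

public class PlaceholderSwap : MonoBehaviour
{
    // Object whose material holds the compressed placeholder texture.
    public GameObject placeholder;

    void SpawnWithNewTexture(string url)
    {
        // Duplicate the placeholder, so Unity allocates and manages
        // a new copy of the compressed texture.
        GameObject copy = Instantiate(placeholder);
        Texture tex = copy.GetComponent<Renderer>().material.mainTexture;

        // Hand Unity's GL texture ID to the Android plugin, which downloads
        // the compressed data in the background and overwrites the texture.
        int glTextureId = (int)tex.GetNativeTexturePtr();
        using (var loader = new AndroidJavaObject("fi.TextureLoader")) // hypothetical class
        {
            loader.Call("uploadCompressedTexture", glTextureId, url);
        }
    }
}
```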

At the time of writing I didn’t have a PowerVR-equipped Android device at hand, but I made the same approach work with PVRTC4 some time ago. It’s just a matter of making sure that …

  1. the TexturePlaceholder is compressed with the same method as the textures being downloaded
  2. the Android code’s texture format identifier is set correctly
  3. the expected data lengths (bytes per pixel) are set correctly

All in all, I think the code is fairly straightforward and easy to read (not that elegant, though), and it shows the common pitfalls to avoid.

I hope this helped, please let me know!

Sorry for bringing this long-dead question back up to a zombie state. But since I found it through searching, someone else could end up here too.

Well, after searching here and there, I found, to my big surprise, two methods hiding right under my nose inside the Texture2D class: Texture2D.LoadRawTextureData and Texture2D.GetRawTextureData (the former writes raw data into the texture, despite what the names might suggest; the latter reads it back).
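A minimal sketch of that approach, assuming the byte array already contains just the raw PVRTC4 pixel payload (with any .pvr container header stripped) for a known size — the 512×512 dimensions and format here are illustrative:

```csharp
using UnityEngine;

public class RawTextureExample : MonoBehaviour
{
    public Texture2D CreateFromRaw(byte[] rawData)
    {
        // Width, height, and format must match the raw data exactly,
        // or LoadRawTextureData will complain about the byte count.
        var tex = new Texture2D(512, 512, TextureFormat.PVRTC_RGB4, false);
        tex.LoadRawTextureData(rawData);
        tex.Apply(false, true); // upload to the GPU, release the CPU copy
        return tex;
    }
}
```

No JPEG/PNG decoding and no on-the-fly compression involved — the precompressed bytes go straight to the texture.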

Seems to me this is the right way to do it.