Hi,
I’m doing some preprocessing on some input images. My dataset is a set of JPEG files (each of them weighs ~450KB).
What I do is take those images and write a value into the alpha channel. To accomplish that, I do the following:
- create a RenderTexture (ARGB32 format, same width and height as the input images)
- render a full-screen quad that writes the desired value into the alpha channel of the render texture
- instantiate a Texture2D (RGBA32 format, same width and height as the input images)
- read the current render texture contents into the Texture2D using ReadPixels and Apply
- generate PNG data (bytes) by calling EncodeToPNG
- write those bytes to a .png file with System.IO.File.WriteAllBytes
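For reference, this is roughly what my code looks like. It’s a condensed sketch, not the exact code: `alphaWriteMaterial` is a placeholder name for my material whose shader writes the alpha value, and I’ve used `Graphics.Blit` to stand in for the full-quad render.

```csharp
using System.IO;
using UnityEngine;

public class AlphaBaker : MonoBehaviour
{
    // Placeholder: material whose shader outputs the desired alpha value.
    public Material alphaWriteMaterial;

    public void Process(Texture2D input, string outputPath)
    {
        int w = input.width, h = input.height;

        // Render texture with an alpha channel, same size as the input.
        var rt = RenderTexture.GetTemporary(w, h, 0, RenderTextureFormat.ARGB32);

        // Full-screen pass that writes the alpha value.
        Graphics.Blit(input, rt, alphaWriteMaterial);

        // CPU-side readback target.
        var tex = new Texture2D(w, h, TextureFormat.RGBA32, false);

        var prev = RenderTexture.active;
        RenderTexture.active = rt;
        tex.ReadPixels(new Rect(0, 0, w, h), 0, 0);
        tex.Apply();
        RenderTexture.active = prev;

        // Encode to PNG and write to disk.
        File.WriteAllBytes(outputPath, tex.EncodeToPNG());

        RenderTexture.ReleaseTemporary(rt);
        Destroy(tex);
    }
}
```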
The new processed textures work fine, but each of them weighs ~4.5MB.
That’s simply too much for datasets with more than a thousand images.
I have tried different formats for the RenderTexture and the Texture2D (e.g. ARGBHalf for the RenderTexture and RGBAHalf for the Texture2D), but the file size is always the same.
I’ve also noticed that EncodeToPNG doesn’t accept any parameter to set a compression level.
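To illustrate what I mean: as far as I can tell from Unity’s ImageConversion API, only the JPG encoder exposes a quality setting.

```csharp
// PNG encoding takes no quality/compression argument...
byte[] png = tex.EncodeToPNG();

// ...while JPG encoding accepts a quality value in the 1-100 range.
byte[] jpg = tex.EncodeToJPG(75);
```

But JPEG has no alpha channel, so that doesn’t obviously help in my case.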
Is there any way to make these output files smaller?