The Unity editor attempts to match the target platform and quality settings it's currently set to. For instance, when the build target is iOS with a quality setting that halves all texture sizes, the editor halves them too, to give you a proper preview of how things might look.
However, this creates a lot of problems for editor tools, because handling all of these cases is non-trivial and non-obvious.
For instance, the terrain's heightmap render texture can be in different formats depending on the platform, so you might use its descriptor to construct a matching one. However, when texture LOD is set to half, the descriptor returns a half-sized texture, and setting that back on the terrain gives the wrong size. Easy enough to fix (see the sketch below), but you don't discover this stuff until someone runs with half-sized textures, so it tends to create lots of subtle bugs you don't notice until later. MaxLOD is also a problem because it completely destroys LUTs.
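To make the fix concrete, here's a minimal sketch of what I mean, assuming the descriptor can come back half-sized as described: copy the descriptor so the platform-specific format is kept, but override the dimensions with the terrain's actual heightmap resolution. The class and method names here are just illustrative.

```csharp
using UnityEngine;

public static class TerrainHeightmapUtil
{
    // Builds a RenderTexture that matches the heightmap's format but uses the
    // terrain's true heightmap resolution, in case the descriptor reflects the
    // halved quality setting. (Illustrative sketch, not an official API.)
    public static RenderTexture CreateFullSizeHeightmapCopy(Terrain terrain)
    {
        RenderTexture source = terrain.terrainData.heightmapTexture;

        // Keep the platform-specific format from the descriptor...
        RenderTextureDescriptor desc = source.descriptor;

        // ...but force the dimensions back to the real heightmap resolution,
        // since the descriptor may be half-sized under the quality setting.
        int resolution = terrain.terrainData.heightmapResolution;
        desc.width = resolution;
        desc.height = resolution;

        var copy = new RenderTexture(desc);
        copy.Create();
        Graphics.Blit(source, copy);
        return copy;
    }
}
```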
If I'm set to, say, iOS, then my textures are compressed in the format native to that platform. Again, great for preview, bad for editor tools. When processing texture data in the editor, I often want to access the highest-quality data I can. My general pattern for this (a sketch follows the list below):
- Switch the data to uncompressed
- Change its size to max resolution
- Graphics.Blit the data into a buffer so I don’t need to set read/write
- Switch the data back to compressed and recompress it (slow).
And this still has the maxLOD issue, by the way. It also slows tools down massively, since the recompression step is slow.
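For what it's worth, here's roughly what that pattern looks like in code. This is a sketch under assumptions, not production code: the class/method names are made up, the 8192 max size and ARGBFloat buffer format are arbitrary choices, and it does nothing about the maxLOD problem.

```csharp
using UnityEditor;
using UnityEngine;

public static class SourceTextureAccess
{
    // Reimports the asset as uncompressed at full size, blits it into a
    // RenderTexture (so Read/Write doesn't need to be enabled), then restores
    // the original import settings, which triggers a slow recompression.
    public static RenderTexture GetHighQualityCopy(Texture2D texture)
    {
        string path = AssetDatabase.GetAssetPath(texture);
        var importer = (TextureImporter)AssetImporter.GetAtPath(path);

        // Remember the original settings so they can be restored afterwards.
        TextureImporterCompression originalCompression = importer.textureCompression;
        int originalMaxSize = importer.maxTextureSize;

        // 1. Switch the asset to uncompressed at max resolution.
        importer.textureCompression = TextureImporterCompression.Uncompressed;
        importer.maxTextureSize = 8192;
        importer.SaveAndReimport();

        // Re-fetch the asset so we read the freshly imported, full-quality data.
        texture = AssetDatabase.LoadAssetAtPath<Texture2D>(path);

        // 2. Blit into a RenderTexture buffer.
        var buffer = new RenderTexture(texture.width, texture.height, 0,
                                       RenderTextureFormat.ARGBFloat);
        Graphics.Blit(texture, buffer);

        // 3. Restore the original settings and recompress (slow).
        importer.textureCompression = originalCompression;
        importer.maxTextureSize = originalMaxSize;
        importer.SaveAndReimport();

        return buffer;
    }
}
```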
Anyway, what I would like to know is whether there is any consideration for editor functions that let us get this kind of data without the emulation interfering. Something like:
UnityEditor.Rendering.LoadSourceTextureData("");
that could be used to get texture data unaffected by compression, system settings, platform settings, etc.