Hey. I’m attempting to write a pixel-based font renderer, so I can pass in font textures and draw them to the screen. I’m currently trying to calculate kerning values automatically, based on the leftmost and rightmost pixels of each character (testing whether the alpha channel is above 0.5). However, that code path never triggers, because GetPixel seems to return (1, 1, 1, 0) for every pixel, even though the texture contains no such values. I’m at a loss here: the texture draws correctly, and everything works except the GetPixel call. I even tested it by manually locating a pixel that I know is neither white nor transparent, then calling GetPixel on it (GetPixel(98, 20)), and it still returns (1, 1, 1, 0). Does anyone know what might be happening here?
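For reference, the bounds scan I’m describing looks roughly like this (simplified sketch; the cell layout names like cellX, cellY, and glyphWidth are placeholders for however your atlas is arranged):

```csharp
// Returns the leftmost column in a glyph cell that contains a visible
// pixel, using the alpha > 0.5 test described above. Returns -1 if the
// cell is fully transparent.
int FindLeftmostOpaqueColumn(Texture2D tex, int cellX, int cellY,
                             int glyphWidth, int glyphHeight)
{
    const float alphaCutoff = 0.5f;
    for (int x = 0; x < glyphWidth; x++)
        for (int y = 0; y < glyphHeight; y++)
            if (tex.GetPixel(cellX + x, cellY + y).a > alphaCutoff)
                return x;   // first column with a visible pixel
    return -1;
}
```

A mirror-image loop (scanning x from glyphWidth - 1 down to 0) finds the rightmost column for the other side of the kerning pair.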
Ah, I’ve discovered the issue. It turns out I’m still not used to Unity’s texture coordinate convention. Most systems put their origin at the top-left corner, but Unity’s textures start at the bottom left, which is rather awkward. My code was reading the bottom half of the texture (which is blank) and never reaching the top half (which contains the actual font).
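In case anyone else hits this: if your glyph coordinates assume a top-left origin, the fix is just to flip the y coordinate before querying the texture, since Unity’s (0, 0) is the bottom-left pixel:

```csharp
// Convert a top-left-origin row index to Unity's bottom-left origin
// before sampling. topLeftY is the row as measured from the top.
int unityY = tex.height - 1 - topLeftY;
Color c = tex.GetPixel(x, unityY);
```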
I’m not in a position to test it myself at the moment, but my theory is that GetPixel is returning a default color because the texture you’re querying is not Read/Write enabled in its import settings. Make sure it is, then try again.
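You can also check this from code at runtime via the Texture2D.isReadable property, rather than digging through the inspector (depending on your Unity version, an unreadable texture either returns default colors or throws when you call GetPixel):

```csharp
// Warn early if the atlas was imported without Read/Write enabled,
// since GetPixel won't return real pixel data in that case.
if (!tex.isReadable)
    Debug.LogError(tex.name + " is not Read/Write enabled; " +
                   "GetPixel will not return real pixel data.");
```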
Update: Okay, that wasn’t it. Let’s try something else:
Change the Texture Type in import settings to Advanced. Then:
Make sure no resizing to a power of 2 is going on by setting Non Power of 2 to “None”.
Make sure the Max Size is something large enough to actually hold the texture.
Make sure the Format is not something that compresses the texture on import; e.g. RGBA 32 bit is a good choice.
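If it helps, the steps above can also be applied from an editor script via the (older, “Advanced”-era) TextureImporter API. This is just a sketch assuming that API version; the asset path and max size are placeholders:

```csharp
using UnityEngine;
using UnityEditor;

public static class FontTextureImportFix
{
    // Editor-only helper applying the import settings listed above.
    public static void Apply(string assetPath)
    {
        var importer = (TextureImporter)AssetImporter.GetAtPath(assetPath);
        importer.textureType = TextureImporterType.Advanced;
        importer.npotScale = TextureImporterNPOTScale.None;    // no power-of-2 resize
        importer.maxTextureSize = 4096;                        // large enough for the atlas
        importer.textureFormat = TextureImporterFormat.RGBA32; // uncompressed
        importer.isReadable = true;                            // required for GetPixel
        AssetDatabase.ImportAsset(assetPath);                  // reimport with new settings
    }
}
```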