When point filtering is set, does the video card still sometimes return a blended average of two pixels in a texture (perhaps when the UV coordinate falls right "between" two pixels)? I've written a shader that manually handles mipmapping, etc., and uses point filtering to read the raw pixel data without automatic smoothing, but there is still clear blending. For example, if I add a test line of pixels to part of my texture (in this case, a bright green vertical line), the shader displays a second line next to it which is a blended average of the green line and the other pixels next to it. It doesn't seem to be caused by my code, so I'm wondering whether the video card is doing the blending even with point filtering enabled.
No one has any idea what would cause this? I’ve pretty much ruled out my own code. The apparent blending only occurs in the x (u) direction.
Have you turned off mipmaps? Is the size of the texture larger than the maximum value defined in the Inspector? Is texture filtering forced on in your quality settings (rather than being set per texture)? If you turn off mipmaps, set the filter to point, and apply the texture to one of the built-in shaders, does it still blend?
Also, pictures and code are helpful for us to help you.
Thank you for the suggestions; I’m looking into them now.
I remember reading someone (in another thread) saying that if you handle mipmapping manually in your own shader, you need to “pad” each section within the tile atlas to prevent color blending (or “bleeding”) across mipmap levels… does this imply that blending is always the norm even with point filtering, or am I misinterpreting something? Padding the sections would eliminate the biggest part of my problem, but not other potential problems later on such as reading a series of precise data values from a texture.
Hmmm, I'm not sure how much you know about mipmaps. The mipmap levels are basically smaller texture images with averaged colors, i.e. if you use any of the mipmap levels apart from level 0 you will always get averaged colors, even with point sampling. (As far as I remember you can look at the mipmap levels generated by Unity in the Inspector by dragging the slider next to the texture preview.)
As I explained in a previous note above, I’ve disabled the usual mipmapping process so that my shader can do it on its own. My program uses a large texture as an “atlas” of tiles, each with a mipmap cluster next to the tile. Yes, mipmap levels 1-7 are deliberately averaged by my program of course, but the problem is that the entire atlas texture shows signs of averaging although my program doesn’t average the whole thing (only the mipmap levels). In other words, the full-size tiles (mipmap level 0) are averaged, too, and so are the ‘seams’ between different mipmap levels. This means that at certain distances, everything is tinted in the wrong color because of color blending or “bleeding” across mipmap levels. I realize that I can prevent the latter problem by leaving a gap in between the levels, but this doesn’t get rid of the averaging itself - which would become a big problem if I need to use part of that texture to pass precise data to the shader.
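For what it's worth, the kind of padding fix I have in mind would look roughly like this in the fragment code (just a rough sketch; the tile-layout names are made up for illustration and are not from my actual shader):

    // Hypothetical helper: keep the sample UV at least half a texel away from
    // the edges of the current tile/mip region, so a point-sampled lookup near
    // a seam can never land on a texel belonging to the neighbouring region.
    float2 ClampToTile(float2 uv, float2 tileOriginUV, float2 tileSizeUV, float2 atlasTexelSize)
    {
        float2 halfTexel = 0.5 * atlasTexelSize;
        return clamp(uv, tileOriginUV + halfTexel, tileOriginUV + tileSizeUV - halfTexel);
    }

That would deal with the seams, but not with the averaging I'm seeing inside a single level.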
There are two kinds of point sampling: within each mipmap level and across mipmap levels. You should be able to avoid interpolation between mipmap levels by specifying only certain mipmap levels. (I don't remember whether it is 0, 1, 2, 3, … or 0.5, 1.5, 2.5, 3.5, …) If you specify numbers in between these values, I would expect that you see interpolation across mipmap levels. (In OpenGL there is actually an interpolation mode, GL_NEAREST_MIPMAP_NEAREST, that avoids this interpolation, but I don't know whether Unity supports it.)
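In Cg, sampling an explicit mip level would look roughly like this (an untested sketch; _TexAtlas and the LOD value are just illustrative):

    // The .w component of the float4 coordinate selects the mip level.
    // Use whichever LOD convention (whole numbers or .5 offsets) turns out
    // to be the right one; the point is to stay exactly on one level.
    float lod = 2.0;
    fixed4 texel = tex2Dlod(_TexAtlas, float4(IN.uv_TexAtlas, 0.0, lod));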
EDIT: Actually, I don’t know how anisotropic filtering affects all this; but I would have assumed that point sampling within each mipmap level avoids all anisotropic filtering.
I had a chance to try it out myself and it looks like Unity's "point" interpolation avoids interpolation across mipmap levels. (At least for texture2DLod in a GLSL shader; I wasn't able to get tex2Dlod to work.) I made a small test and didn't see any interpolation in any direction for a LOD of 0.49. The only things I can still think of are the texture compression applied by Unity, or your graphics driver settings. (Some graphics drivers allow you to do crazy things like enforcing texture compression or enforcing mipmap interpolation, etc.)
In any case, as Farfarer pointed out: code would be helpful.
I’ve disabled the normal mipmapping, so that can’t be the problem.
And the blending occurs even when I replace all the custom shader code with just the following line to get a pixel using the raw UV values:
ThisPixel = tex2D(_TexAtlas,IN.uv_TexAtlas);
This line should just fetch an unmodified pixel from the texture and, since point filtering is enabled, display each pixel as a square block of color when the viewer is at close range. Yet there is still evidence of blending: some of the blocks have a color that is a mixture of two blocks, without losing the blockiness of the pixels. In other words, point filtering is clearly working since there's no smoothing, but some of the colors are clearly a blended average rather than what I actually put in the texture.
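For completeness, the stripped-down test is essentially just the standard surface shader template around that line (a minimal sketch; nothing custom is left in it):

    Shader "Custom/AtlasPointTest" {
        Properties {
            _TexAtlas ("Tile Atlas", 2D) = "white" {}
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert

            sampler2D _TexAtlas;

            struct Input {
                float2 uv_TexAtlas;
            };

            void surf (Input IN, inout SurfaceOutput o) {
                // Raw lookup at the interpolated UV; with point filtering this
                // should return exactly one unfiltered texel from the atlas.
                fixed4 ThisPixel = tex2D(_TexAtlas, IN.uv_TexAtlas);
                o.Albedo = ThisPixel.rgb;
                o.Alpha = ThisPixel.a;
            }
            ENDCG
        }
        FallBack "Diffuse"
    }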
As you suggested, it seems likely that texture compression is the cause. What can I do to fix that?
I guess you can avoid Unity's texture compression by overriding the format (e.g., picking an uncompressed format such as Truecolor/RGBA 32 bit) and the maximum size in the texture import settings.
(With respect to disabling normal mipmapping: some graphics drivers allow you to override what applications ask for and always enforce mipmapping. Thus, you should check the settings in your graphics driver even if you disable mipmapping in Unity.)
Sorry I’m so late to this thread, but what is the size of your texture? My guess is it’s non-power-of-two in the dimension you’re seeing blending. Unity resamples NPOT textures when they are used in non-GUI contexts, and the result is what you describe.
Yes - it's 1536 x 1024, and the blending is only in the x (u) direction, so that must be the cause of the problem. Do I just need to make sure the texture dimensions are powers of two, or is there a way to disable resampling? Does the texture need to be square (equal dimensions in both directions)?
You need to make your texture dimensions powers of two. Textures only need to be square if they are to be compressed on iOS.
Changing the texture dimensions worked. Many thanks - I was stuck for a while because of this problem.