I’ve looked up and down for an answer to this, but all I can find are answers telling me to turn off mipmapping and set the filter mode to point, which I’ve already done, and yet the problem persists.
I’m making a procedurally generated mesh, and along the seam between every two triangles there is a thin line in the color of a neighboring texture from my texture atlas. The problem also gets worse the farther away the camera is.
I’ve also seen suggestions to add extra space around each individual texture, but I’d rather avoid doing that if I can.
Try insetting your UV coordinates by half a texel, i.e. 0.5 / textureSize in UV units, so that sampling occurs at the middle of a texel rather than at the border between two texels. I’ve drawn a diagram to try to show what I mean:
UV coordinates like those of the red outline in the upper left (.1,.6 to .4,.9) will bleed due to rounding errors in sampling. Using UV coordinates like those of the red outline on the right (.65,.45 to .85,.65) will mitigate the bleeding. The downside is that when the quad is viewed up very close, the grey border pixels may be noticeably cut in half. If that is unacceptable, the only solution is to go back to defining the UV coordinates along texel edges and duplicating the border pixels, as with the red outline in the lower left (.1,.1 to .4,.4).
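The half-texel inset described above can be sketched as a small helper. This is illustrative Python rather than Unity C#; the function name and the assumption of a square atlas are mine:

```python
def inset_uv_rect(u_min, v_min, u_max, v_max, texture_size, texels=0.5):
    """Shrink a UV rectangle inward by `texels` texels on each side.

    texture_size: atlas width/height in pixels (assumed square here).
    With texels=0.5 the sample points move to texel centers, so the
    filter never reads from a neighboring atlas tile.
    """
    inset = texels / texture_size  # inset expressed in 0..1 UV units
    return (u_min + inset, v_min + inset, u_max - inset, v_max - inset)

# Example: the upper-left rectangle from the diagram on a 256 px atlas
print(inset_uv_rect(0.1, 0.6, 0.4, 0.9, 256))
```

The same idea applies whatever mesh-building code you use: compute each tile’s nominal UV rectangle first, then pull every corner inward by the inset before writing it into the mesh’s UV array.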
To add to zach.r.d’s answer: you don’t have to inset by 0.5 times the pixel width/height. I tried that with a 16x16-pixel tilemap game and it looked horrible. What I did instead of 0.5 was use 0.01. It also eliminates the bleeding, but the crop is pretty much invisible to the eye.
For reference, I’m using a custom unlit shader.
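The smaller-inset variant from this answer can be sketched the same way. Again a Python illustration, with a hypothetical helper that computes the UV rectangle for one tile in a square atlas; the 0.01-texel inset is the value suggested above:

```python
def tile_uvs(col, row, tiles_per_side, texture_size, inset_texels=0.01):
    """UV rectangle for tile (col, row) in a square atlas, inset slightly.

    tiles_per_side: number of tiles along one edge of the atlas.
    texture_size:   atlas width/height in pixels.
    A tiny inset (0.01 texels) still prevents bleeding with point
    filtering, without visibly cropping the tile like 0.5 texels can.
    """
    tile_uv = 1.0 / tiles_per_side       # UV width of one tile
    inset = inset_texels / texture_size  # tiny inward offset in UV units
    u0 = col * tile_uv + inset
    v0 = row * tile_uv + inset
    u1 = (col + 1) * tile_uv - inset
    v1 = (row + 1) * tile_uv - inset
    return (u0, v0, u1, v1)

# Example: tile (2, 3) in a 16x16-tile atlas that is 256 px across
print(tile_uvs(2, 3, tiles_per_side=16, texture_size=256))
```

Note the trade-off: a half-texel inset guarantees samples land on texel centers at any scale, while a near-zero inset like this relies on point filtering and merely nudges the coordinates off the exact tile boundary.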