Why do some of the pixels on the plane disappear (or lose detail) as the distance from the camera increases?

133618-n1.png

As shown in the picture above, my scene is nothing more than a straight, plain road mesh. I just gave my long road mesh (not square) a texture, with a perspective camera setup. As you can see, the pixels of the lane line in the middle of the road mesh are gone in the distance (Figure 1), while Figure 2 is the correct result I want. What's wrong with the texture sampling, and how can I solve it? Any hint would be appreciated, thank you in advance.

EDIT:
I found out that if I set the screen resolution to a different one, the artifacts are alleviated. Why is that?

You should read about mipmapping and anisotropic filtering. Textures like yours (high contrast, sharp edges) generally suffer heavily from "minification" during texture filtering, because many texels have to be squeezed into too small a screen area, and the sampler can skip right over thin features.
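The effect is easy to reproduce outside any engine. Below is a minimal, illustrative sketch (pure Python, hypothetical function names): a 1-D "texture" holds a narrow bright lane line on a dark road, and we shrink it down so each screen pixel covers many texels. Nearest-neighbour sampling can miss the line entirely, while a mipmap-style box average keeps a dimmed trace of it.

```python
def make_texture(width, line_start, line_width):
    """Dark road (0.0) with a narrow bright lane line (1.0)."""
    tex = [0.0] * width
    for i in range(line_start, line_start + line_width):
        tex[i] = 1.0
    return tex

def point_sample(tex, n_pixels):
    """Nearest-neighbour minification: one texel per screen pixel."""
    w = len(tex)
    return [tex[p * w // n_pixels] for p in range(n_pixels)]

def box_filter_sample(tex, n_pixels):
    """Average the whole texel footprint of each pixel
    (roughly what a prefiltered mipmap level gives you)."""
    w = len(tex)
    out = []
    for p in range(n_pixels):
        lo = p * w // n_pixels
        hi = (p + 1) * w // n_pixels
        out.append(sum(tex[lo:hi]) / (hi - lo))
    return out

# 256-texel texture with a 4-texel line, shrunk to 16 screen pixels:
# each pixel covers 16 texels, so point sampling steps over the line.
tex = make_texture(256, line_start=120, line_width=4)
nearest = point_sample(tex, 16)     # samples texels 0, 16, 32, ...
filtered = box_filter_sample(tex, 16)

print(max(nearest))   # 0.0  -> the lane line vanished
print(max(filtered))  # 0.25 -> the line survives, dimmed
```

This is exactly why the line in your screenshot fades in and out with distance and screen resolution: the point-sample grid either happens to hit the line's texels or it doesn't. Mipmapping (plus anisotropic filtering, which handles the stretched footprints of a road viewed at a grazing angle) replaces the hit-or-miss sampling with a prefiltered average.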

So check the anisotropic filtering setting on your texture, as well as your quality settings (make sure mipmaps are actually being generated for the texture).