Thin triangles GPU performance issue

Are thin triangles a performance killer in Unity? Does it only become an issue once they are subpixel in width?

I’ve heard they cause issues for rasterisers (tiled deferred, standard).

Has anyone come across any problems with this?

Thin and small triangles can be a performance issue on any GPU due to redundant overshading of edge pixels: hardware shades fragments in 2×2 quads, so a triangle that only partly covers a quad still pays for all four shader invocations. It’s not really something Unity can solve for you. The actual impact depends on a few factors: the complexity of your shaders, the hardware you’re running on, and how many triangles you’re pushing. There’s some good info here. Do you have a particular use case where you are seeing issues?
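To make the overshading concrete, here is a rough sketch (plain Python, not Unity code, and a simplified model of real hardware) that rasterises a triangle on a pixel grid, then counts how many fragment-shader invocations the 2×2-quad rule would cost compared to the pixels actually covered. The vertex positions and grid size are made up for illustration.

```python
# Estimate quad overshading: GPUs shade fragments in 2x2 quads, so every
# quad a triangle touches costs 4 invocations even if only 1 pixel is covered.

def edge(ax, ay, bx, by, px, py):
    # Signed-area test: sign tells which side of edge a->b the point p is on.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def coverage(v0, v1, v2, width, height):
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5  # sample at the pixel centre
            w0 = edge(*v1, *v2, px, py)
            w1 = edge(*v2, *v0, px, py)
            w2 = edge(*v0, *v1, px, py)
            # Accept either winding order.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    quads = {(x // 2, y // 2) for (x, y) in covered}
    shaded = 4 * len(quads)  # the hardware shades whole 2x2 quads
    return len(covered), shaded

# A long, ~1-pixel-thin diagonal sliver vs. a compact triangle:
thin = coverage((0, 0), (63, 62), (1, 2), 64, 64)
fat = coverage((0, 0), (20, 0), (0, 20), 64, 64)
```

The thin sliver touches a fresh quad for almost every pixel it covers, so its shaded-to-covered ratio is far worse than the compact triangle’s, which is the redundancy described above.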

-sam

It’s true, though I doubt it makes a noticeable impact on performance unless you’re targeting GPU-limited devices, and even then we’re probably talking about a frame or two, but that’s just a guess. The HoloLens guidelines, for instance, recommend avoiding such thin triangles. So if you can avoid them, do it. But I doubt that getting rid of thin triangles by adding more geometry is the right solution.

Thanks for the advice. A lot of our models have bevels on them, so I was a bit worried about it. I’m assuming this is an issue for both opaque and transparent objects?

Yeah, I think it could be worse for transparent triangles, because the hardware won’t be able to reject occluded fragments with the early-z test since they blend with the framebuffer. With thin opaque triangles it’s possible that some fragments do end up getting rejected.
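A toy model of that difference (an assumed simplification, not actual GPU behaviour): two full-screen layers are drawn near-to-far; opaque fragments behind the depth buffer are rejected before shading, while blended fragments must always run the shader because they read the framebuffer.

```python
# Count fragment-shader invocations for two full-screen layers,
# with and without blending. Depth values: smaller = nearer.

def shade_cost(layers, blended, width=4, height=4):
    depth = [[float("inf")] * width for _ in range(height)]
    invocations = 0
    for layer_depth in layers:  # draw order
        for y in range(height):
            for x in range(width):
                if not blended and layer_depth >= depth[y][x]:
                    continue  # early-z reject: occluded opaque fragment
                invocations += 1  # the fragment shader runs
                if not blended:
                    depth[y][x] = layer_depth  # only opaque writes depth
    return invocations

opaque = shade_cost([1.0, 2.0], blended=False)  # near layer drawn first
transparent = shade_cost([1.0, 2.0], blended=True)
```

With the near layer first, the occluded opaque layer costs nothing, while the blended version pays for every fragment of both layers.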

-sam

It’s not just about thin triangles, but density. Good LOD practice largely eliminates such issues.
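The idea can be sketched as distance-based LOD selection: pick a coarser mesh once the object’s projected screen size drops, so average triangle size stays well above a pixel. The thresholds and FOV below are illustrative assumptions, not Unity’s LODGroup settings.

```python
import math

def screen_height_fraction(object_height, distance, fov_deg=60.0):
    # Fraction of the viewport height the object's bounds cover,
    # for a symmetric perspective frustum with the given vertical FOV.
    frustum_height = 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0)
    return object_height / frustum_height

def pick_lod(fraction, thresholds=(0.5, 0.25, 0.1)):
    # thresholds: screen-height fractions at which each LOD is still used.
    for lod, cutoff in enumerate(thresholds):
        if fraction >= cutoff:
            return lod
    return len(thresholds)  # lowest detail (or culled)

# A 2 m tall prop at increasing distances picks progressively coarser LODs:
lods = [pick_lod(screen_height_fraction(2.0, d)) for d in (2.0, 10.0, 40.0)]
```

Because the dense mesh is only drawn when its triangles are large on screen, the sub-pixel-triangle case never arises in practice.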

Sure, the real issue is overdraw/overshading.