# Any way to achieve a low poly effect without duplicating vertices?

So, as I understand it, lighting and coloring are applied to triangles based on the normal vectors of the vertices. These vertex normals are interpolated across the triangle during rasterization, so the fragment shader can smoothly apply lighting and coloring over the entire face. If you want a low poly effect (basically an effect where you don't have this interpolation), you can give each triangle its own three vertices that all share the same normal vector, so lighting and coloring are applied uniformly across that triangle. The problem is that to render the same mesh you can need up to six times as much vertex data (a vertex in a typical smooth mesh is shared by around six triangles), which is obviously not ideal. My question is: is there a way to make the fragment shader not interpolate between normals, and instead just use one normal for shading the entire triangle?

You could use the ddx and ddy functions in the fragment shader to get two tangent vectors lying in the surface plane. You could then take their cross product to find the normal vector of that flat surface and use it for flat shading. The problem with this is that it seems wasteful, as you would be recalculating the normal for every fragment. So, ideally, I wouldn't want to do this solution either.
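As a concrete illustration, the derivative approach described above might look roughly like this in a Unity fragment shader. This is an untested sketch: the `v2f` struct, its `worldPos` field, and the use of `_WorldSpaceLightPos0`/`_LightColor0` for a simple directional light are assumptions based on common Unity conventions, and the cross-product argument order may need flipping depending on platform and winding order.

```hlsl
struct v2f
{
    float4 pos : SV_POSITION;
    float3 worldPos : TEXCOORD0; // world-space position from the vertex shader
};

float4 frag (v2f i) : SV_Target
{
    // Screen-space derivatives of the world position give two vectors
    // that lie in the triangle's plane.
    float3 dpdx = ddx(i.worldPos);
    float3 dpdy = ddy(i.worldPos);

    // Their cross product is the face normal, constant across the triangle.
    // Swap the arguments if the normal comes out facing the wrong way.
    float3 flatNormal = normalize(cross(dpdy, dpdx));

    // Simple Lambert term as a stand-in for your real lighting.
    float ndotl = saturate(dot(flatNormal, _WorldSpaceLightPos0.xyz));
    return ndotl * _LightColor0;
}
```

In practice this is just two derivative instructions, a cross product, and a normalize per fragment, which is why the answer below suggests measuring before ruling it out.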

One thing to keep in mind is that shader performance is only as good or as bad as you measure it to be - recalculating the normals every fragment via derivatives is probably perfectly fine in most cases, so I would only rule it out if you know it has a visible impact on your game’s performance.

That aside, there are a couple of other ways you could do this in a shader:

1. Geometry shaders - this one is the most direct (i.e. literally computing one surface normal for every face), however it is also probably the least recommended. To slightly contradict my advice in the beginning, geometry shaders can have a significant performance impact and aren't supported on all platforms (Metal, for example, has no geometry shader stage). That, plus the lack of support outside hand-written vert/frag shaders, can make them a bit of a pain to work with in Unity.

2. nointerpolation - you can add the nointerpolation modifier to the normal field of your vertex-to-fragment struct, and the rasterizer will pass one vertex's value through unchanged instead of interpolating. This is probably the way I would recommend; however, it won't work with surface shaders, because the vertex data struct is regenerated when the shader is compiled.
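To make option 2 concrete, here is a minimal hand-written vert/frag sketch (untested; the struct layout and helper functions follow standard Unity built-in shader conventions, but names are my own):

```hlsl
struct v2f
{
    float4 pos : SV_POSITION;
    // nointerpolation makes the rasterizer pass the provoking vertex's
    // normal to every fragment of the triangle instead of interpolating.
    nointerpolation float3 normal : TEXCOORD0;
};

v2f vert (appdata_base v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    o.normal = UnityObjectToWorldNormal(v.normal);
    return o;
}

float4 frag (v2f i) : SV_Target
{
    // i.normal is constant across the triangle, giving a faceted look.
    float ndotl = saturate(dot(normalize(i.normal), _WorldSpaceLightPos0.xyz));
    return ndotl * _LightColor0;
}
```

One caveat worth knowing: with shared vertices, the value you get is the provoking vertex's *smoothed* normal, not the true face normal, so the facets won't be lit exactly as they would be with duplicated vertices or the derivative method.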

As much as it might not be ideal, duplicating the vertices is probably the most straightforward and best supported method - as long as your meshes aren't super high poly, I don't necessarily see this being a bad option, especially since Unity lets you do it directly in the mesh importer. Ultimately, if your concern is performance, I would suggest creating some kind of stress test and just trying out each of the methods. See what works best for your needs.