# Possible to interpolate non-linearly across a polygon?

Normally UV coordinates, surface normals, and vertex colors are interpolated linearly between the vertices. But I’m wondering today if it is possible to change that. For example, maybe I want the value I’m interpolating to change quickly at first, then hold a steady value across most of the polygon, and then change quickly again as it approaches the other side.

Or if we want to get really fancy, we could put an AnimationCurve on the shader, allowing the artists to set the interpolation function however they like on the material.

Is any of this possible?


That’s what people do with a gradient texture for toon shading. If you have linear data, you can transform it into non-linear data, either through a LUT like a gradient texture or through a mathematical function.
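Both remaps can be sketched in Python, standing in for shader code. The function names (`curve_remap`, `lut_remap`) and the 16-texel table size are made up for illustration; `t` plays the role of the linearly interpolated value the fragment shader receives:

```python
# Remapping a linearly interpolated value t in [0, 1] non-linearly.

def smoothstep(edge0, edge1, x):
    # Standard GLSL/HLSL smoothstep: clamp, then cubic ease-in/ease-out.
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def curve_remap(t):
    # Mathematical remap: sharpen the transition around the midpoint,
    # the kind of curve a toon shader uses for its lighting cutoff.
    return smoothstep(0.4, 0.6, t)

# LUT remap: the same idea baked into a table, standing in for a 1D
# gradient texture sampled with the linear value as the U coordinate.
lut = [curve_remap(i / 15) for i in range(16)]  # a 16-texel "gradient texture"

def lut_remap(t):
    # Nearest-texel sample; a real texture fetch would filter bilinearly.
    return lut[min(int(t * 16), 15)]
```

The LUT form is what artists usually get, since they can paint any curve into the gradient texture instead of editing shader math.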

To the specific question of “is it possible to interpolate data between vertices non-linearly”, the answer is no. The GPU can only interpolate data passed from the vertex shader to the fragment shader linearly.

If you want non-linear gradients, you can do that, but it means you have to do all of the interpolation manually. That means you need to know the barycentric coordinates of the triangle, and the set of data you’re trying to interpolate between.

For example, if you wanted to do vertex colors across each triangle in a non-linear way, you’d probably want to not use vertex colors at all and instead use a texture with each triangle UVed to the centers of the texels. Using the current linearly interpolated UV, and the texel size of the texture, you could infer the barycentric coordinates, and the texel UVs for the three vertices. You could then apply some kind of curve to the barycentric coordinates (like smoothstep) and use those modified coordinates to interpolate between the three color values you sample from the texture. That does require that every triangle be UVed in a consistent way so you can properly infer those values.
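The manual interpolation step can be sketched in Python. Here the barycentric coordinates are passed in directly (in a shader you’d have to infer them as described above); `curved_interp` is an illustrative name, not an existing API:

```python
# Manually interpolating three per-vertex colors with curved barycentrics.

def smoothstep01(x):
    # smoothstep with fixed edges 0 and 1.
    x = min(max(x, 0.0), 1.0)
    return x * x * (3.0 - 2.0 * x)

def curved_interp(bary, c0, c1, c2):
    # Apply the curve to each barycentric weight...
    b = [smoothstep01(w) for w in bary]
    # ...then renormalize so the weights still sum to 1, otherwise the
    # curved weights no longer form a valid blend.
    s = sum(b)
    b = [w / s for w in b]
    # Weighted blend of the three colors, channel by channel.
    return tuple(b[0] * a + b[1] * x + b[2] * y
                 for a, x, y in zip(c0, c1, c2))
```

At the vertices (where one weight is 1) this reproduces the vertex colors exactly, but in between the smoothstep flattens the gradient near each vertex, which is the non-linear behavior asked about.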

Alternatively you could use a geometry shader (assuming you’re not on a mobile platform) and pass the barycentric coordinates and all 3 vertex colors to all 3 vertices, or encode the data in the vertices manually using multiple UV sets. Then do the same curve modification to the barycentric coordinates.

What @neoshaman is talking about wouldn’t necessarily be per-triangle. But it is very common to apply some kind of power or smoothstep or some random quadratic curve to a value to change how it looks. But for something like per-triangle, usually the solution is to use a texture and paint the gradient you want.


Hmm. That’s helpful. If somewhat disappointing.

One use case I had in mind was to soften the edges in a low-poly model, by using this trick on the normals. So for example where two faces of a cube come together, the shared vertices would have an averaged normal, which with a standard shader doesn’t look all that great, as the normals get interpolated linearly across the entire face. But with the trick I was imagining, we could make the normals go quickly to the mean value (i.e. the face normal) not too far from the edge, stay at that value across the flat of the face, and then go quickly to the new edge normal as it gets close to the other side. The result would look like a box with the edges rounded off, without requiring extra polygons to chamfer the edge.
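To make the idea concrete, here’s a Python sketch of the remapping I had in mind along one axis of a face. `edge_width` and both function names are hypothetical, and `t` stands in for the linear interpolation factor between the two edge normals:

```python
# Remap t so the normal leaves the first edge normal quickly, holds the
# face normal across the middle of the face, then blends quickly into the
# far edge normal.

def rounded_edge_t(t, edge_width=0.1):
    if t < edge_width:                    # near the first edge
        return 0.5 * (t / edge_width)     # blend 0 -> 0.5 quickly
    if t > 1.0 - edge_width:              # near the far edge
        return 0.5 + 0.5 * ((t - (1.0 - edge_width)) / edge_width)
    return 0.5                            # hold the midpoint normal

def lerp_normal(n0, n1, t):
    # Linear blend then renormalize ("nlerp"); at t = 0.5 this gives the
    # average of the two edge normals, which approximates the face normal
    # when the adjoining faces meet symmetrically.
    v = [a + (b - a) * t for a, b in zip(n0, n1)]
    mag = sum(c * c for c in v) ** 0.5
    return tuple(c / mag for c in v)
```

So `lerp_normal(n0, n1, rounded_edge_t(t))` would give a flat-looking face with shading that rolls off only near the edges, which is the rounded-box look described above.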

But, so be it — for this use case, it sounds like spending the extra polygons is easier than any other solution.

Another use case I was thinking about was faking ambient occlusion, by using vertex colors to darken the edge of a wall where it joins the ground (or whatever). Again we’d like that darkness to apply only near the ground, not linearly across the whole height of the wall. But in this case probably the right solution is to use a second texture map that defines that gradient. With a bit of cleverness that could be kept small (much smaller than the main texture).

Actually, all modern hardware does attribute interpolation manually, in the pixel shader. Unfortunately this functionality isn’t exposed in any version of DirectX, nor in Vulkan. Maybe it’s exposed on consoles, I don’t know.

Blending works in shaders too, and it’s inaccessible as well. Unfortunately HW/API developers are focused on gimmicky gigarays, not on really useful things.

AMD has exposed this since GCN 1.0 (Radeon HD 7000 series, Xbox One, PS4, etc.), but they’re the only ones I know of. No mobile GPU does this that I’m aware of. Nor do any Nvidia GPUs. Certainly not Turing (RTX 2000 series), and I wouldn’t expect Ampere (RTX 3000 series) to either.