Stretching UV...

Hi. I’m working on a procedural world and have this problem when trying to apply a coastline texture…
I use black-and-white masks (one mask per hexagon) for the coastline, which I multiply with the coast texture. The taller the coast, the more the mask UVs get stretched.

Is there a way to fix this without changing the UVs before runtime? I use uniform UVs and don’t want to change them…

Triplanar texturing is usually the solution.

As I understand it, that’s used when you need to fix tiling in places where the UVs are stretched. In my case I don’t tile the black-and-white mask; I just rotate the texture and apply it to the edges of the hexagon…

Your concern is this section here, yes?

The reason to use triplanar texturing is specifically to fix stretching. Most of the time it’s also used with world space tiling, but that’s not a requirement. In this case you would probably need a “biplanar” mapping where the mask is applied from the top and the side, or make the mask a simple smooth gradient and modulate a triplanar noise with it to create the final mask.
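
Something like this, as a minimal sketch of the biplanar idea (assuming the mask is sampled from world position; IN.worldPos, IN.wNormal, _MaskTex and _MaskScale are placeholder names here, not anything from your setup):

// Sample the mask from two world space projections: top-down (xz) and one side (xy).
float topMask  = tex2D(_MaskTex, IN.worldPos.xz * _MaskScale).r;
float sideMask = tex2D(_MaskTex, IN.worldPos.xy * _MaskScale).r;

// Blend weights from the world normal: |y| drives the top projection, |z| the side one.
// The abs() handles faces pointing either way, and the divide makes the weights sum to 1.
float2 w = abs(IN.wNormal.yz);
w /= max(0.00001, w.x + w.y);

float mask = topMask * w.x + sideMask * w.y;

A full triplanar would add a third, zy projection weighted by the x component; the biplanar version just drops the axis you don’t need.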

Yes, thank you, it seems biplanar should work in my case. I’ve tried to do something like this, but using an expression based on the y value instead of a side texture. I’ll experiment with this technique and show the results if somebody wants to see them))

So, I’m trying to use the xy and zy projections for coverage, but mixing the two variants doesn’t work well…

xy

zy

mixing by worldNormals

I’ve tried different operations with the normals; right now I use
float3 normal = saturate(pow(IN.wNormal * 10, 20));
and
o.Albedo = xyTex.r * normal.z + zyTex.r * normal.x;

Is it possible to mix them so that the result is cleaner for the masks?

When you do normal-based blending you can’t just use the normal’s components straight. A normalized normal is a unit-length vector, which means its magnitude is 1. That does not mean its components add up to one when you sum them, and when blending that sum is what you care about!

The solution is to divide the components by their sum so they do add up to one. In your case only divide by the sum of x and z, since you’re not using y. For example, a normal pointing 45° between the x and z axes has x and z components of roughly 0.7 each, which sum to about 1.4; dividing by that sum brings each weight back to 0.5. So:

float2 blendNormal = saturate(pow(abs(IN.wNormal.xz) * 10, 20));
blendNormal /= max(0.00001f, blendNormal.x + blendNormal.y);
// xy projection weighted by the z component, zy projection by the x component, as in your original blend.
o.Albedo = xyTex.r * blendNormal.y + zyTex.r * blendNormal.x;

Thank you, but with this variant I still get a blurred result))

You’re going to have a little bit of blurring, but yeah, it can be better. Try normalizing the xz components before sharpening them, so the blend depends only on which way the surface faces and not on how steep it is; the divide and Albedo lines stay the same:

float2 blendNormal = saturate(pow(abs(normalize(IN.wNormal.xz)) * 10, 20));

Another trick is to use kind of blurry looking textures, sometimes called distance fields or signed distance fields, and sharpen them in the shader. The reason to do that is you can more easily blend two blurry textures together and get relatively sharp results afterwards.
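
A minimal sketch of that order of operations, assuming the two masks are stored as blurry, distance-field style gradients (xySdf and zySdf here, placeholder names) and _Threshold is a small material property: blend first, sharpen after.

// Blend weights: w.x is how much the surface faces +/-x, w.y how much it faces +/-z,
// normalized so they sum to 1.
float2 w = saturate(pow(abs(normalize(IN.wNormal.xz)) * 10, 20));
w /= max(0.00001, w.x + w.y);

// Blend while the masks are still blurry: the xy projection is driven by the z weight,
// the zy projection by the x weight.
float blended = xySdf.r * w.y + zySdf.r * w.x;

// ...then sharpen the blended result around the 0.5 level.
float mask = smoothstep(0.5 - _Threshold, 0.5 + _Threshold, blended);
o.Albedo = mask;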

Ok, I made two distance textures, and separately I can sharpen each of them with something like this:
res = smoothstep(0.5 - threshold, 0.5 + threshold, tex);

Here’s the result:

But together they don’t look any better than before))
I’ve tried blending by normals, and also simple addition, and applying smoothstep afterwards

The best I can get (without distance fields) is a smooth blend of the textures by normals, but it’s not exactly what I want)

Maybe if I mix it with some noise I’ll get a good result, like here
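
For what it’s worth, a rough sketch of that noise idea, following the earlier suggestion of modulating the mask with noise (_NoiseTex, _NoiseScale and _NoiseStrength are placeholder names; w and blended are the weights and blended mask from the sketch above):

// Sample the noise with the same two projections and blend with the same weights,
// so the noise itself doesn't stretch on the steep faces.
float noiseXY = tex2D(_NoiseTex, IN.worldPos.xy * _NoiseScale).r;
float noiseZY = tex2D(_NoiseTex, IN.worldPos.zy * _NoiseScale).r;
float noise = noiseXY * w.y + noiseZY * w.x;

// Offset the blurry blended mask with the noise, then sharpen as before.
float noisy = blended + (noise - 0.5) * _NoiseStrength;
o.Albedo = smoothstep(0.5 - _Threshold, 0.5 + _Threshold, noisy);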

So far my experiments haven’t given me anything good))