Hello,
I am using Unity 2021.3 to generate a “curved” mesh procedurally. I have a 512 x 512 texture attached to a URP Lit material, which is set on the mesh’s renderer.
Please see the attached screenshot: the texture is repeated more or less correctly on the two sides of the mesh, but not on the top. I have rotated the original texture so that it is at least mapped correctly on the sides.
I think the UVs should be mapped relative to the orientation of the surface, but I don’t know how to write the code. Is there any trick I can use to bypass the issue? I cannot make the mesh look like a triangle, which might ease the problem to some extent.
Here’s the code for the UVs:
public Vector2[] CalculateUVs()
{
    Vector2[] uvs = new Vector2[_meshVerticeArray.Length];
    for (int i = 0; i < _meshVerticeArray.Length; i++)
    {
        // Single planar projection: every vertex gets its (y, z) position
        // as UV, regardless of which way the surface faces.
        uvs[i].x = _meshVerticeArray[i].y;
        uvs[i].y = _meshVerticeArray[i].z;
    }
    return uvs;
}
There is no “bottom” to the mesh; all vertices are ordered “from left to right, from bottom to top, from front to back”.
Y coordinates can’t be used for the UVs here because they don’t vary across the top of the wall the way they do on the sides. In a shader you could “easily” do this with triplanar mapping; in a script it depends: are you bending the mesh before or after UV generation?
Hi algio_, yes, I believe the triplanar mapping approach is promising; I just found it, although I don’t understand it yet.
I bend the mesh before UV generation: the mesh is created already in its bent shape, and then I calculate the UVs.
Can you show me, or share some clues about, how to do this with triplanar mapping in a shader or a script?
Thanks very much for the reply.
A texture acts like a cellophane sheet with an image on it that you can stretch.
On the top of the wall you’re basically stretching something like two pixel rows of the texture across the entire geometry. That results in the stretching effect.
Because you’re doing this:
You take this part of the texture:
And then apply it like this to the mesh:
You need a different mapping for the texture.
You could say triplanar mapping is based on the same principle you are using here; the important difference is that it projects the UV coordinates from three different (orthogonal) planes, while you are projecting from only one. You can choose between them using the triangle normal, which tells you which direction the triangle is facing.
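The idea of choosing between the three projections by the normal can be sketched in a few lines. This is a language-agnostic Python sketch of the math, not shader code; `triplanar_weights` and the `sharpness` parameter are illustrative names, not anything from Unity:

```python
def triplanar_weights(nx, ny, nz, sharpness=1.0):
    """Blend weights for the X-, Y-, and Z-facing projections,
    derived from the absolute components of the surface normal.
    Higher sharpness pushes the blend toward the dominant axis."""
    wx = abs(nx) ** sharpness
    wy = abs(ny) ** sharpness
    wz = abs(nz) ** sharpness
    total = wx + wy + wz
    return wx / total, wy / total, wz / total

# A face pointing straight up is dominated by the Y (top-down) projection:
print(triplanar_weights(0.0, 1.0, 0.0))  # (0.0, 1.0, 0.0)
```

A 45-degree face gets contributions from two projections, which is where the blending comes in.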
If you wanna learn about triplanar mapping you can follow this tutorial.
I think it’s good, just set col.a to 1 or use a texture without alpha.
The tutorial seems beyond me, but I’ll try to learn it as much as I can.
The color.alpha is already set to 1, and I used a jpg texture which has no alpha channel. But I’ll research more on this.
You should play with UV texture coordinates in a modeling package (like Blender) until you understand how they work. That will also explain the stretching to you. Your questions so far indicate that you haven’t done so.
Also, triplanar most likely will not work well with this sort of texture. You also do not really need it.
In this sort of scenario you should apply UVs to the mesh before you bend it. For U you’d use the position along the curve; for V you’d use the height, adjusted so V doesn’t stretch. You could also try using cylindrical projection for V. Ideally you’d automatically unwrap both sides of the wall segment and make them tileable in the U direction.
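As a rough illustration of the cylindrical idea, here is a small Python sketch of the math. The function name and the assumption that the unbent wall is roughly centered on the vertical Y axis are mine, not from the thread:

```python
import math

def cylindrical_uv(x, y, z, height):
    """Cylindrical projection: U comes from the angle around the
    vertical (Y) axis, V from the height, so neither stretches when
    the mesh is bent afterwards."""
    u = (math.atan2(z, x) / (2.0 * math.pi)) + 0.5  # angle mapped to [0, 1)
    v = y / height                                  # height mapped to [0, 1]
    return u, v
```

The key point is that both coordinates are computed on the unbent mesh, from quantities (angle, height) that the bend will not distort.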
If “this is beyond you”, then you should continue to learn it till you understand it.
Or look for some 3rd party asset that does what you need.
Oh, I see what you meant: it’s how it is blended. It should improve if you increase the blend sharpness to a very high value (100-200). However, unless you write a custom triplanar shader, this may not be enough.
So you may have to fix your script to take face normals or the mesh-generation parameters into account. In the first case you can modify your CalculateUVs function to scale the U coordinate based on the normal; in the latter case you calculate the UV coordinates in the same step in which the mesh vertices are generated. If you already have a clean parametrization, this is the way to go.
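The “same step as mesh generation” case can be illustrated with a Python sketch. It assumes the vertices come from a simple grid parametrization (`cols` segments along the curve, `rows` up the height); all names here are hypothetical:

```python
def grid_uvs(cols, rows, segment_length, segment_height):
    """UVs emitted alongside a grid of vertices: U follows the arc
    length along the curve, V follows the height. Because the UVs are
    derived from the generation parameters, bending the mesh afterwards
    cannot stretch them."""
    total_u = (cols - 1) * segment_length
    total_v = (rows - 1) * segment_height
    uvs = []
    for j in range(rows):          # bottom to top
        for i in range(cols):      # along the curve
            uvs.append((i * segment_length / total_u,
                        j * segment_height / total_v))
    return uvs
```

For non-uniform segments you would accumulate the actual arc length per column instead of multiplying by a constant step.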
Hello guys, thank you for your help. I’m trying to learn the shader way of doing it, but it seems the best result needs 2-3 textures. I also tried to calculate the UV for each vertex right after each triangle is calculated when creating the mesh; somehow it doesn’t work, and I’ll keep digging into it. Meanwhile, I also tried the following approach. It gives a better result, but still far from good. Would you please take a look at it and see if it is the right direction? The result is in the screenshot, thanks a lot.
public Vector2[] ExportSolidUVs()
{
    Vector2[] uvs = new Vector2[_meshVertices.Length];
    for (int i = 0; i < _meshVertices.Length; i++)
    {
        // Note: this normalizes the vertex position; it only approximates
        // a surface normal for meshes roughly centered on the origin.
        Vector3 normal = _meshVertices[i].normalized;
        float x = Mathf.Abs(normal.x);
        float y = Mathf.Abs(normal.y);
        float z = Mathf.Abs(normal.z);
        float max = Mathf.Max(x, Mathf.Max(y, z));
        if (max == x)
        {
            // Mostly X-facing: project onto the YZ plane.
            uvs[i].x = _meshVertices[i].z;
            uvs[i].y = _meshVertices[i].y;
        }
        else if (max == y)
        {
            // Mostly Y-facing (top): project onto the XZ plane.
            uvs[i].x = _meshVertices[i].x;
            uvs[i].y = _meshVertices[i].z;
        }
        else
        {
            // Mostly Z-facing: project onto the XY plane.
            uvs[i].x = _meshVertices[i].x;
            uvs[i].y = _meshVertices[i].y;
        }
    }
    return uvs;
}
This is not a bad idea, but you can’t apply it at the vertex level.
Your mesh has faces where one vertex normal looks up (meaning max == y) and another vertex normal looks sideways (max == x or max == z). On those faces you’ll get stretched textures.
This approach will also produce artifacts in the areas where your walls bend.
Basically, either create separate geometry for the wall sides and the wall caps and apply this algorithm to them independently (both walls and caps can still live in the same mesh), or try to use cylindrical projection on the mesh BEFORE you bend it.
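The per-face (rather than per-vertex) version of the dominant-axis idea can be sketched like this in Python: the axis is taken from the triangle’s face normal (cross product of two edges), so a whole face always gets one projection. The function name is illustrative, and real code would also need to duplicate shared vertices so each face can carry its own UVs:

```python
def face_dominant_axis(p0, p1, p2):
    """Dominant axis of a triangle's face normal, computed as the cross
    product of two edge vectors. Applying the planar projection per face
    avoids the mixed-normal faces that stretch the texture."""
    ux, uy, uz = p1[0] - p0[0], p1[1] - p0[1], p1[2] - p0[2]
    vx, vy, vz = p2[0] - p0[0], p2[1] - p0[1], p2[2] - p0[2]
    nx = uy * vz - uz * vy
    ny = uz * vx - ux * vz
    nz = ux * vy - uy * vx
    ax, ay, az = abs(nx), abs(ny), abs(nz)
    if ay >= ax and ay >= az:
        return 'y'            # cap: project from above (XZ plane)
    return 'x' if ax >= az else 'z'   # side: project from X or Z
```

A horizontal triangle then always maps from above, and a vertical one always from the side, with no per-vertex mixing within a face.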
I also strongly recommend playing with texture coordinates in Blender until it clicks for you how they work. Right now you do not seem to understand them fully.
It is also possible that you’re trying to replicate triplanar mapping using a single coordinate set without understanding what you’re doing. That is not going to work.
The reason triplanar works is that it samples 3 different planar projections (each with different coordinates) and blends them within the pixel shader.
With just 1 coordinate set you will not get the same effect.
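To make the difference concrete, here is a Python sketch of what one triplanar sample actually computes: three separate planar projections, each looked up independently, then blended by the normal. `sample(u, v)` is a stand-in for a texture fetch; this is illustrative math, not shader code:

```python
def triplanar_sample(sample, p, n):
    """Triplanar sampling sketch: the point p is projected onto three
    orthogonal planes, each projection is sampled separately, and the
    results are blended by the normal's absolute components."""
    wx, wy, wz = abs(n[0]), abs(n[1]), abs(n[2])
    total = wx + wy + wz
    sx = sample(p[1], p[2])   # X-facing projection uses (y, z)
    sy = sample(p[0], p[2])   # Y-facing projection uses (x, z)
    sz = sample(p[0], p[1])   # Z-facing projection uses (x, y)
    return (wx * sx + wy * sy + wz * sz) / total
```

A single UV set stored on the mesh can hold only one of these three projections per vertex, which is why it cannot reproduce the blended result.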
Hi @algio and @neginfinity, thank you guys so much for your further replies and the tips. I think I’ve found a way based on your advice of calculating the UVs before the mesh is finished. And you’re right, triplanar mapping is not an ideal approach in this case, although it does give an OK result.
As shown in the screenshot, basically, while creating each vertex, I calculated the coordinates of the vertex on a 2D plane, imagining the surface of the mesh unwrapped onto that plane. The moment each vertex is created is the only chance to do this calculation, while the pattern used to position the vertex is not yet lost. With these coordinates it is very easy to set the UVs.
But surprisingly, these coordinates together form a weird shape: one side is like a curve. I thought it should be a rectangle, or at least that the left and right sides should be similar, according to the original references for creating the vertices and how the mesh looks.
It would be good to know why, but anyway I’m not going to fix it, since it causes no serious visual issue and this UV problem has already delayed the tiny project.