# Get Mesh Data From Texture Positions

I am working on generating a texture that updates in real time based on a few different parameters. To accomplish this I am looking to find information about the mesh the texture is applied to and use that information to do my calculations. The issue I’ve run into is that there doesn’t appear to be a means to accomplish this.

I am looking to find the normal and position of the point on a model corresponding to given texture coordinates, but all I have managed so far is pulling the raw data from the mesh (using the `uv`, `normals`, and other properties of the `Mesh` object). This doesn't accomplish what I am looking for, as my algorithm needs complete coverage to work; that is, I can't skip over a point on the texture.

Is there some way to efficiently get these values or am I going to need another solution?

Edit: Currently I am using a precalculation to figure this out. This calculation takes approximately a minute to run on my computer for a 16x16 texture, which is hardly optimal. While I could thread this, I think that would just serve to hide a less than optimal solution. My code is as follows (`WrappingGrid` is a custom class for calculating nearby points based on a grid which wraps, like UV coordinates do):

```csharp
void Start()
{
    grid = new WrappingGrid(1, 1);
    nodes = new MeshTextureNode[heatmap.width, heatmap.height];
    for (int x = 0; x < heatmap.width; ++x)
    {
        for (int y = 0; y < heatmap.height; ++y)
        {
            float u = (float)x / (float)heatmap.width;
            float v = (float)y / (float)heatmap.height;
            int closest = 0;

            for (int i = 0; i < mesh.uv.Length; ++i)
            {
                closest = CalculateClosest(u, v, closest, i);
            }

            nodes[x, y] = new MeshTextureNode()
            {
                normal = mesh.normals[closest],
                localPosition = mesh.vertices[closest]
            };
        }
    }
}

private int CalculateClosest(float u, float v, int closest, int i)
{
    Vector2 thisDiff = mesh.uv[i];
    Vector2 closestDiff = mesh.uv[closest];
    if (grid.NearerThanPoint(new Vector2(u, v), thisDiff, closestDiff)) closest = i;
    return closest;
}
```

So if I got you right, you want a method that takes a mesh, the texture for that mesh, and a UV position in that texture, and returns a world- or local-space position on the mesh along with that point's normal vector, right?

First of all, you have to understand that the UV-to-world-point projection is not 1-to-1; it's a "1 to n" mapping where "n" can even be 0 (this is the case when that portion of the texture isn't mapped to the mesh at all). "n" can also be greater than 1, which happens when multiple triangles are mapped to the same portion of the texture.

So naturally a method like this would return an array of position/normal pairs, which may contain zero elements.

What you have to do is iterate over all triangles and get the UV coordinates of their 3 corners. Calculate the barycentric coordinates of the given UV point with respect to that UV triangle, and use those coordinates to test whether your point is inside the triangle. If it is inside, just use the same barycentric coordinates to interpolate the position and normal values of the 3 corners.
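The triangle walk described above is language-agnostic math, so here is a minimal sketch of it in plain Python rather than Unity C#. The function and parameter names are illustrative, not Unity API; in Unity the inputs would come from `mesh.triangles`, `mesh.uv`, `mesh.vertices`, and `mesh.normals`.

```python
def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p in the triangle (a, b, c)."""
    v0 = (b[0] - a[0], b[1] - a[1])
    v1 = (c[0] - a[0], c[1] - a[1])
    v2 = (p[0] - a[0], p[1] - a[1])
    den = v0[0] * v1[1] - v1[0] * v0[1]
    if den == 0:  # degenerate (zero-area) UV triangle, nothing to hit
        return None
    v = (v2[0] * v1[1] - v1[0] * v2[1]) / den
    w = (v0[0] * v2[1] - v2[0] * v0[1]) / den
    return (1.0 - v - w, v, w)

def lerp3(w, a, b, c):
    """Interpolate three 3D vectors with barycentric weights w."""
    return tuple(w[0] * a[i] + w[1] * b[i] + w[2] * c[i] for i in range(3))

def sample_mesh_at_uv(uv, triangles, uvs, vertices, normals):
    """Return a list of (position, normal) pairs -- the 'n' in '1 to n'."""
    hits = []
    for t in range(0, len(triangles), 3):
        i0, i1, i2 = triangles[t], triangles[t + 1], triangles[t + 2]
        w = barycentric(uv, uvs[i0], uvs[i1], uvs[i2])
        # The point lies inside the triangle iff all weights are non-negative.
        if w is not None and all(x >= -1e-9 for x in w):
            pos = lerp3(w, vertices[i0], vertices[i1], vertices[i2])
            nrm = lerp3(w, normals[i0], normals[i1], normals[i2])
            hits.append((pos, nrm))
    return hits
```

Note that this visits every triangle once per query, instead of every vertex once per texel as in the question's code, and it returns an exact interpolated point rather than the nearest vertex.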

Finally, if you need those positions/normals in world space, you have to use `transform.TransformPoint` / `transform.TransformDirection` on the transform that contains your mesh.

We had a similar question over here; there I only gather the position on the mesh. If you want a method that returns both, you could use a struct with two Vector3 fields (position and normal).

ps: I just fixed the code formatting on those answers. It seems that (once more) due to some migration of UA the code formatting got messed up and there were tons of `&lt;`, `&gt;` and `&amp;` instead of `<`, `>` and `&`. Maybe I've missed some; if you find something, please leave a comment.