UV Mapping Procedural Sphere

I’m having what appears to be a common problem when first creating procedural spheres based on cubes, and that’s UV mapping. I’m getting the distorted seam where one side has the UV approaching 1 and the other side is at UV 0. I’ve seen a few posts discuss this issue, with links to numerous sites talking about it, but I’m still not able to find a coded answer.

In addition to the seam, it looks like the UV coordinates are also inverted.

For background, I’ve created the sphere using the normalized-cube method and am sharing vertices along the face edges and corners. I currently apply the UVs using the formulas straight from the Wikipedia article on UV mapping,

with d being a vertex from the list of all sphere vertices. Here’s the code that turns the cube into a sphere and applies the UVs, along with the UV-converting function.

```csharp
for (int i = 0; i < sphereVertices.Length; i++) {
    Vector3 v = sphereVertices[i];
    v = v * 2f / gridSize;

    float x2 = v.x * v.x;
    float y2 = v.y * v.y;
    float z2 = v.z * v.z;

    Vector3 s;
    s.x = v.x * Mathf.Sqrt(1f - y2 / 2f - z2 / 2f + y2 * z2 / 3f);
    s.y = v.y * Mathf.Sqrt(1f - x2 / 2f - z2 / 2f + x2 * z2 / 3f);
    s.z = v.z * Mathf.Sqrt(1f - x2 / 2f - y2 / 2f + x2 * y2 / 3f);

    sphereNormals[i] = s;
    sphereUV[i] = V3toUV(s);
    sphereVertices[i] = s * sphereRadius;
}
```
```csharp
public static Vector2 V3toUV(Vector3 p)
{
    const float InversePI = 1f / Mathf.PI; // definition not shown in the original post
    var d = p.normalized;
    var u = Mathf.Atan2(-d.z, -d.x) * InversePI * 0.5f + 0.5f;
    var v = 0.5f - Mathf.Asin(d.y) * InversePI;
    return new Vector2(u, v);
}
```

Any help is greatly appreciated.

It looks like you are projecting a rectangular texture onto your sphericalised cube. If you want to take advantage of the mesh, you should also use a texture that has a distinct part for every face. If you insist on using this texture, try duplicating the vertices along the seam so that one copy has u = 0 and the other has u = 1. The beginning and the end of the texture must overlap with distinct vertices. Even then, the poles will still look bad.
Have a look at the Catlike Coding tutorials. Jasper explains the mathematics pretty well.

Thanks for the feedback.

I’ve certainly referenced back to catlike coding on a number of occasions, but he uses a shader and submeshes to handle this issue, which I want to avoid if possible.

I’ve read about duplicating the vertices, hence why each face has its own vertices on the corners and edges, so essentially I’m already doing that. But for the seam, does this mean I have to duplicate those again?

It’s a square texture, specifically a procedural heightmap based on noise functions.

Would you mind explaining this more? Are you referring to a tile map/atlas map approach?

It would definitely be better if you weld all the vertices.

The issue is probably caused by that transition happening within a face, so that on the left side of one face the U is, say, 0.95, and on the right side it is 0 but should be 1. At least this is what happened to me when I did something similar. What you have to do is make sure that at the seam one vertex has the value 0 and the other has the value 1, usually talking about U.
It also makes sense because your expression Mathf.Sqrt(1f - y2 / 2f - z2 / 2f + y2 * z2 / 3f) returns the same result for the same vertex coordinate.

If your texture is flipped, then maybe you have to tweak your algorithm, or just flip the result: v = 1 - v.
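The seam duplication described above could be sketched roughly like this. It's a hedged sketch, not the poster's actual code: `FixWrappedTriangles`, the 0.5 jump threshold, and the flat list of U values are assumptions, and in a real mesh you would copy the duplicated vertex's position, normal, and V coordinate as well.

```csharp
using System;
using System.Collections.Generic;

static class SeamFix
{
    // For every triangle whose U values span more than half the texture,
    // assume it crosses the seam and duplicate its low-U vertices with
    // u + 1, so the GPU interpolates 0.95 -> 1.05 instead of 0.95 -> 0.05.
    public static void FixWrappedTriangles(List<float> u, int[] tris)
    {
        var dup = new Dictionary<int, int>(); // original index -> duplicate index
        for (int t = 0; t < tris.Length; t += 3)
        {
            float min = Math.Min(u[tris[t]], Math.Min(u[tris[t + 1]], u[tris[t + 2]]));
            float max = Math.Max(u[tris[t]], Math.Max(u[tris[t + 1]], u[tris[t + 2]]));
            if (max - min <= 0.5f) continue; // triangle does not cross the seam

            for (int i = 0; i < 3; i++)
            {
                int v = tris[t + i];
                if (u[v] > 0.5f) continue;   // already on the high side of the seam
                if (!dup.TryGetValue(v, out int d))
                {
                    u.Add(u[v] + 1f);        // wrapped duplicate of the low-U vertex
                    d = u.Count - 1;
                    dup[v] = d;
                }
                tris[t + i] = d;             // retarget the triangle to the duplicate
            }
        }
    }
}
```

Note that U values above 1 rely on the texture's wrap mode being set to repeat.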

Right now you have the UVs of the vertices mapped as one would for a lat/long sphere: one wide texture, with the pixel density increasing toward the poles.
I would suggest that every face of the initial cube receive UVs in the texture that are not necessarily adjacent. Imagine an unwrapped cube, or a texture six times the desired resolution wide. Then you assign appropriate UV coordinates to the vertices of the initial cube and let it subdivide. That way there are no singularities (many texels squeezed together) at the poles, and thus no visual artifacts there. It is a little harder to create those textures in the first place, since the mapping from vertex position to texture is more complicated, but if you don’t do it you may as well use a simple lat/long sphere.
This can also be circumvented by using a separate mesh for each face. Why do you hesitate when Jasper is doing it? You gain better vertex resolution this way, and you can disable faces which cannot be seen.
Since all of this depends on what you need the planets for, maybe you can elaborate on your requirements a bit. Will the player walk on the planet? How large are the planets? Are they just a backdrop?

If you already have distinct vertices for each cube face, that should be enough. Maybe your calculations are off (indices are zero-based). It looks like the whole texture is stretched into the seam, so I would assume there is a sharp jump in the UVs: one vertex has 0 and its neighbour has 1.
Cast a ray through the mouse position onto the sphere and Debug.Log the UV coordinate, then zoom in and move the mouse over the seam.
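That debug test could look something like this. A sketch, assuming the sphere has a MeshCollider (`RaycastHit.textureCoord` only works with MeshColliders) and the scene has a camera tagged MainCamera:

```csharp
using UnityEngine;

// Attach to any GameObject; logs the UV under the mouse cursor every frame.
public class UVProbe : MonoBehaviour
{
    void Update()
    {
        Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
        if (Physics.Raycast(ray, out RaycastHit hit))
            Debug.Log("UV at cursor: " + hit.textureCoord);
    }
}
```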

Are you referring to a cubemap? I’m not opposed to using cubemaps, Spore implemented it very well, I’m just not sure how to code that out and I’ve been unable to find actual code examples.

Jasper used different submeshes, which rely on separate materials, more draw calls, etc. It’s certainly a valid approach for what he was trying to do, but not for my particular application. You’ll notice that my vertex density is actually evenly spread; I used similar math to Jasper and other sources to achieve that. I also have a version of the sphere that is actually multiple meshes (patches/chunks/grids/etc.), not just limited to six meshes or one mesh per face. Even with this, however, the UV mapping looks exactly the same, with the same issue.

This is the heart of it. I feel like there is a calculation issue somewhere in that code I provided, just not sure where. I’ll try out the raycasting debug and post some more information.

Yes this is what I think is happening as well. I know you articulated how to fix it and I agree, but how does that translate to actual code?

Did the UV raycasting test, and it confirmed what I initially thought: the UVs approach 1 on the left side of the seam, are at 1 on the seam, and jump to 0 on the right side of the seam.

Any other suggestions?

As a minor update: I was able to resolve the inverted UV map rather easily by negating d.y:

```csharp
public static Vector2 V3toUV(Vector3 p)
{
    const float InversePI = 1f / Mathf.PI; // definition not shown in the original post
    var d = p.normalized;
    var u = Mathf.Atan2(-d.z, -d.x) * InversePI * 0.5f + 0.5f;
    var v = 0.5f - Mathf.Asin(-d.y) * InversePI;
    return new Vector2(u, v);
}
```

That did nothing for the seam issue, though.

Well, that’s exactly the point of being a programmer: translating something you want to model in a computer into code.
You created the code in the first place. Fixing the code for you would require knowing all the code and your setup, and quite a bit of time. I’m not sure someone is willing to spend that for no reason.
If the task is too hard for you, why not do some tutorials first? If you follow Catlike Coding’s planet generation, maybe you’ll understand why he has no seam and can translate that into your code. And I’m pretty sure there are plenty of examples and projects doing exactly this on the internet.

By writing or debugging the code for you, we have a lot more effort and you have little to none. Helping does not mean “doing this for someone”. My daughter is doing her school homework right now. If she has a question, I give her a hint. If I did the homework for her, the task would be done, but it would help her not at all in the next exam. So not writing a planet generator from scratch for you is not arrogance or laziness; it’s simply not the purpose of a forum, and not the meaning of asking for help either. At least from my point of view.

On the other hand, if you can fix this yourself with the “hints” you get here and from tutorials, you end up a better programmer. That will enable you to tackle harder problems, debug faster, and write better and more efficient code. Someone who can already do it learns nothing new.