Procedural texture generation

I have a procedurally-generated green sphere that I intend to use as a planet, and I would like it to have a procedurally-generated texture based on the height of its landmasses.

Here is what my planet looks like so far (the water is another sphere):

As you can see, the landmasses are physically displaced outwards. I am trying to (somehow?) come up with a method to procedurally generate a texture based on the height and/or slope of the green sphere. Is a shader the right strategy, or something else?

I know I’ve seen people do this (notably Eric5h5’s Fractscape), but it was always for a terrain, not an arbitrary mesh like this.

That depends on what you want to show exactly. If you’re just looking for a static lightmap and/or altitude coloring, you could just render a texture offline at the same time as the sphere is generated.

I think this would be complicated to do with a shader. Unless you need it to be dynamic, I’d go for a generated texture.

You can still animate using various shaders and control their parameters.

It sounds like this is the way to go, as that is pretty much what I’m looking for. Unfortunately, I’m not sure how to dynamically render a texture. What would be the best way to go about accomplishing this? Specifically, how can I get altitude (and/or slope) coloring?

Thanks!

If you don’t need shadows and just want altitude colors, you could also try to use vertex colors instead of a texture. Assuming you generate the mesh in Unity using the Mesh class: check Mesh.colors and use a shader that can deal with vertex coloring. This will only work if you like how OpenGL interpolates the colors between vertices ;).
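Roughly like this (an untested sketch; the class name, thresholds and colors are just placeholders):

using UnityEngine;

public class VertexHeightColors : MonoBehaviour
{
    void Start()
    {
        // Tint each vertex by its distance from the sphere's centre.
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;
        Color[] colors = new Color[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            float height = vertices[i].magnitude; // distance from local origin
            colors[i] = height > 1.02f ? Color.white  // placeholder thresholds
                      : height > 1.00f ? Color.green
                      : Color.blue;
        }

        mesh.colors = colors; // picked up by a vertex-color shader
    }
}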

If you want a texture approach, it’s harder:
1- Obtain a mapping from (unit) sphere coordinates to texture uv. Assuming you already have uv on the sphere, you could use the same function you use for generating those. If you don’t, look here for a start: UV mapping - Wikipedia

2- Obtain a mapping function from height to color, and optionally a diffuse-lighting or raycast function if you want to add lighting or shadows.

3- Sample the sphere and convert the local height and normal values using the functions from (2). Then write the color value into the texture using the coordinate obtained with (1).

It may also be possible to use Unity’s render-to-texture functionality to have Unity perform all the shading for you.

Step 3 is probably easier if you have a function that maps from uv to unit sphere coordinates (so you can loop over the texture’s pixels instead of the sphere’s vertices), but it’s too late for me to suggest a function that does that :P.
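For illustration, a rough, untested C# sketch of that pixel-loop approach: walk the texture’s pixels, map each uv back onto the unit sphere, sample the height there, and write a color. UVToUnitSphere and HeightAt are placeholders for whatever mapping and height function you actually use, and the thresholds are made up.

using UnityEngine;

public class PlanetTextureBaker : MonoBehaviour
{
    // Placeholder: the inverse of your uv mapping (uv -> point on unit sphere).
    Vector3 UVToUnitSphere(Vector2 uv) { return Vector3.up; }

    // Placeholder: the same height function used to displace the mesh.
    float HeightAt(Vector3 dir) { return 1.0f; }

    Texture2D Bake(int size)
    {
        var tex = new Texture2D(size, size);
        for (int y = 0; y < size; y++)
        {
            for (int x = 0; x < size; x++)
            {
                // Step 1, inverted: which point on the sphere does this pixel map to?
                Vector3 dir = UVToUnitSphere(new Vector2(x / (float)size, y / (float)size));

                // Step 2: height -> color (made-up thresholds).
                float h = HeightAt(dir);
                Color c = h > 1.02f ? Color.white
                        : h > 1.00f ? Color.green
                        : Color.blue;

                // Step 3: write the color into the texture.
                tex.SetPixel(x, y, c);
            }
        }
        tex.Apply(); // uploads the pixel data to the GPU
        return tex;
    }
}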

By way of illustration, ideally I’m looking to recreate this kind of effect (different colors/images applied to the mesh based on height):

…Only for arbitrary meshes, not just square terrain. Just sort of thinking out loud here: maybe I could loop through all the vertices, calculate their distances from the sphere’s center, and subtract the distance to the ocean mesh to determine height. Then I could figure out which image map (grass, rocks, snow, etc.) would need to be applied to the geometry at that vertex position based on its height.

But there’s the tricky part; how do I get Unity to generate that texture? Furthermore, if I can procedurally create the texture at runtime, how do I relate a particular vertex to its X and Y position on the mesh’s UV map to figure out where on the 2D image to paint the appropriate color? Am I making any sense here? :roll:

Edit: Thanks tomvds, you were too fast for me! I did the UV mapping in my 3D program; should I do it in Unity instead? I’m a little confused about your step 3. What particular function do I use to “write the color value into the texture”? Is there a built-in function for this or is this going to be a custom affair? If I knew how to do this or what approach to take, I could do a lot of experimentation. Thanks so much!

Alright, I sort of figured it out! I went with tomvds’ vertex coloring idea. Here’s my globe right now:

In case anyone’s interested, basically, for each vertex in the completed land mesh, I do this:

// Height = distance from the sphere's center. Later tests overwrite earlier
// ones, so the highest matching band wins.
thisVectorDistance = Vector3.Distance(origin, vertices[i]);
if (thisVectorDistance > 1)    { colors[i] = Color(0.95, 1.00, 0.64); } // pale yellow beach
if (thisVectorDistance > 1.01) { colors[i] = Color(0.067, 0.31, 0.0); } // dark green lowland
if (thisVectorDistance > 1.02) { colors[i] = Color(0.38, 0.59, 0.0);  } // lighter green
if (thisVectorDistance > 1.034){ colors[i] = Color(0.3, 0.3, 0.3);    } // grey rock
if (thisVectorDistance > 1.05) { colors[i] = Color.white;             } // snow

Then I applied a shader to the mesh that shows vertex colors, courtesy of Aras (http://forum.unity3d.com/viewtopic.php?t=12031&highlight=vertex+colors):

Shader "!Debug/Vertex color" {
SubShader {
    Pass {
        Fog { Mode Off }
CGPROGRAM
#pragma vertex vert

// vertex input: position, color
struct appdata {
    float4 vertex : POSITION;
    float4 color : COLOR;
};

struct v2f {
    float4 pos : POSITION;
    float4 color : COLOR;
};

// pass the mesh's vertex colors straight through to the rasterizer
v2f vert (appdata v) {
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    o.color = v.color;
    return o;
}
ENDCG
    }
}
}

However, as you probably noticed, the globe is really, really ugly! I think I’d like to go the texture route. Looking at the Texture2D class, it looks like I can literally determine what color each pixel has. However, there’s the obvious problem of translating the 3D vertices to their corresponding UVs on the 2D texture. How can I do this?

Thanks again for everything!

Try this (untested) C# code to calculate uv:

Vector3 sphereVertex; // a vertex on the sphere

Vector3 unitVector = sphereVertex.normalized;
Vector2 uv;
// Atan2 gives the longitude in [-pi, pi]; shift and scale it into [0, 1].
uv.x = (Mathf.Atan2(unitVector.x, unitVector.z) + Mathf.PI) / Mathf.PI / 2.0f;
// Acos gives the angle from the +y pole in [0, pi]; this maps it into [0, 1].
uv.y = (Mathf.Acos(unitVector.y) + Mathf.PI) / Mathf.PI - 1.0f;

// uv should now contain corresponding texture coordinates for the sphere

You can test if it’s correct by loading the uv values for every vertex into the Mesh.uv array. If you then add some image as texture to the material, it should wrap around the sphere. There will probably be artifacts where the sides of the image meet. You will need to duplicate vertices along the seam in order to prevent these.

As I said in the other post, you’ll probably want the inverse of this function for generating the texture pixels.
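In case it helps, here is one possible inverse (equally untested), obtained by running the formulas above backwards; it takes a uv pair and returns the matching point on the unit sphere:

// Inverse of the mapping above: uv -> point on the unit sphere.
Vector3 UVToUnitSphere(Vector2 uv)
{
    float lon = uv.x * 2.0f * Mathf.PI - Mathf.PI; // undoes the Atan2 shift
    float lat = uv.y * Mathf.PI;                   // undoes the Acos mapping
    return new Vector3(
        Mathf.Sin(lat) * Mathf.Sin(lon),  // x, so that Atan2(x, z) == lon
        Mathf.Cos(lat),                   // y, so that Acos(y) == lat
        Mathf.Sin(lat) * Mathf.Cos(lon)); // z
}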

Thanks, tom! I really appreciate all your assistance. However, I can’t seem to get your code working. Here’s what I’m doing:

var mesh : Mesh = GetComponent(MeshFilter).mesh;
var vertices = mesh.vertices;
var uvs = new Vector2[vertices.Length];

for (var i = 0; i < vertices.Length; i++)
{
    var unitVector = vertices[i].normalized;
    var uv = Vector2(0.0, 0.0);
    uv.x = (Mathf.Atan2(unitVector.x, unitVector.z) + Mathf.PI) / Mathf.PI / 2.0;
    uv.y = (Mathf.Acos(unitVector.y) + Mathf.PI) / Mathf.PI - 1.0;
    uvs[i] = uv;
}
mesh.uv = uvs;

I slapped an image onto the material, but it never shows up; the mesh is painted only with the main color. What am I doing wrong?

It looks correct at first glance. You could check for the following:

  • Print the values in mesh.uv after assigning it. Every element should be a different (x, y) pair; if they are, you should at least see some texture mapping. x and y should both be values between 0 and 1.
  • Check the shader. Make sure it’s something simple like the diffuse shader, make sure there is lighting if your shader needs it, and make sure the shader isn’t generating errors (like the “shader needs normals” error).

Your code was correct; I had made a mistake. Earlier in the script I had a line that assigned a new blank texture, which was blowing away the one I assigned in the GUI. Your code works great! In general, the result is good, although there’s this ugly seam:

The pinching at the top and bottom isn’t a big deal because I’m just going to cover them up with polar ice caps, but that seam has got to go! Any ideas? Thanks again!

Also, one more request, if I may. I’ve noticed that the UV coordinates that your code produces are quite “gross”; they don’t even seem to use the hundredths place:

This has resulted in quite an imprecise UV map; is there any way to tweak the code to produce a UV map with values that have a few more decimal places?

Thanks so much, tomvds. You have helped me a truly enormous amount, and I am extremely thankful!

This is the seam I predicted :stuck_out_tongue:

Basically, the texture is wrapped around the sphere until it reaches where the mapping began. At these last triangles, the texture coordinates jump from 1 back to 0, mapping the entire texture backwards in these triangles.

The solution is to duplicate the starting vertices (those where uv.x is almost 0), add 1 to the texture coordinates of the duplicates, and connect the end vertices to these copies instead of the original vertices. Make sure the texture is set to repeat, not clamp.
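A rough, untested sketch of that duplication pass (the 0.5 threshold assumes each triangle spans much less than half the texture; remember the wrap mode must be Repeat):

using System.Collections.Generic;
using UnityEngine;

public static class SeamFixer
{
    public static void Fix(Mesh mesh)
    {
        var verts = new List<Vector3>(mesh.vertices);
        var uvs = new List<Vector2>(mesh.uv);
        int[] tris = mesh.triangles;
        var dup = new Dictionary<int, int>(); // original index -> duplicate index

        for (int t = 0; t < tris.Length; t += 3)
        {
            // A seam triangle mixes u values near 0 with u values near 1.
            float uMin = Mathf.Min(uvs[tris[t]].x, uvs[tris[t + 1]].x, uvs[tris[t + 2]].x);
            float uMax = Mathf.Max(uvs[tris[t]].x, uvs[tris[t + 1]].x, uvs[tris[t + 2]].x);
            if (uMax - uMin <= 0.5f) continue; // not on the seam

            for (int i = 0; i < 3; i++)
            {
                int v = tris[t + i];
                if (uvs[v].x >= 0.5f) continue; // only the low-u side moves

                int copy;
                if (!dup.TryGetValue(v, out copy))
                {
                    copy = verts.Count;
                    verts.Add(verts[v]);                   // same position
                    uvs.Add(uvs[v] + new Vector2(1f, 0f)); // u shifted past 1
                    dup[v] = copy;
                }
                tris[t + i] = copy; // reconnect the triangle to the copy
            }
        }

        mesh.vertices = verts.ToArray();
        mesh.uv = uvs.ToArray();
        mesh.triangles = tris;
        mesh.RecalculateNormals(); // quick fix for the enlarged vertex array
    }
}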

About the low-precision values: Vector2.ToString() writes the float values with low precision. Simply printing with Debug.Log(uv.x + "," + uv.y) will show the full precision.

Could you please post the latest shader you used with tomvds’ UV script?

How do you apply the colors once the UV is calculated? For each vertex?
Because I’m getting just the main color; maybe the shader isn’t correct.

Argh, it’s a problem with the shader: when I use a particle shader (additive) it works! But… a transparent planet… omg, what shader did you use?

Help me please!! You will be my hero.

Edit: Woah, I got it! I’m a hero!
If anyone is interested:
(http://www.unifycommunity.com/wiki/index.php?title=AlphaVertexColor)
And if you have other effects around the planet, just comment out the “ZWrite Off” line of the shader.

Hiya, I made a procedural texture that uses sine maths. You have to define curves that fade in and out in your first color map, and you can add extra lines and textures to layer smaller brushes on top of the main colors. I thought my method would make it easy to produce bright blues, yellows, reds, etc., but it converges to white too easily.

http://www.unifycommunity.com/wiki/index.php?title=Animated_Color_Procedural_Texture
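Not the wiki script itself, but a minimal sketch of the sine idea for anyone curious: each color channel fades in and out as a sine of pixel position and time (the class name, frequencies and speeds here are arbitrary):

using UnityEngine;

public class SineColorTexture : MonoBehaviour
{
    Texture2D tex;
    const int size = 64;

    void Start()
    {
        tex = new Texture2D(size, size);
        GetComponent<Renderer>().material.mainTexture = tex;
    }

    void Update()
    {
        // Refilling the texture every frame with SetPixel is slow, but
        // it keeps the sketch simple.
        float t = Time.time;
        for (int y = 0; y < size; y++)
            for (int x = 0; x < size; x++)
            {
                float r = 0.5f + 0.5f * Mathf.Sin(0.20f * x + t);
                float g = 0.5f + 0.5f * Mathf.Sin(0.10f * y + 1.3f * t);
                float b = 0.5f + 0.5f * Mathf.Sin(0.05f * (x + y) + 0.7f * t);
                // Where all three channels peak together the result washes
                // out to white (the convergence problem mentioned above).
                tex.SetPixel(x, y, new Color(r, g, b));
            }
        tex.Apply();
    }
}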

Hi there,

I’m very interested in the final solution for handling the sphere procedural texture issue as discussed on this thread. The problem makes perfect sense, and I understand the solution, but have no idea how to implement it.

My best guess at a solution is something like

  1. iterate through all triangles in mesh
  2. for each triangle, compare each vertex to the others, and identify any pair whose UV coordinates differ by more than, say, 0.5
  3. where such a triangle is found, for each vertex whose u coordinate is close to 0, add a duplicate of that vertex (with u shifted by +1) to the vertex array, and alter the triangle you’re in to point to that new vertex.

Something like that?

Anyone care to share their solution? I can’t think of any other way of telling where we’ve reached the end of the sphere, i.e. the set of vertices where everything wraps around.