Displacing vertex with HeightMap texture in vertex shader

I’m trying to use a HeightMap texture in a vertex shader to morph a flat terrain to have hills/valleys, etc.

I’m using a quadtree structure, and each quad is its own gameobject/mesh. I have a _HeightMap texture that is passed to every quad using material.SetTexture. However, I see a repeated pattern, which I think means the texture is being scaled down and tiled on each quad, when I want the heightmap to ‘stretch’ over all quads. What’s the proper way to do this?

Here is my shader code:

Shader "Sagan/Proland" {

	SubShader {

		Pass {

			CGPROGRAM

			#include "Assets/Resources/Shaders/Sagan.cginc"

			#pragma vertex vert // vert function is the vertex shader
			#pragma fragment frag // frag function is the fragment shader

			float4x4 _TerrainMatrixWTL, _QuadPosition;
			sampler2D _HeightMap;

			// vertex shader
			saganVertexOutput vert(appdata_base v) {
				saganVertexOutput v_out;

				float4 vertexPos = v.vertex;

				float textureHeightValue = tex2Dlod(_HeightMap, float4(vertexPos.xz, 0, 0)).r;
				vertexPos.y += textureHeightValue;

				v_out.position = mul(UNITY_MATRIX_MVP, vertexPos);
				v_out.color = color(vertexPos);

				return v_out;
			}

			// fragment shader
			float4 frag(saganVertexOutput input) : COLOR {
				// Return the interpolated vertex color (semantic COLOR)
				// produced by the vertex shader.
				return input.color;
			}

			ENDCG
		}
	}
}

Here is what I’m seeing:

The key to this lies in this line of your vertex shader:

float textureHeightValue = tex2Dlod(_HeightMap, float4(vertexPos.xz, 0, 0)).r;

Your texture coordinate is the object-space vertex position of your plane. The result is that you’re sampling a 0-1 texture space with coordinates that span a range of ~12 units, so with a repeating wrap mode the heightmap tiles roughly a dozen times across the terrain. It’s not terribly surprising that you would see numerous repeats.

Unfortunately, I haven’t come across any way of retrieving object bounds data inside a shader without passing it through manually. Therefore, you’ll want to pass one of the following into your shader:

A single scale value, applicable only to square planes - If the tile’s origin is guaranteed to be centered on the plane, a single value is enough to describe the (center - scale) to (center + scale) range that the vertex positions cover, which can then be remapped to the 0-1 range the texture expects.
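As an illustrative sketch of that first option (the _QuadScale property name is mine, not part of your shader; it assumes the quad is centered on its object-space origin and the value is set from script with material.SetFloat), the vertex shader could remap its position like this:

```hlsl
// _QuadScale: half the quad's object-space width, passed in from
// script (e.g. material.SetFloat). Hypothetical name for illustration.
float _QuadScale;

// Object-space xz runs from -_QuadScale to +_QuadScale; dividing by
// the full width and offsetting by 0.5 lands it in the 0-1 UV range.
float2 uv = vertexPos.xz / (2.0 * _QuadScale) + 0.5;
float textureHeightValue = tex2Dlod(_HeightMap, float4(uv, 0, 0)).r;
```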

A more versatile alternative is to pass a Vector to the shader instead - Take data from the bounds of the mesh and pass it to the shader as something like (min.x, max.x, min.y, max.y). Then, in the shader, you can remap each minimum-to-maximum range onto 0-1 so the texture properly fits each axis.
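In shader terms, that remap might look like the following sketch. The _Bounds name is mine; since your plane lies in the xz plane, I’m assuming the vector carries (min.x, max.x, min.z, max.z), filled in from script with material.SetVector using the mesh’s bounds:

```hlsl
// _Bounds = (min.x, max.x, min.z, max.z) of the mesh in object
// space, set from script via material.SetVector. Hypothetical name.
float4 _Bounds;

// Inverse-lerp each axis of the object-space position: 0 at the
// minimum edge of the bounds, 1 at the maximum.
float2 bMin = float2(_Bounds.x, _Bounds.z);
float2 bMax = float2(_Bounds.y, _Bounds.w);
float2 uv = (vertexPos.xz - bMin) / (bMax - bMin);
float textureHeightValue = tex2Dlod(_HeightMap, float4(uv, 0, 0)).r;
```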

As for actually rescaling the number, you won’t have access to Mathf.InverseLerp inside the shader, so you’ll have to make do with writing a variant of your own instead.
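The math itself is simple; a shader-side stand-in for Mathf.InverseLerp could look like this sketch (wrap the result in saturate if your values can fall outside the bounds):

```hlsl
// Equivalent of Mathf.InverseLerp: returns how far 'value' sits
// between 'a' and 'b' as a fraction (0 at a, 1 at b, unclamped).
float inverseLerp(float a, float b, float value) {
	return (value - a) / (b - a);
}
```

Feeding it the bounds then gives the texture coordinate directly, e.g. inverseLerp(minX, maxX, vertexPos.x) for the x axis.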