Prioritizing Visibility of Meshes

I am building an environment that will have lots of different meshes representing terrain. Given an intersection of these meshes, I would like to be able to prioritize which ones are seen by the camera. For example, in the images below I have a large white flat plane, on top of which there is a red mesh rendered from an .obj file. As you can see, the red mesh dips down underneath the flat plane, so parts of it are lost where they are covered up. I would like it rendered so that the red mesh is visible instead of the white plane in those places. I’ve read a little bit about how to do this, for example when rendering a weapon, but I also have cases where a third or even more meshes could be layered on top. From the images: I would like to see green wherever the green plane is, red where there is red mesh but no green, and white only in the places where there is no other colored mesh.

Images:

Side view:

Top view:

Thanks for any and all help,

Victor

So after a little bit of reading on how to control the order of rendering, I determined that custom shaders were the way to go. Because the materials in my project are texture-mapped with colored images, I don’t have to worry about lighting, so I was able to use the most basic vertex and fragment shader described here: Unity - Manual: Custom shader fundamentals.

Then I used the Tags section to specify the render queue by setting the “Queue” tag to a value of “Geometry+n”, where n grew with the precedence. Next, set ZTest Always in the Pass. Finally, I went to the material applied to each mesh and set its Shader to the custom shader with the appropriate priority.

The basic idea is that all the meshes always pass the depth test (that’s the ZTest Always part), and you then control the order in which they are rendered via the queue value. I used this method and it worked ideally for my purposes.

NOTE! Because ZTest Always ignores depth entirely, this will cause weird results if you have “lower priority” geometry (e.g. hills) that should legitimately cover up your “higher priority” meshes.

An example of the Shader I used is:

Shader "Custom/New Shader" {
	Properties {
		_MainTex ("Base (RGB)", 2D) = "white" {}
	}
	SubShader {
		// Render after the default Geometry queue; increase +n for higher priority
		Tags { "Queue" = "Geometry+1" }
		Pass {
			// Always pass the depth test so later-queued meshes draw on top
			ZTest Always
			CGPROGRAM

			#pragma vertex vert_img
			#pragma fragment frag

			#include "UnityCG.cginc"

			uniform sampler2D _MainTex;

			// Unlit pass: just sample the texture at the interpolated UV
			float4 frag(v2f_img i) : COLOR {
				return tex2D(_MainTex, i.uv);
			}

			ENDCG
		}
	}
}
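For the green mesh you would use a second shader that differs only in its queue value. This is a minimal sketch (the shader name here is made up for illustration): a larger n in "Geometry+n" means the mesh is drawn later, so with ZTest Always it ends up on top of everything queued before it.

```
Shader "Custom/New Shader Priority 2" {
	Properties {
		_MainTex ("Base (RGB)", 2D) = "white" {}
	}
	SubShader {
		// Geometry+2 draws after Geometry+1 (red) and the default
		// Geometry queue (white plane), so green wins everywhere it exists
		Tags { "Queue" = "Geometry+2" }
		Pass {
			ZTest Always
			CGPROGRAM

			#pragma vertex vert_img
			#pragma fragment frag

			#include "UnityCG.cginc"

			uniform sampler2D _MainTex;

			float4 frag(v2f_img i) : COLOR {
				return tex2D(_MainTex, i.uv);
			}

			ENDCG
		}
	}
}
```

Assign this shader to the green mesh’s material, the Geometry+1 version to the red mesh, and leave the white plane on a default (Geometry queue) shader.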