Shader On Entire Object Instead of Faces

If you can, post your shader code (using code tags) and we'll be able to help out more.

But the main thing is you're probably using a Surface Shader and driving the color "edge" using the IN.uv_MainTex or similar value. A mesh's UVs, or texture coordinates, are used to map a 2D texture onto the 3D surface of the mesh, but there's no guaranteed correlation between a vertex's position on the mesh and its UVs, as the UV coordinates can be completely arbitrary. If you want to make a transition across a mesh's 3D space then you need to be working with 3D coordinates.

The easiest to access when using a Surface Shader is the world position, which Unity will provide to the surf function if you add float3 worldPos; to the Input struct. However that gets you the world position of each pixel that is rendered, not the relative position within the mesh. So if you use the world position you'd need to either transform it back into object space (aka model space, the untransformed mesh space), or manually set a lot of data on the material to define where in world space the start and end positions for the transition are. Alternatively you could use a custom vertex function and Input struct value to pass along the original object space vertex positions, which the surf function will then receive in that struct value.

struct Input
{
    // ... other stuff
    float3 worldPos; // special variable name the shader auto fills
};

void surf (Input IN, inout SurfaceOutputStandard o)
{
    float3 objectSpacePos = mul(unity_WorldToObject, float4(IN.worldPos, 1.0)).xyz;
    // ... rest of the surf function
}

And the alternative, using a custom vertex function to pass the object space position along:

#pragma surface surf Standard vertex:vert

struct Input
{
    // ... other stuff
    float3 objectPos; // custom value
};

void vert (inout appdata_full v, out Input o)
{
    UNITY_INITIALIZE_OUTPUT(Input, o); // required when writing to an out Input
    o.objectPos = v.vertex.xyz; // accessible as IN.objectPos in the surf function
}

However, be wary that even if you do have object space positions, that doesn't mean you have a nice 0.0 to 1.0 value to drive the transition with. UVs tend to be in that nice range just because texcoords are normalized, where 0.0, 0.0 is the bottom left corner of the texture and 1.0, 1.0 is the top right. Mesh vertex positions are arbitrarily positioned relative to the mesh's pivot. Additionally, at no stage do shaders ever have access to the mesh's full vertex data, generally only 1, or in special cases 3, vertices at a time. This means the shader doesn't know the extents of the mesh, i.e. it doesn't know where the top and the bottom of the mesh are. So you still need to supply that information manually, either via material properties or data baked into the mesh itself, or use meshes built to fit the range you want your shader to use. But at least multiple copies of the same mesh won't need separate world space coordinates.
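For example, a minimal sketch of the material property approach, assuming hypothetical _BoundsMinY and _BoundsMaxY properties set by hand to the mesh's object space vertical extents:

float _BoundsMinY; // hypothetical properties holding the mesh's
float _BoundsMaxY; // object space vertical extents, set by hand

void surf (Input IN, inout SurfaceOutputStandard o)
{
    float3 objectSpacePos = mul(unity_WorldToObject, float4(IN.worldPos, 1.0)).xyz;
    // remap the object space height into a 0.0 to 1.0 gradient
    float gradient = saturate((objectSpacePos.y - _BoundsMinY) / (_BoundsMaxY - _BoundsMinY));
    // ... drive the transition with gradient
}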

One last gotcha. Unity uses mesh batching by default. Game objects marked as static will use static batching, and multiple simple meshes using the same material may get dynamically batched. What this means is Unity combines multiple meshes into one single mesh to make them faster to render. This also pre-applies their object to world transforms, so as far as the shader is concerned the "object space" position and "world space" position are identical for batched meshes. That breaks everything I've written above. The options are to use unique materials on each object, disable batching for the project, or disable it on the shader itself using the "DisableBatching"="True" tag, which goes in the SubShader's Tags block:
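SubShader
{
    // prevents Unity from combining meshes that use this shader,
    // so each mesh keeps its own object space
    Tags { "DisableBatching" = "True" }
    // ...
}

However there's one other option that skips all of the above, and doesn't require any manual material property setting at all. Use UVs.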

This might seem counterintuitive considering at the start of this I said UVs were your problem, but that's because you're using the UVs on an existing mesh which has its UVs set up to be used with a texture. I also said UVs are arbitrary. So the trick is instead to use custom meshes with an additional UV set that simply starts at the bottom of your mesh and ends at the top (or vice versa, or whatever arbitrary orientation you want). Then in the shader you use that UV instead of the main UV or the vertex position. Surface Shaders get very upset about UVs not tied to a texture property, so you'll still need to use a custom Input value and vertex function.

struct Input
{
    // ... other stuff
    float myValue;
};

void vert(inout appdata_full v, out Input o)
{
    UNITY_INITIALIZE_OUTPUT(Input, o); // required when writing to an out Input
    o.myValue = v.texcoord1.x; // texcoord is zero based, so texcoord1 == the second UV set
}
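In the surf function, IN.myValue then acts as your ready-made 0.0 to 1.0 gradient. A minimal sketch of driving a hard two color transition with it, using hypothetical _ColorA, _ColorB, and _Edge properties:

fixed4 _ColorA; // hypothetical properties: two colors
fixed4 _ColorB;
float _Edge;    // and a 0.0 to 1.0 transition position

void surf (Input IN, inout SurfaceOutputStandard o)
{
    // step() returns 0 below the edge, 1 above it
    o.Albedo = lerp(_ColorA.rgb, _ColorB.rgb, step(_Edge, IN.myValue));
}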

Know that the second UV set is used for lightmap coordinates by Unity's shaders, so if this is a static game object you'll want to use the third UV set (also used for dynamic light mapping) or the fourth (used by nothing built in). Mesh UVs are also technically float4 values that Unity defaults to only using the first two components of, so you could instead store your value in the third component of an existing UV set if you do it from a script. Unity will not import anything but the first two components from meshes read from .fbx or .blend files.
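As a rough sketch of doing that from a script, here's a hypothetical component that fills the fourth UV set with a normalized bottom-to-top gradient (Mesh.SetUVs channel 3 corresponds to texcoord3 in the shader):

using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class BakeHeightGradient : MonoBehaviour
{
    void Awake()
    {
        // MeshFilter.mesh instantiates a copy so the shared asset isn't modified
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Bounds bounds = mesh.bounds; // object space extents
        Vector3[] vertices = mesh.vertices;

        var uvs = new List<Vector2>(vertices.Length);
        for (int i = 0; i < vertices.Length; i++)
        {
            // normalized 0.0 (bottom) to 1.0 (top) height in object space
            float h = Mathf.InverseLerp(bounds.min.y, bounds.max.y, vertices[i].y);
            uvs.Add(new Vector2(h, 0f));
        }

        // channel 3 == the fourth UV set == texcoord3 in the shader
        mesh.SetUVs(3, uvs);
    }
}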