I’m trying to write a shader to apply a gradient depending on the object’s height.
This is what I have:
Shader "Effects/Gradient" {
    Properties {
        _TintYp("Tint Y+", Color) = (1,1,1,1)
        _TintYn("Tint Y-", Color) = (1,1,1,1)
        _Height("Height", Float) = 3
        _MainTex("Albedo (RGB)", 2D) = "white" {}
        _Color("Main Color", Color) = (1,1,1,1)
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        LOD 200

        CGPROGRAM
        #pragma surface surf Standard fullforwardshadows
        #pragma vertex vert
        #pragma target 3.0

        struct Input {
            float2 uv_MainTex;
            float dy;
        };

        float4 _TintYp;
        float4 _TintYn;
        float _Height;
        sampler2D _MainTex;
        float4 _Color;

        void vert(inout appdata_full v, out Input o) {
            UNITY_INITIALIZE_OUTPUT(Input, o);
            // Normalize the object-space Y position by _Height and clamp to [-1, 1]
            float dy = (v.vertex.y) / _Height;
            if (dy < -1) {
                dy = -1;
            } else if (dy > 1) {
                dy = 1;
            }
            // Remap [-1, 1] to [0, 1] for use as a lerp factor
            o.dy = dy * 0.5 + 0.5;
        }

        void surf(Input IN, inout SurfaceOutputStandard o) {
            half4 color;
            float4 ty = lerp(_TintYn, _TintYp, IN.dy);
            color = tex2D(_MainTex, IN.uv_MainTex) * ty * _Color;
            o.Albedo = color;
            o.Alpha = color.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
As far as I know, there’s no way to get the object’s height directly from the CG code, so I’m setting it to a fixed value through the _Height property, but that’s not my main problem right now.
The main problem is that the shader behaves differently depending on the position and rotation of the camera, which shouldn’t happen. As you can see, v.vertex.y is what drives the gradient, and since it’s the Y position in object space, there’s no reason for it to be affected by the camera.
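To make explicit what the vert function is supposed to compute, here is the same clamp-and-remap expressed as a standalone Python sketch (just the math, not Unity code); the function name is mine, for illustration only:

```python
def height_to_gradient_t(vertex_y, height=3.0):
    """Map an object-space Y coordinate to a [0, 1] lerp factor,
    mirroring vert: clamp(y / _Height, -1, 1) * 0.5 + 0.5."""
    dy = vertex_y / height
    dy = max(-1.0, min(1.0, dy))  # clamp to [-1, 1]
    return dy * 0.5 + 0.5         # remap to [0, 1]

# Vertices at or below -_Height get the full _TintYn tint,
# vertices at or above +_Height get the full _TintYp tint.
print(height_to_gradient_t(-3.0))  # 0.0
print(height_to_gradient_t(0.0))   # 0.5
print(height_to_gradient_t(3.0))   # 1.0
```

Since this depends only on v.vertex.y and _Height, the result should be identical from every camera angle.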
This is what I see in the editor most of the time:
However, at certain angles I see this:
This only occurs at very specific angles; if you move or rotate the camera even a little, it goes back to normal. I believe the shader code is correct, but I’m not 100% sure.
Any ideas?

