Is it possible to have the Input structure of a Surface Shader contain the surface normal vector, AND have the SurfaceOutput set its normal to a value from a normal map lookup?
The reason I want to do this is that I am writing a shader that fades the material based on the dot product of the view direction and the surface normal. I don't want to use the normal from the normal map for this, because it causes visible artifacts in the areas that should be transparent. Instead, I want to use the surface normal from before normal mapping, which is much less prone to those artifacts.
Here is the relevant surface shading portion of the code, with the current Input structure:
struct Input {
    float2 uv_MainTex;
    float2 uv_GlossTex;
    float2 uv_BumpMap;
    float3 viewDir;
    float3 worldRefl;
    INTERNAL_DATA
};
void surf (Input IN, inout SurfaceOutputSpecColor o) {
    half4 tex = tex2D(_MainTex, IN.uv_MainTex);
    half4 gloss = tex2D(_GlossTex, IN.uv_GlossTex);
    o.Albedo = tex.rgb * _Color.rgb;
    // Gloss colour comes from RGB
    o.GlossColor = _Shininess * gloss.rgb;
    // Specular is mapped
    o.Specular = gloss.a;
    o.Emission = _ReflStrength * texCUBE(_Cube, WorldReflectionVector(IN, o.Normal)).rgb;
    o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
    // here is where I want to use the original surface normal, not the texture lookup normal
    half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));
    rim = lerp(_CenterAlpha, _EdgeAlpha, rim);
    o.Alpha = pow(rim, _AlphaPower);
}
OK, so I have been trying to figure this out all day. The problem isn't that I can't access the original normal value, but that things break when I try to use it and then write a new normal from a texture into SurfaceOutput.Normal. I have narrowed this issue down, but I have no idea why it is happening or what I can/should do to correct it. Below are some screenshots and the surface shader code used to create each material. The object is just a sphere straddling a blue plane against the default Unity background.
First shot: This uses the default value of the normal (that is, the value initially contained in SurfaceOutput.Normal when the surf() function is called) to calculate the alpha, without ever writing to SurfaceOutput.Normal. Note that the alpha fade is exactly what I am trying to achieve; however, the normal map isn't being used at all.
Here is the second pic. This uses the normal from the normal map to calculate the alpha value, which results in non-transparent areas where I want transparency:
And finally, here is what happens when I try to use the original normal to calculate the alpha and then write the normal-map normal to SurfaceOutput.Normal. Note that the entire object is rendered opaque, and there seems to be pixelation as well:
And the code used. I should mention that the problem occurs no matter how the lines are rearranged, even if the original normal is saved to another variable and used after the normal map's normal is written to o.Normal.
I'm a newbie at surface shaders as well, but one thing I know for sure is that you can write your own custom vertex function to calculate anything you want, and then pass it along to surf().
If I move the alpha calculation from the surface shader function to the vertex function, I don’t think I will be able to get the view direction of the camera, will I? If I can, I think this would be possible, but I would still prefer it to be done in the surface shader.
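For what it's worth, the vertex-function route can get at the view direction: UnityCG.cginc provides a WorldSpaceViewDir() helper. Below is a minimal sketch of that idea, under a few assumptions: the custom Input field name vertexAlpha is made up for this example, _CenterAlpha/_EdgeAlpha/_AlphaPower are assumed to be declared as in the question's shader, and unity_ObjectToWorld is named _Object2World in older Unity versions. The alpha is computed per vertex and interpolated, so it is coarser than the per-pixel version.

```hlsl
// In the #pragma surface line, add:  vertex:vert

struct Input {
    float2 uv_MainTex;
    float2 uv_BumpMap;
    float vertexAlpha;   // hypothetical custom field, filled in vert()
};

void vert (inout appdata_full v, out Input o) {
    UNITY_INITIALIZE_OUTPUT(Input, o);
    // World-space direction from this vertex toward the camera.
    float3 viewDir = normalize(WorldSpaceViewDir(v.vertex));
    // Bring the vertex normal into the same (world) space.
    // (Ignores non-uniform scale; fine for a sketch.)
    float3 worldNormal = normalize(mul((float3x3)unity_ObjectToWorld, v.normal));
    float rim = 1.0 - saturate(dot(viewDir, worldNormal));
    o.vertexAlpha = lerp(_CenterAlpha, _EdgeAlpha, rim);
}

void surf (Input IN, inout SurfaceOutputSpecColor o) {
    // ... albedo, gloss, normal map, etc. as before ...
    o.Alpha = pow(IN.vertexAlpha, _AlphaPower);
}
```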
I just don't understand why this isn't working! Why would reading the interpolated normal and then overwriting it with a normal-map lookup break things? Is this a Unity bug, or am I missing something else here?
You need to use WorldNormalVector in the surface shader if you write to o.Normal. o.Normal defaults to (0,0,1) and is in tangent space, relative to the vertex normal. See the "Surface Shader Input Structure" section of this documentation page: Unity - Manual: Writing Surface Shaders
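Following that answer, here is a minimal sketch of the fix, assuming the rest of the shader from the question is unchanged and the properties are declared as before. Write the normal-map normal first, then recover the un-bumped surface normal with WorldNormalVector(IN, float3(0,0,1)). One caveat worth verifying against the docs: once a shader writes to o.Normal, IN.viewDir is supplied in tangent space, so the rim term here rebuilds a world-space view direction from worldPos instead.

```hlsl
struct Input {
    float2 uv_MainTex;
    float2 uv_BumpMap;
    float3 worldPos;   // used to rebuild a world-space view direction
    float3 worldRefl;
    INTERNAL_DATA      // required when writing o.Normal and using worldRefl/WorldNormalVector
};

void surf (Input IN, inout SurfaceOutputSpecColor o) {
    // ... albedo, gloss, emission as in the question ...
    o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));

    // WorldNormalVector maps a tangent-space vector through the per-pixel
    // tangent frame; (0,0,1) recovers the interpolated vertex normal in
    // world space, i.e. the normal *before* the normal-map perturbation.
    float3 baseNormal = WorldNormalVector(IN, float3(0, 0, 1));

    float3 worldView = normalize(_WorldSpaceCameraPos - IN.worldPos);
    half rim = 1.0 - saturate(dot(worldView, baseNormal));
    o.Alpha = pow(lerp(_CenterAlpha, _EdgeAlpha, rim), _AlphaPower);
}
```

This keeps the normal map driving the lighting while the alpha fade comes from the smooth vertex normal, which is exactly the split the question asks for.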