‘Floating point division by zero’ warning in Unity 5.1

I’m getting a ‘floating point division by zero’ warning in all of my old legacy surface shaders with rim lighting, at this line:
half rim = 1.0 - saturate(dot(normalize(IN.viewDir), o.Normal));

They were fine in Unity 5.0.1. Did something change?


It wasn’t fine before; warnings are displayed properly now (bugfix). You should update the shaders not to generate the warning, since they won’t necessarily work properly on all GPUs.

–Eric

Yeah, but this code is from the official page: Unity - Manual: Surface Shader examples (see the rim lighting example).

Apparently that code hasn’t been updated/fixed yet.

–Eric

Well, it seems like getting rid of normalize here fixes the warning (without producing any visible difference):

half rim = 1.0 - saturate(dot(IN.viewDir, o.Normal));

That would imply that IN.viewDir was a zero vector, which should never happen; perhaps figure out why that’s happening before removing a normalize.

That is a temporary solution (to get rid of the warnings), but I think viewDir should never be unnormalized in the first place.

I’m getting the same warning with the rim shader example from the docs, and I cannot seem to find a working solution.
How would I go about fixing this?

Since this is a compile-time warning, I assume it’s not due to viewDir (or any other vector) actually being zero at runtime, but rather that the compiler cannot in some cases guarantee the input will never cause a divide-by-zero, and thus emits the warning.

If you check through Unity’s built-in shaders, you’ll find that all instances where this would generate a warning (at least in the Standard shader code) now replace normalize() with the following:

inline half3 Unity_SafeNormalize(half3 inVec)
{
    half dp3 = max(0.001f, dot(inVec, inVec));
    return inVec * rsqrt(dp3);
}

This function is defined in ‘UnityStandardBRDF.cginc’, so as of Unity 5.1 (maybe earlier versions?) you can simply replace normalize() with Unity_SafeNormalize().
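For the rim lighting line from the manual, the replacement would look something like this (a sketch, assuming your surface shader includes UnityStandardBRDF.cginc so that Unity_SafeNormalize is available):

```
#include "UnityStandardBRDF.cginc" // defines Unity_SafeNormalize

// Clamps the squared length away from zero before the rsqrt,
// so the compiler can no longer see a potential divide by zero.
half rim = 1.0 - saturate(dot(Unity_SafeNormalize(IN.viewDir), o.Normal));
```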


Nice, this works great!

I did that, and it removed the warning. Great!

I then put normalize() back, and the warning didn’t come back. I’m sure that’s the expected behavior… :stuck_out_tongue:

Similarly, the following line is now also generating a warning.

half2 screenUV = IN.screenPos.xy / IN.screenPos.w;

This line is a somewhat common practice in surface shader screen effects, like…

        struct Input
        {
            float2 uv_MainTex;
            float4 screenPos;
        };
        void surf (Input IN, inout SurfaceOutput o)
        {
            // Get the screen space position of the current pixel
            half2 screenUV = IN.screenPos.xy / IN.screenPos.w;

           ...
        }

Is there an alternative way of writing it? I’ve also tried ComputeScreenPos() and ComputeGrabScreenPos() from UnityCG.cginc, but the result is way off…

Or should I just do a max() like Unity_SafeNormalize() does, in order to ease the compiler?
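A minimal sketch of that max() approach (the epsilon here is an arbitrary small value I picked, not something from Unity’s code; it assumes screenPos.w is never meaningfully close to zero for visible pixels, which holds for a standard perspective projection):

```
// Clamp w away from zero so the compiler can prove
// the perspective divide never divides by zero.
half2 screenUV = IN.screenPos.xy / max(IN.screenPos.w, 0.0001);
```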

@Manny_Calavera : I know it’s been a while, but did you ever find a solution? If so, could you share?

Sorry, I haven’t found a solution. The warning is still there haunting me from time to time… :frowning:

A word of caution… I switched all of our shaders to Unity_SafeNormalize when this first cropped up, but found it cross-compiled oddly to assorted platforms. normalize on its own should be fine; every GPU probably has a normalize built in.

I have the same problem as @Manny_Calavera .

I know it’s an old thread, but did you ever find a solution for this issue? I’m receiving this exact warning in a shader, on this exact line:

half2 screenUV = IN.screenPos.xy / IN.screenPos.w;