Shader Works on Win, Breaks on Mac

I don’t currently have a Mac handy to test on, but one of my colleagues does and he’s sending back screenshots which look like this:

If you look at the grey screen on the right-hand side, you can clearly see the dither pattern. It’s not present on Windows.

I’m using the following shader:

Shader "Custom/UnlitAlphaMask" {

    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
		_Cutoff ("Alpha cutoff", Range(0,1)) = 0.5
    }
    Category {
	   Alphatest Greater [_Cutoff]
	   AlphaToMask True
	   ColorMask RGB
       Lighting Off
       ZWrite On
       Cull Back
       SubShader {
            Pass {
               SetTexture [_MainTex] {
                    constantColor [_Color]
                    Combine texture * constant, texture * constant
                 }
            }
        }
    }
}

It’s just supposed to create a simple fullbright effect with alpha masking. I’m probably missing something important in the shader, but I haven’t played with shaders much in Unity yet and I’m not sure where I’m going wrong.

Looks like nVidia’s crappy AlphaToCoverage.

Thanks for the explanation, BAA. I didn’t realize that alpha-to-coverage was a problem on the Mac, but a forum search does seem to indicate that it’s currently only safe to use on Windows, so I think it’s probably best to disable it. Unless there’s some way to compile different shaders for each OS?
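
For reference, here’s the variant I’m planning to try: the same shader with the AlphaToMask line removed, so the cutout relies on the plain alpha test alone. I haven’t been able to verify it on a Mac yet, so take it as a sketch rather than a confirmed fix.

Shader "Custom/UnlitAlphaMask" {
    Properties {
        _Color ("Main Color", Color) = (1,1,1,1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Cutoff ("Alpha cutoff", Range(0,1)) = 0.5
    }
    Category {
        // AlphaToMask removed: no alpha-to-coverage, just a hard alpha test
        AlphaTest Greater [_Cutoff]
        ColorMask RGB
        Lighting Off
        ZWrite On
        Cull Back
        SubShader {
            Pass {
                SetTexture [_MainTex] {
                    constantColor [_Color]
                    Combine texture * constant, texture * constant
                }
            }
        }
    }
}

The hard cutoff means the edges won’t be as soft as the dithered alpha-to-coverage version, but it should at least behave the same on both platforms.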

I’ll check back with my colleague to confirm that this fixes it on his machine, but hopefully that will do it.

Thanks again.