Shader not being drawn on iOS

Hello,

I’m trying to make a simple cutout interface shader (with no lighting) for my project. It uses a variable named _Cutoff to decide which pixels to draw: the texture’s alpha channel is compared against the cutoff, and a pixel that passes is drawn with an alpha of 1, while one that fails is not drawn at all. I first tried doing this in pure ShaderLab using AlphaTest; it worked fine in the editor but not on iOS (I’ve put a rough reconstruction of that version at the end of this post). Then I wrote a simple fragment shader for it, which you can see below:

Shader "GUI/Cutout" {
	Properties {
		_MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
		_Cutoff ("Alpha cutoff", Range(0,1)) = 0.5
	}

	SubShader {
		
		Tags { "Queue" = "Transparent" }
		Blend SrcAlpha OneMinusSrcAlpha
		ColorMask RGBA
		Cull Off
		Lighting Off
		ZWrite Off
		ZTest Always
		Fog { Mode Off }
		
		Pass {
			CGPROGRAM
			#pragma vertex vert
			#pragma fragment frag
			#include "UnityCG.cginc"

			float _Cutoff;
			sampler2D _MainTex;
			float4 _MainTex_ST;

			struct v2f {
				float4 pos : SV_POSITION;
				float2 uv : TEXCOORD0;
			};

			v2f vert (appdata_base v)
			{
				v2f o;
				o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
				o.uv = TRANSFORM_TEX (v.texcoord, _MainTex);
				return o;
			}

			half4 frag (v2f i) : COLOR
			{
				half4 texcol = tex2D (_MainTex, i.uv);
				// Draw the texel fully opaque when its alpha passes the cutoff,
				// otherwise output a fully transparent pixel
				if (texcol.a > _Cutoff) return half4(texcol.rgb, 1);
				else return half4(0, 0, 0, 0);
			}

			ENDCG
		}
	}
}

This works fine in the editor, but nothing at all is drawn on an iOS device. Any ideas what I might be doing wrong? I’m using Unity 3.5.7.
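For reference, here is roughly what the AlphaTest-only version looked like (I’m reconstructing it from memory, so minor details may differ); it behaved the same way, fine in the editor but nothing on the device:

Shader "GUI/Cutout AlphaTest" {
	Properties {
		_MainTex ("Base (RGB) Trans (A)", 2D) = "white" {}
		_Cutoff ("Alpha cutoff", Range(0,1)) = 0.5
	}
	SubShader {
		Tags { "Queue" = "Transparent" }
		Lighting Off
		ZWrite Off
		// Keep a pixel only when its alpha is greater than _Cutoff
		AlphaTest Greater [_Cutoff]
		Pass {
			SetTexture [_MainTex] { combine texture }
		}
	}
}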

Does the texture you’re using on the iOS device actually have an alpha channel?

If the shader itself failed on the device, the object would show up pink; if instead the texture’s alpha channel is missing, alpha would read as 0 in the shader and the object would be invisible.

Check the texture’s import settings on a Mac.

You might have better luck with:

half4 texcol = tex2D (_MainTex, i.uv);
texcol.a = step(_Cutoff, texcol.a);
return texcol;

It makes for a slightly neater shader, and it may remove the if branching (technically step() is still a comparison, but I’m pretty sure it’s built into the hardware, so it should be faster than an explicit if).
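Dropped into your frag function, it would look something like this (untested, but it should be a drop-in replacement):

half4 frag (v2f i) : COLOR
{
	half4 texcol = tex2D (_MainTex, i.uv);
	// step(_Cutoff, texcol.a) is 1 when texcol.a >= _Cutoff, otherwise 0
	texcol.a = step(_Cutoff, texcol.a);
	return texcol;
}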

Would it not be cheaper to use clip()? No idea whether that works decently on iOS devices, though.

clip() throws the pixel away on the video card (best case); worst case, the rest of the shader still runs but the result isn’t written out. It’s perfect for cutout scenarios.

clip() takes a float value: anything below zero gets thrown out, everything else is kept.
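In your shader that would look something like this (just a sketch, I haven’t tried it on iOS):

half4 frag (v2f i) : COLOR
{
	half4 texcol = tex2D (_MainTex, i.uv);
	// clip() discards the fragment when its argument is negative,
	// i.e. when texcol.a is below _Cutoff
	clip(texcol.a - _Cutoff);
	return texcol;
}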

Alpha testing, and the more general clip()/discard, are very expensive on iOS because of the tile-based deferred rendering architecture of the PowerVR chips.

That’s cool to know, thanks!