Consider a super-simple scene with a “ground” box, another box on top, a capsule to the top-left and a player mesh in the middle:
https://dl.dropboxusercontent.com/u/136375/img/screens/unity-alpha.png
Each has its own material, but ALL of them use the same hand-coded, ultimately bog-standard vert+frag shader program.
That single-pass shader (#pragma target 3.0, #pragma glsl) outputs an alpha of 1 for the fragment color. It is also tagged RenderType=Opaque and Queue=Geometry, with ColorMask RGBA.
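For reference, the opaque variant's setup looks roughly like this (a minimal sketch, not my exact code; the shader name, the _Color property and the trivial vert/frag bodies are placeholders):

Shader "Custom/SimpleOpaque" {   // placeholder name, not my real shader
    Properties {
        _Color ("Color", Color) = (1,1,1,1)
    }
    SubShader {
        Tags { "RenderType"="Opaque" "Queue"="Geometry" }
        ColorMask RGBA
        Pass {
            CGPROGRAM
            #pragma target 3.0
            #pragma glsl
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;

            struct v2f { float4 pos : SV_POSITION; };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : COLOR {
                // opaque variant: alpha hard-coded to 1
                return fixed4(_Color.rgb, 1.0);
            }
            ENDCG
        }
    }
}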
Now I have a second shader permutation, logically identical to the above (in fact both use the same include), except that the fragment color's alpha comes from a uniform, and it is tagged RenderType=Transparent and Queue=Transparent, with ColorMask RGBA and Blend SrcAlpha OneMinusSrcAlpha.
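The transparent permutation differs only in the tags, the blend line and the uniform-driven alpha, roughly like so (again just a sketch with placeholder names such as _Alpha, not my actual code):

Shader "Custom/SimpleTransparent" {   // placeholder name
    Properties {
        _Color ("Color", Color) = (1,1,1,1)
        _Alpha ("Alpha", Range(0,1)) = 0.5
    }
    SubShader {
        Tags { "RenderType"="Transparent" "Queue"="Transparent" }
        ColorMask RGBA
        Blend SrcAlpha OneMinusSrcAlpha
        Pass {
            CGPROGRAM
            #pragma target 3.0
            #pragma glsl
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;
            float _Alpha;

            struct v2f { float4 pos : SV_POSITION; };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : COLOR {
                // transparent variant: alpha driven by a uniform
                return fixed4(_Color.rgb, _Alpha);
            }
            ENDCG
        }
    }
}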
Simple enough, right? With standard alpha blending, the render queues ensure all opaque geometry is rendered before the transparent geometry, and the normal blend mode should composite them together.
Now, who has ANY ideas whatsoever as to what could possibly cause Unity (forward rendering path with -force-opengl) to blend the player's skinned mesh with the transparent box, but NOT the ground box or the capsule that is in fact intersecting the transparent box? I'm at a loss even after reading and rereading the relevant doc sections. I should be doing everything right in terms of the queue and blend settings!