Shadows have a mind of their own

I’m trying to make a shader that receives shadows and selectively outputs transparency based on the texture.

Even though my shader works, when I output transparent fragments Unity still applies a translucent shadow to the model in 3D space, and uses the geometry to occlude any shadows behind it, depending on the current viewing angle. This can be seen in the attached screenshot. Note the following:

  • Even though the female figure is transparent (the floor and cube can be seen behind it), there is a shadow being applied to the model.
  • The male character’s shadow and the cube’s shadow are occluded by her hip and hand, respectively.

[26962-screen+shot+2014-05-27+at+11.53.58+am.png|26962]

Here’s the subshader header. The body of the shader doesn’t matter: even if I change the fragment shader to simply output fixed4(0,0,0,0), I still get this behavior. If I change the queue to Transparent, the behavior stops, but SHADOW_ATTENUATION then always returns 1, making my shader useless.

SubShader {
   Tags { "Queue" = "Geometry" "RenderType" = "Opaque" }
   Pass {
      ZWrite Off
      Blend SrcAlpha OneMinusSrcAlpha

Now, I can live with the shadow occlusion (though I would like to understand why it happens), but the extra automatic pass that applies shadows to my model is something I would like to remove, as it feels utterly pointless. Can I somehow tell Unity that I am handling shadows myself, and that it should please leave my poor mesh alone?
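For what it’s worth, ShaderLab does have SubShader tags that opt a material out of parts of Unity’s built-in shadow and projector machinery. The sketch below keeps the Geometry queue but adds them; "ForceNoShadowCasting" and "IgnoreProjector" are real tags, but whether they (or changing RenderType, which controls whether the mesh is written into the camera’s depth texture) actually suppress the shadow pass you are seeing is something you would need to test — this is a guess, not a confirmed fix:

   SubShader {
      // Speculative: tags to try, not a verified solution.
      Tags {
         "Queue" = "Geometry"
         "RenderType" = "Transparent"        // keep the mesh out of the opaque depth texture
         "ForceNoShadowCasting" = "True"     // this subshader never casts shadows
         "IgnoreProjector" = "True"          // skip Projector-based effects
      }
      Pass {
         ZWrite Off
         Blend SrcAlpha OneMinusSrcAlpha
         ...
      }
   }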

Thanks in advance for any advice.

This answer is partial, but I hope it puts you on the right path.

The basic idea of how realtime shadows work is that each shadow-casting light renders a depth buffer from the light’s point of view. Because it’s a depth buffer, only opaque objects can go into it (i.e. only opaque objects can cast shadows). During normal rendering, the shader takes the current pixel position, computes its depth from the light’s point of view, and compares that depth against the corresponding shadow depth buffer pixel. If the current pixel’s depth is greater than the stored depth, the pixel is shadowed; if it’s less, it isn’t.

While only opaque objects can cast shadows, it should be possible for transparent objects to receive them. That means putting your object into the transparent queue (which is where it belongs, since it outputs transparency) should yield shadow attenuation values other than 1. The fact that you always get 1 is the central problem, and what you should try to figure out. If you are using deferred rendering, try forward rendering; that might force Unity to do the right thing for transparent shadow receivers. If that doesn’t work, do some research on transparent shadow receivers in Unity. You should be able to find a solution (I hope).
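If you go down that road, one possible shape for a forward-rendered transparent receiver is sketched below. The AutoLight.cginc macros (LIGHTING_COORDS, TRANSFER_VERTEX_TO_FRAGMENT, SHADOW_ATTENUATION) and #pragma multi_compile_fwdbase are real Unity mechanisms, but whether they return attenuation other than 1 in the Transparent queue is exactly the open question, so treat this as a starting point for experiments, not a working answer:

   SubShader {
      Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }
      Pass {
         Tags { "LightMode" = "ForwardBase" }
         ZWrite Off
         Blend SrcAlpha OneMinusSrcAlpha
         CGPROGRAM
         #pragma vertex vert
         #pragma fragment frag
         #pragma multi_compile_fwdbase      // compile shadow-sampling variants
         #include "UnityCG.cginc"
         #include "AutoLight.cginc"

         struct v2f {
            float4 pos : SV_POSITION;
            LIGHTING_COORDS(0,1)            // interpolators for shadow/light coords
         };

         v2f vert (appdata_full v) {
            v2f o;
            o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
            TRANSFER_VERTEX_TO_FRAGMENT(o); // fill in the shadow coordinates
            return o;
         }

         fixed4 frag (v2f i) : COLOR {
            fixed atten = SHADOW_ATTENUATION(i); // 1 = lit, 0 = fully shadowed
            // Visualize the attenuation at half opacity, just for debugging.
            return fixed4(atten, atten, atten, 0.5);
         }
         ENDCG
      }
   }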