How can you make a Sprite Shader that properly uses the z coordinate?

Right now I am using a shader that is exactly the same as the default “Sprites/Default” shader, with the exception of ZWrite being On instead of Off. This change makes the shader accomplish most of what I want it to do, but unfortunately it causes a strange rendering problem:

As you can see in the above clip, some of the tiles will cause the tiles behind them not to be rendered (look at the flowers and trees). Whether or not this occurs changes based on the camera angle, so I’m demonstrating this by swapping between 2D and 3D mode, which moves the camera.

I imagine this has something to do with culling, but I’m not familiar enough with shaders to know how to fix it. I’m not sure if it makes any difference, but all the tiles are drawn with MeshRenderers, not SpriteRenderers. Any ideas would be appreciated. Thank you!

Well, with ZWrite on you prevent other things from being drawn underneath. And since your sprites are probably not depth-sorted back-to-front (that sorting is what the Transparent render queue normally provides), sometimes a quad behind an already-rendered sprite gets drawn later, and its pixels are discarded because they fail the depth test — including pixels behind the sprite’s fully transparent areas, since those transparent texels still wrote to the depth buffer.
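To illustrate, this is roughly the combination that causes the problem (assuming your shader is a copy of Sprites/Default with only ZWrite changed — the exact tags may differ in your copy):

```shaderlab
// Problematic setup: transparent blending + depth writes, no guaranteed sorting
Tags { "Queue"="Transparent" "RenderType"="Transparent" }
Blend SrcAlpha OneMinusSrcAlpha
ZWrite On  // even fully transparent texels write depth,
           // so anything drawn behind them later fails the depth test
```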

Since you want depth information, you need to discard the pixels that are invisible. So why not use an alpha-cutout shader? Is there a special reason you want to use the sprite shader?
You might also want to consider using alpha-to-coverage to give the cutout edges some MSAA (though from the pixelated art style I wonder if anti-aliasing is even desired :slight_smile: )
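A minimal cutout sketch of that idea, in the built-in render pipeline’s ShaderLab (the shader name, `_Cutoff` property, and threshold value are illustrative choices, not taken from your project):

```shaderlab
Shader "Custom/SpriteCutoutZWrite"
{
    Properties
    {
        _MainTex ("Sprite Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1,1,1,1)
        _Cutoff ("Alpha Cutoff", Range(0,1)) = 0.5
    }
    SubShader
    {
        // AlphaTest queue renders after opaque geometry; no blending needed
        Tags { "Queue"="AlphaTest" "RenderType"="TransparentCutout" }
        Cull Off
        ZWrite On

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            fixed4 _Color;
            fixed _Cutoff;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                fixed4 col = tex2D(_MainTex, i.uv) * _Color;
                // Discard transparent pixels so they never write depth
                // and can't occlude the tiles behind them.
                clip(col.a - _Cutoff);
                return col;
            }
            ENDCG
        }
    }
}
```

For the alpha-to-coverage variant, you would add `AlphaToMask On` to the pass (and typically drop the `clip()` call) so MSAA can soften the cutout edges — but again, with pixel art the hard edges may be exactly what you want.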