Hello,
I’m building a 2D game using a character setup similar to Mika Mobile’s 2D/3D character rig. I’ve got a bunch of textured quads rigged and animated in Maya, and everything works in Unity. We’re using a perspective camera with a low FOV placed about 200 units away from the character.
BUT, the only shaders that seem to work are transparent cutout shaders. This would be fine, except I’m getting terrible aliasing around the edges of my character. If I change my material to Transparent/Cutout/Soft Edge Unlit, the edges look great, but when I tested this on my Android device (Galaxy S II), all my textures have some kind of crazy alpha thing going on where I can “see through opaque objects”. It doesn’t look like simple transparency; rather, it looks like multiple quads’ textures are being composited together.
If I try anything other than a cutout shader, the Z order of the pieces of my character gets all messed up (the foot farthest from the camera renders in front of the nearest foot, and the arm closest to the camera renders behind the body and the other arm).
Can anyone tell me:
- any reason why the Transparent/Cutout/Soft Edge Unlit shader doesn’t work on Android?
- why using any shader other than a cutout shader breaks the Z order?
- suggestions on how to adjust my art pipeline to get some good-looking characters…
Thanks in advance…
You HAVE to render your sprites back to front when transparency is involved, unless you use an AlphaTest shader (Cutout).
Either you:
Have depth writing/testing on, in which case the transparent pixels are still written to the depth buffer, so any object drawn later in the queue thinks it’s behind something (the invisible parts of your quad) and the GPU doesn’t waste time drawing those pixels.
Or depth writing/testing is off, and the images are simply overlaid in the order you give them to the GPU, so the second image drawn looks like it’s in front even if it is positionally behind.
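The first failure mode can be sketched in a few lines of plain Python (a toy model, not the Unity API): a tiny software depth buffer where even fully transparent pixels write depth, so a quad drawn later, but sitting behind the transparent “hole”, never shows through.

```python
# Toy model: transparent pixels that write depth block later draws behind them.

def draw(quad, depth_buffer, color_buffer, z_write=True):
    """Draw one 'quad' (a dict of pixel -> (alpha, color, z)) with depth test."""
    for px, (alpha, color, z) in quad.items():
        if z < depth_buffer.get(px, float("inf")):  # depth test: closer wins
            if z_write:
                # With Z-write on, even an alpha-0 pixel claims the depth
                # buffer, blocking anything drawn behind it afterwards.
                depth_buffer[px] = z
            if alpha > 0:
                color_buffer[px] = color

depth, color = {}, {}
# Front quad drawn FIRST, with a transparent pixel at (0, 0):
front = {(0, 0): (0.0, "front", 1.0), (1, 0): (1.0, "front", 1.0)}
# Back quad drawn second; its pixel at (0, 0) should be visible through the hole:
back = {(0, 0): (1.0, "back", 2.0)}

draw(front, depth, color)
draw(back, depth, color)
print(color.get((0, 0)))  # None: the back pixel failed the depth test
```

Drawing back to front (or using alpha test so transparent pixels are discarded before the depth write) avoids this.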
Note: Also, looking at your camera setup, is your near clipping plane, say, 199 units out? Otherwise you could get depth fighting if that Android device uses anything smaller than a 24-bit depth buffer (the most common type), since most of your available depth values would sit in completely unused space.
Perspective projections front-load their available depth values, so roughly 90% of your values fall in the closest 10% of the range between the clipping planes (meaning close objects get more distinct depth values in the depth buffer). Putting all your objects in close proximity at the far end of your clipping range can therefore cause issues. Orthographic projection has an even distribution of depth values from near to far.
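To put rough numbers on that nonlinearity, here is a small Python sketch using the common D3D-style 0..1 window-depth mapping (your platform’s exact convention may differ, but the shape of the curve is the same):

```python
# How perspective projection distributes depth values:
# window depth maps z = near -> 0 and z = far -> 1, but very nonlinearly.
def window_depth(z, near, far):
    return (far / (far - near)) * (1.0 - near / z)

near, far = 0.3, 1000.0  # example clipping planes
# The eye-space distance that uses up HALF of all depth values
# (solve window_depth(z) = 0.5 for z):
z_half = 2.0 * far * near / (far + near)
print(round(z_half, 3))  # ~0.6 units out of a 1000-unit range
```

So with these example planes, half of the depth buffer’s precision is spent on the first ~0.6 units; a character sitting 200 units out gets a tiny slice of the remaining values, which is why pulling the near plane out toward the character helps.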
This all has to do with how the GPU converts 3D (Game World) to 2D (Your Screen) as efficiently as possible.
If you are using SpriteManager or the like, you will need to make sure that the ordering of the sprites in the drawing list is back to front within each batched group, and that the centers of the batched groups are ordered back to front as well. After that, Unity should be able to take care of ordering each mesh properly.
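That two-level sort can be sketched in plain Python (hypothetical data layout, not the actual SpriteManager API): sort each group’s sprites by distance from the camera, farthest first, then sort the groups by the distance of their centers.

```python
# Hypothetical sketch: back-to-front ordering for batched sprite groups.
from math import dist  # Euclidean distance, Python 3.8+

camera = (0.0, 0.0, -200.0)  # low-FOV perspective camera, far from the rig

groups = [
    {"center": (0, 0, 1.0),
     "sprites": [("near_arm", (0, 0, 0.5)), ("body", (0, 0, 1.5))]},
    {"center": (0, 0, 3.0),
     "sprites": [("far_arm", (0, 0, 3.0))]},
]

# Back to front = farthest first, so sort by distance descending:
for g in groups:
    g["sprites"].sort(key=lambda s: dist(camera, s[1]), reverse=True)
groups.sort(key=lambda g: dist(camera, g["center"]), reverse=True)

draw_order = [name for g in groups for name, _ in g["sprites"]]
print(draw_order)  # ['far_arm', 'body', 'near_arm']
```

Note this has to be redone whenever the camera or the character moves enough to change the ordering, which is part of why alpha-test (cutout) shaders, which need no sorting, are attractive for rigged 2D characters.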