How does Unity handle alpha sorting?

We've written an MOV material that plays a PNG-compressed MOV file with alpha on a poly. It's pretty cool, though a little memory-intensive. Not a problem for this application.

Here's the problem/question:

Problem: I'm seeing alpha flutter between two polygons that overlap each other in the game camera. The distance between them is way more than it needs to be to avoid Z-buffer artifacts, so nope, it's not that.

Question: How does Unity decide which pixels of an alpha texture to display? Is it by object center or object bounding volume?

Here's what we're trying for: http://www.lindsaydigital.com/clients/MBAQ/webPlayer/test_006.html

One thing that may be helpful here is to use non-standard render queues in your shaders. If you replace

` Tags {"Queue" = "Transparent" }`

with

` Tags {"Queue" = "Transparent+1" }`,

that material will be rendered after all normal transparent materials.
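
For context, here's a minimal sketch of where that tag sits in a complete shader. It assumes a simple alpha-blended texture material; the shader name and the `_MainTex` property are placeholders, not the original MOV material:

```
// Minimal sketch only: a basic alpha-blended texture shader using the custom queue.
Shader "Custom/TransparentPlusOne" {
    Properties {
        _MainTex ("Texture (RGBA)", 2D) = "white" {}
    }
    SubShader {
        // "Transparent" is queue 3000, so "Transparent+1" renders at 3001,
        // after all materials left in the standard Transparent queue.
        Tags { "Queue" = "Transparent+1" "RenderType" = "Transparent" }
        ZWrite Off
        Blend SrcAlpha OneMinusSrcAlpha
        Pass {
            SetTexture [_MainTex] { combine texture }
        }
    }
}
```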

The answer seems to be: "Bounding Volume".

I added a triangle to the object I wanted in front and moved it out towards the camera, making the bounding volume of the smaller, "front" object larger than the background object's. It's a hack, but it works.
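
A hypothetical script version of the same idea, in case editing the mesh isn't convenient (all names here are placeholders, and it assumes transparent sorting really is driven by the bounding volume, as concluded above):

```csharp
using UnityEngine;

// Sketch only: instead of adding an extra triangle, stretch the mesh's bounds
// toward the camera from script so the sorter treats this object as closer.
public class StretchBoundsTowardCamera : MonoBehaviour
{
    // How far, in the mesh's local units, to pull the bounds forward.
    public float extraDepth = 5f;

    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Bounds b = mesh.bounds;

        // Encapsulate a point out in front of the mesh so the bounds center
        // shifts toward the camera. The offset is in local space, so adjust
        // the axis to match which way your object faces the camera.
        b.Encapsulate(b.center + new Vector3(0f, 0f, -extraDepth));
        mesh.bounds = b;
    }
}
```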

Spence.