I have a number of planes with transparent textures on them arranged in a row but staggered depth-wise. The camera pans along them horizontally. The problem is that, since the camera is so close to the planes, a plane occasionally gets drawn as if it were closer to the camera than its neighbour, simply because its pivot/anchor/origin is nearer to the camera, even though the plane itself is further back in z-space.
This wouldn’t be a problem, though, if the depth sort were tested against a point other than the camera’s position – say, a modified point that was always camera.z-20. Is it possible to do something like that?
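To illustrate the kind of control I mean, here’s a hypothetical sketch (not working project code – the script name, field names and queue offset are all made up): a script that ignores distance to the camera entirely and just assigns each plane its own slot in the transparent render queue based on the plane’s z, so the draw order follows the actual depth stagger.

```csharp
using System.Linq;
using UnityEngine;

// Hypothetical sketch: force an explicit back-to-front draw order for a set of
// staggered planes by giving each one its own value in the transparent queue,
// based on the plane's z position rather than its distance to the camera.
public class PlaneSortByZ : MonoBehaviour
{
    public Renderer[] planes;      // the staggered planes, assigned in the Inspector
    const int baseQueue = 3001;    // just above Unity's default Transparent queue (3000)

    void LateUpdate()
    {
        // Assumes the camera looks down +z, so the plane with the largest z is
        // furthest away and must be drawn first (lowest queue value).
        // Flip to OrderBy if your camera looks down -z.
        var ordered = planes.OrderByDescending(r => r.transform.position.z).ToArray();
        for (int i = 0; i < ordered.Length; i++)
            ordered[i].material.renderQueue = baseQueue + i;
    }
}
```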
Age-old problem. With the scene you’re currently describing and standard shaders, it’s unsolvable.
If your shader is taking a radically different approach, such as Depth Peeling, it’s possible. I don’t know if anyone has ever made a Unity shader that does it, or whether it’s even possible with the pipeline Unity exposes, but it’s an expensive approach depending on how many slices there are: every slice = one render pass.
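To see where that cost comes from, here’s a CPU-side sketch of the idea for a single pixel – this is not a Unity shader, just an illustration, and the types and names are invented for it. Each pass keeps only the fragments behind everything peeled so far, takes the nearest of those, and composites it front to back, so N overlapping transparent layers need N passes.

```csharp
using System.Collections.Generic;
using System.Linq;

// Conceptual, CPU-side illustration of Depth Peeling for ONE pixel.
// A real implementation does this on the GPU: each render pass peels the
// nearest remaining transparent layer using the depth written by the
// previous pass, which is why every slice costs a full render pass.
struct Fragment
{
    public float Depth;           // distance from the camera
    public float R, G, B, Alpha;  // the fragment's colour and opacity
}

static class DepthPeelingSketch
{
    // Composite up to maxPasses transparent layers, front to back.
    public static (float r, float g, float b) Composite(List<Fragment> fragments, int maxPasses)
    {
        float r = 0, g = 0, b = 0;
        float remaining = 1f;                        // how much of the pixel is still uncovered
        float peeledDepth = float.NegativeInfinity;  // depth of the last peeled layer

        for (int pass = 0; pass < maxPasses; pass++)
        {
            // "Peel": of the fragments strictly behind everything peeled so far,
            // take the nearest one. On the GPU this is one full scene render.
            var candidates = fragments.Where(f => f.Depth > peeledDepth).ToList();
            if (candidates.Count == 0) break;
            var layer = candidates.OrderBy(f => f.Depth).First();

            // Front-to-back "over" compositing.
            r += remaining * layer.Alpha * layer.R;
            g += remaining * layer.Alpha * layer.G;
            b += remaining * layer.Alpha * layer.B;
            remaining *= (1f - layer.Alpha);

            peeledDepth = layer.Depth;
        }
        return (r, g, b);
    }
}
```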
The best you can do to emulate Depth Peeling is to split your planes into smaller meshes, making sure the partitions will always sort front to back at all viewing angles. This only truly works if the planes are never meant to move, though.
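As a rough sketch of that kind of partitioning (hypothetical – the component name, grid counts and material field are placeholders), each big plane gets replaced by a grid of smaller quads, each with its own Renderer, so the per-object transparency sort has much finer pieces to order.

```csharp
using UnityEngine;

// Hypothetical sketch: replace one large plane with a grid of smaller quads,
// each with its own Renderer, so Unity's per-object sort has finer pieces
// whose pivots better reflect their actual depth.
public class PlaneSplitter : MonoBehaviour
{
    public Material planeMaterial;
    public int columns = 8;
    public int rows = 2;
    public Vector2 planeSize = new Vector2(8f, 2f);

    void Start()
    {
        Vector2 cell = new Vector2(planeSize.x / columns, planeSize.y / rows);
        for (int x = 0; x < columns; x++)
        {
            for (int y = 0; y < rows; y++)
            {
                // One small quad per cell, parented under this (empty) plane object.
                var piece = GameObject.CreatePrimitive(PrimitiveType.Quad);
                piece.transform.SetParent(transform, false);
                piece.transform.localPosition = new Vector3(
                    (x + 0.5f) * cell.x - planeSize.x * 0.5f,
                    (y + 0.5f) * cell.y - planeSize.y * 0.5f,
                    0f);
                piece.transform.localScale = new Vector3(cell.x, cell.y, 1f);
                piece.GetComponent<Renderer>().material = planeMaterial;
                Destroy(piece.GetComponent<Collider>());
            }
        }
    }
}
```

Note that per-slice UV offsets are omitted here; in practice you’d also need to offset/scale each piece’s UVs (or build custom meshes) so the texture lines up across the slices.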