How can I take depth into account when rendering to texture?

Okay, so basically I have an object with particles around it. I want to apply a special image effect (metaballs) only to my particles, so I put the particles on a separate layer and render to a texture from a separate camera that only sees this layer. I create a material that uses this render texture and apply it to a plane that rotates with my main camera. Everything works as expected, except that since I’m rendering to a plane, and the plane is closer to my camera than the actual scene, it gets drawn over my scene. How can I take depth into account when rendering to my texture, so that my geometry is drawn over the particles? Or is my setup wrong? Any ideas?
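In case it helps, this is roughly how my setup looks in code (just a sketch; the layer and variable names are placeholders for my actual ones):

using UnityEngine;

// Sketch of the setup described above: the blob camera sees only the
// particle layer and renders into a texture, which the plane's material
// then displays.
public class BlobSetup : MonoBehaviour
{
	public Camera blobCamera;       // second camera, sees only the particle layer
	public RenderTexture blobTexture;
	public Material planeMaterial;  // material on the plane in front of the main camera

	void Start ()
	{
		blobCamera.cullingMask = 1 << LayerMask.NameToLayer ("Blobs");
		blobCamera.targetTexture = blobTexture;
		planeMaterial.mainTexture = blobTexture;
	}
}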

What my scene looks like: http://i.imgur.com/mRCSC3U.png

What the game view looks like: http://i.imgur.com/McojxnE.png

Main camera settings: http://i.imgur.com/SEX5KtU.png

Blob camera settings: http://i.imgur.com/m36YL1D.png

You might need a third camera that is the same as your main camera. Have the new camera be the one that renders the plane with the render texture for the blobs, and set its Camera depth so it draws after the main camera. You might be able to do this with the blob camera itself if the effect isn’t fullscreen.
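A rough sketch of that idea (assuming the plane showing the render texture sits on its own layer, here called "BlobPlane"; all names are illustrative):

using UnityEngine;

// Third camera that only sees the overlay plane and draws after the main
// camera, on top of the image the main camera already produced.
public class OverlayCameraSetup : MonoBehaviour
{
	public Camera mainCamera;
	public Camera overlayCamera; // copy of the main camera

	void Start ()
	{
		int planeLayer = LayerMask.NameToLayer ("BlobPlane");

		// Main camera renders everything except the overlay plane.
		mainCamera.cullingMask &= ~(1 << planeLayer);
		mainCamera.depth = 0;

		// Overlay camera renders only the plane, after the main camera.
		// Clearing only depth keeps the main camera's image underneath.
		overlayCamera.cullingMask = 1 << planeLayer;
		overlayCamera.depth = 1;
		overlayCamera.clearFlags = CameraClearFlags.Depth;
	}
}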

Why do you draw the render texture to a plane in the scene? If the particles are on a separate layer with a separate camera, it would probably make more sense to apply the metaball material as an image effect in that camera’s OnRenderImage:

// Metaball material, assigned in the Inspector.
public Material material;

void OnRenderImage (RenderTexture source, RenderTexture destination)
{
	Graphics.Blit (source, destination, material);
}
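This script goes on the particle camera. Unity calls OnRenderImage after the camera has finished rendering its layer, and Graphics.Blit runs the source image through the material’s shader before it reaches the screen, so you don’t need a plane in the scene at all.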