Render texture from a camera's targetTexture produces seams.

I am attempting to render a specific section of my scene using a separate camera and a render texture. The object I want to capture is on a layer that the main camera does not render, but a secondary camera does. The secondary camera's target texture is set to a render texture that I have created. Everything works as intended, except that the object, when rendered to the texture, has a bunch of seams that are not present when rendering directly to the screen.
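For reference, the two-camera setup described above can be sketched roughly like this (the layer name `Crowd`, the component name, and the serialized references are assumptions, not from the original post):

```csharp
using UnityEngine;

// Sketch of the setup: a secondary camera renders one layer into a
// render texture, and the main camera skips that layer.
public class CrowdCapture : MonoBehaviour
{
    public Camera crowdCamera;         // the secondary camera
    public RenderTexture crowdTexture; // the render texture it draws into

    void Start()
    {
        int crowdLayer = LayerMask.NameToLayer("Crowd");

        // Secondary camera renders only the "Crowd" layer...
        crowdCamera.cullingMask = 1 << crowdLayer;
        // ...into the render texture instead of the screen.
        crowdCamera.targetTexture = crowdTexture;

        // Main camera renders everything except that layer.
        Camera.main.cullingMask &= ~(1 << crowdLayer);
    }
}
```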

What it looks like when rendered directly to the screen:

What it looks like when rendered to a texture, and then displayed on a quad in the scene:

Notice how the second image has a bunch of transparent “lines” in between the sprites where there shouldn’t be any.

I am using a basic transparent shader to display the render texture on the quad (since the background isn’t part of the render texture, just the black crowd part). I have tried a number of different shaders, and none of them seem to make a difference.

The render texture’s settings are:
Width: Screen.width
Height: Screen.height
Format: RenderTextureFormat.ARGBFloat
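Created in code, those settings correspond to roughly the following (the depth-buffer size of 24 bits is an assumption; the post doesn't mention it):

```csharp
// Screen-sized float render texture matching the settings listed above.
var rt = new RenderTexture(Screen.width, Screen.height, 24,
                           RenderTextureFormat.ARGBFloat);
rt.Create();
```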

Unity Version: 5.2.3f1 - iOS Platform

Edit: The reason I am doing this is so that I can apply a “Blur” image effect to the texture, and make the crowd in the foreground appear to be out of focus. Any alternative suggestions for how to do this are also welcome.

Turns out that the shader I was using for my scene was set to "Blend SrcAlpha OneMinusSrcAlpha" for some reason, when it should have been using "Blend One OneMinusSrcAlpha". The traditional alpha blend was causing objects with alpha less than 1 to make the objects underneath them semi-transparent as well, exposing the camera's clear colour background through the overlaps.
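In ShaderLab terms, the fix is a one-line change in the shader's Pass, switching from traditional to premultiplied alpha blending. With premultiplied alpha, the RGB written into the render texture already has alpha baked in, so alpha is not applied a second time when the quad composites the texture into the scene:

```
// Traditional alpha blending - applies alpha twice when the result
// is itself composited as a texture, producing the seams:
// Blend SrcAlpha OneMinusSrcAlpha

// Premultiplied alpha - correct for rendering into a texture that
// will be blended into the scene again:
Blend One OneMinusSrcAlpha
```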