Blending two textures from different cameras

Hello! I’m having trouble rendering a second camera’s texture on top of another camera’s output.
The main camera renders directly to the screen; the second camera renders to a texture with the settings shown below.
[Screenshot: Camera settings]

The thing is, even though the background color is set to fully transparent RGBA (0, 0, 0, 0), the texture I get has a black background, and partly transparent objects get blended with it. As a result, when I try to put the RenderTexture above my main camera’s output, there is a lot of unwanted black.

I get the desired look if I render the second camera directly to the display with a higher depth, but then it renders on top of everything drawn by cameras with lower depth, which is an unwanted effect.

Here’s a short recording of what I get: in it you can see that all particles get that black background, and that the render texture is transparent only where particles are absent.

Does anyone know what I’m missing to blend these two textures in the desired way, so that the particles don’t get blended with black (or whatever color is set as the cameras’ background)?

After a few days of suffering, I’ve found the solution: use the external-alpha code from the default Unity shader “Sprites-Default” on the semi-transparent particles I’m rendering to the texture. The key part is the premultiplied-alpha blending: with the regular Blend SrcAlpha OneMinusSrcAlpha into a (0, 0, 0, 0) target, a pixel with alpha a is written as rgb * a with alpha a * a, i.e. darkened toward the clear color and with too little alpha, so it gets darkened again when the texture is composited over the screen. With Blend One OneMinusSrcAlpha and the color premultiplied in the fragment shader, the texture ends up with correct color and correct alpha instead.

    Blend One OneMinusSrcAlpha
    ColorMask RGBA

    #pragma multi_compile _ ETC1_EXTERNAL_ALPHA

    struct appdata_t
    {
        UNITY_VERTEX_INPUT_INSTANCE_ID
        // other stuff
    };

    struct v2f
    {
        UNITY_VERTEX_OUTPUT_STEREO
        // other stuff
    };

    v2f vert (appdata_t v)
    {
        v2f o;
        UNITY_SETUP_INSTANCE_ID(v);
        UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
        // other stuff
        return o;   // this return was missing: vert must hand the interpolators back
    }
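
For completeness, the snippet above omits the fragment stage. The part that actually makes Blend One OneMinusSrcAlpha come out right is the premultiply, the same as in Sprites-Default. Here’s a sketch with a plain tex2D; _MainTex, texcoord and color stand in for whatever your “other stuff” declares:

    fixed4 frag (v2f i) : SV_Target
    {
        fixed4 c = tex2D(_MainTex, i.texcoord) * i.color;
        c.rgb *= c.a;   // premultiply color by alpha before blending
        return c;       // alpha is written to the RenderTexture too, so empty areas stay (0, 0, 0, 0)
    }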

Or, if you don’t need anything custom, just use the Sprites/Default shader 🙂
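
And if you composite the RenderTexture over the main camera’s output yourself (full-screen quad, overlay mesh, etc.), the overlay pass needs the matching premultiplied blend. A minimal sketch, assuming the RenderTexture is bound as _MainTex (the shader name here is made up):

    Shader "Hypothetical/PremultipliedOverlay"
    {
        Properties { _MainTex ("RenderTexture", 2D) = "black" {} }
        SubShader
        {
            ZWrite Off
            Cull Off
            Blend One OneMinusSrcAlpha   // the texture already holds premultiplied color

            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img      // full-screen helper from UnityCG.cginc
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _MainTex;

                fixed4 frag (v2f_img i) : SV_Target
                {
                    return tex2D(_MainTex, i.uv);   // rgb is premultiplied, a is coverage
                }
                ENDCG
            }
        }
    }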