Rendering a separate alpha channel

I’ve got what I think is quite a niche situation here.

My Unity output is going to be blended on top of another video source. To achieve this, we are either looking at chroma-keying my output or, preferably, using the left-right alpha packing method below, in which the right half of the frame serves as an alpha mask for the normal left half.

[attached image: upload_2019-1-31_16-3-14.png]

So, my Unity output will look like the above, although most of my content is actually 2D: canvases, UI Images, TMP texts, etc.
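For context, a compositor receiving such a packed frame could reconstruct RGBA by sampling the left half for colour and the right half for alpha. A minimal sketch, assuming a `_PackedTex` texture holding the full side-by-side frame (the texture name and UV layout are illustrative, not part of my actual setup):

```hlsl
// Sketch: reconstruct RGBA from a left-right alpha-packed frame.
// Assumes _PackedTex holds the full frame, colour on the left half,
// greyscale alpha on the right half; i.uv spans the output 0..1.
fixed4 frag(v2f i) : SV_Target
{
    float2 colorUV = float2(i.uv.x * 0.5, i.uv.y);        // left half
    float2 alphaUV = float2(0.5 + i.uv.x * 0.5, i.uv.y);  // right half
    fixed3 rgb = tex2D(_PackedTex, colorUV).rgb;
    fixed a = tex2D(_PackedTex, alphaUV).r;               // mask is greyscale
    return fixed4(rgb, a);
}
```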

So, to achieve this, I’ve tried rendering to a render texture and displaying that render texture with a custom shader.

I’m using the default UI shader with this crude modification:

fixed4 frag(v2f i) : SV_Target
{
    fixed4 texcol = tex2D(_MainTex, i.texcoord);
    texcol.r = 1;
    texcol.g = 1;
    texcol.b = 1;
    return texcol;
}

This mostly works great, but I have a problem that I suspect is to do with texture compression; I'm not sure.

When I have a png image like this:

[attached image: upload_2019-1-31_16-22-30.png]

Rendered over solid colour, I get this outline:

[attached image: upload_2019-1-31_16-22-57.png]

Any thoughts??

Cheers!

The triangle is not part of the issue, correct? That’s just a background image, the issue you’re not expecting is the black outlines?

You indeed should not be seeing this, and it has nothing to do with compression. PNG is a lossless format, and the format Unity is using shows above as RGBA 32 bit, which is uncompressed. (GPUs can't use PNGs directly; they're too slow to decompress and aren't random-access friendly.)

And the shader you present above is fine, if used to draw over something using Blend SrcAlpha OneMinusSrcAlpha it should not have any outlines. I assume it’s what you’re using to draw your UI elements into the render texture, yes?

The issue is how you're displaying the render texture itself. While you may be rendering the alpha into the render texture properly, I suspect your render texture is cleared to (0,0,0,0), and then you're using traditional alpha blending (Blend SrcAlpha OneMinusSrcAlpha) to render your elements into it. The problem is you're presumably also using that same blend to display the render texture.

That's a problem because the render texture needs to be displayed with premultiplied alpha blending (Blend One OneMinusSrcAlpha), because the values it holds were premultiplied when you rendered your stuff into it. You rendered your stuff using Blend SrcAlpha OneMinusSrcAlpha, which means the resulting RGB colour value of (1,1,1) was stored in the render texture already multiplied by the alpha (SrcAlpha), hence "premultiplied". By then using Blend SrcAlpha OneMinusSrcAlpha again to display it, you're effectively multiplying the colour by the alpha twice: at an edge pixel with alpha 0.5, white is stored as 0.5, and displaying it with SrcAlpha blending darkens it again to 0.25, which is exactly the dark outline you're seeing.

This article goes into a similar problem, and why it is a problem:

The easy solution is when rendering your render texture, use a shader with Blend One OneMinusSrcAlpha instead of Blend SrcAlpha OneMinusSrcAlpha.
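As a minimal sketch of that easy solution (the shader name and structure here are placeholders, not from the thread), only the Blend line differs from a standard transparent shader:

```hlsl
// Sketch: display a premultiplied-alpha render texture.
Shader "Custom/PremultipliedDisplay"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Blend One OneMinusSrcAlpha   // premultiplied, not SrcAlpha
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"
            sampler2D _MainTex;

            fixed4 frag(v2f_img i) : SV_Target
            {
                // RGB in the render texture is already multiplied by
                // alpha, so output it unmodified.
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```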

Hey thanks @bgolus !

This is really informative. I can’t claim to understand it all but I’m trying to implement your solution.

My UI Image elements are being rendered with UI/Default which indeed uses Blend SrcAlpha OneMinusSrcAlpha and the custom Shader that renders the Render Texture also uses this Blend.

What I’m stuck on is separating rendering to the Render Texture and then rendering the Render Texture.

My first camera is rendering the UI elements to the Render Texture via its Target Texture, using selective Layers

Then the Render Texture is rendered by a second camera, with the custom Shader applied to its Material

I have previously been working on a custom post-processing effect that does the same thing as the custom shader above. It yielded the same results, but it wasn't rendering to the display via a render texture; I'm struggling to get that to work, and I'm finding post-processing effects a bit of a minefield.

For your above solution do I need to be Blitting the first camera with the above shader into a Render Texture and then rendering the Render Texture with Blend One OneMinusSrcAlpha via the second camera? At the moment there is only one point in the process where my custom shader is affecting the image and it’s at the point of rendering the Render Texture I think…

So I’ve gone down the route of Graphics.Blit using OnPostRender on the first camera to apply the custom Shader to the Render Texture before then rendering it via Raw Image with another custom Shader that sets Blend One OneMinusSrcAlpha.

Now I’m in all sorts of problems with the Render Texture just showing pure white or not clearing parts that become transparent.

I’m sure with some persistence I’ll get there but if anything jumps out at you about the above steps please let me know!

Thanks again!

Okay, I think I understand what you’re doing a little better, though I’m still not totally sure I understand the context in which you’re seeing the outlines.

However one issue you’re going to run into is the alpha channel you’re getting from that first camera isn’t going to be right. Normally when rendering transparent objects, you’re doing so into a framebuffer or some other render texture with other content already in it. The alpha output by the shader exists simply to handle the compositing of the shader’s output color with the target’s color. The values that actually end up in the alpha channel of the target don’t matter.

Using Blend SrcAlpha OneMinusSrcAlpha, aka traditional alpha blending, means the color values nicely blend and look as you would expect, but they also mean that the alpha value being recorded in the target buffer is multiplied by itself, which is bad. You really want to be using Blend One OneMinusSrcAlpha for the alpha channel to get properly stored.

To do that you can either use that blend mode and multiply the output color.rgb by the color.a in the shader, or use a separate alpha blend mode.

Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha.
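A sketch of what those two options might look like inside a UI shader (assuming a copy of UI/Default; only the relevant lines are shown, and the blend statements go in the SubShader as usual):

```hlsl
// Option 1: premultiply in the shader and use one blend for both channels.
// Blend One OneMinusSrcAlpha
fixed4 frag(v2f i) : SV_Target
{
    fixed4 color = tex2D(_MainTex, i.texcoord) * i.color;
    color.rgb *= color.a;   // premultiply RGB by alpha
    return color;
}

// Option 2: keep traditional blending for RGB, but blend alpha separately
// so correct coverage accumulates in the target's alpha channel:
// Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha
```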

Also, you should be clearing that first camera’s target to a solid color of (0.0, 0.0, 0.0, 0.0), aka black w/ alpha zeroed out.

The part I’m not understanding is why you’re rendering that render texture again as solid white with only the alpha left as is. Originally I though it was because you were looking to do the compositing within Unity. Now I assume this is so your output from Unity has the color and b&w alpha side by side, yes? Why not then output the alpha as the color?

fixed4 frag(v2f i) : SV_Target
{
    fixed texAlpha = tex2D(_MainTex, i.uv).a;
    return fixed4(texAlpha, texAlpha, texAlpha, 1.0);
}

Depending on your project's color space settings, and how ffmpeg handles things, you may need to additionally apply an sRGB color space conversion to this output.

fixed4 color = fixed4(texAlpha, texAlpha, texAlpha, 1.0);
color.rgb = LinearToGammaSpace(color.rgb);
return color;

Now, for ffmpeg, you’ll want to use the alpha=premultiplied setting to actually do the compositing.
https://video.stackexchange.com/questions/23242/ffmpeg-overlay-with-transparency-has-dark-outline

Thanks again for your response. I’m still finding it hard to figure out so I’m going to go back a bit, forgetting about the outlines for a moment…

So for this new test example I’m working to a final resolution of 1920 x 1080, which is two times 960 x 1080 areas side by side.

On the left is my original canvas with UI elements; for this example, all UI.Images with no sprites, using the default UI material. One large white square, then from top to bottom: solid white, semi-opaque white, semi-opaque black, solid black.

A camera is positioned to render the left area of the screen to a 960 x 1080 render texture, on the right is a Raw Image rendering the Render Texture.

So, problem one, which you have spoken about and which I didn't notice at first, is indeed the alpha being wrong in the render texture. If I use a modification of the default UI shader and add

color.rgb *= IN.color.a

in the frag or something similar in the vert function, nothing changes. If I change Blend to

Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha

again nothing changes… I've tried a lot of variations and different shaders now to try to achieve this, so just figuring this one out would help me a lot. Sorry I can't manage to get your explanations to work with my current knowledge…

Thanks again for your time!!

Darren

Oh my god, the second I posted this I instead changed the blend on the material used for all the UI elements, and everything started to fall into place.

Above is the output I need to achieve and am now achieving.

And the outlining is fixed with the blend mode change as well.

Thanks for your help @bgolus !
