I am trying to draw specific objects to a render texture and then display that texture in a scene with transparency around the objects. Some of the objects will themselves be semi-transparent, and they will all be different colours.
I have a camera set up with a solid background colour of transparent black that draws to a render texture. I then have a quad in the scene that shows the render texture. The actual use case is a bit more involved, but this setup demonstrates my problem.
The issue I’m having is that the render texture’s background colour is still black, so semi-transparent pixels end up tinted black.
Here is an example:
The circle on the left is the original object in the scene. The circle on the right is the render texture displayed on a quad. If you look closely, you can see that the circle on the right has a black outline coming from the semi-transparent pixels around the edge of the circle.
You need to use a premultiplied alpha shader to composite a render texture back into the scene. Basically, you need a transparent shader that uses Blend One OneMinusSrcAlpha on the quad that displays the render texture.
However, depending on how you’re rendering things into the render texture, you may need a custom shader for that too. If it’s a UI renderer using the default material then it should work fine. Otherwise you need a shader that uses either Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha (separate color and alpha blending) or Blend One OneMinusSrcAlpha.
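To make that concrete, here’s a rough sketch of what a shader for the objects being drawn into the render texture could look like. This assumes the built-in render pipeline and a simple unlit transparent material; the shader and property names are just placeholders, and on URP you’d need the equivalent Shader Graph or URP HLSL setup instead. The important part is the separate alpha blend, so usable alpha values end up in the render texture:

Shader "Unlit/AlphaBlendedKeepAlpha"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Color ("Tint", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        // Color blends as usual, but alpha blends with One OneMinusSrcAlpha
        // so correct coverage is written into the render texture's alpha channel.
        Blend SrcAlpha OneMinusSrcAlpha, One OneMinusSrcAlpha
        ZWrite Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;
            fixed4 _Color;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Traditional (non-premultiplied) alpha output; the blend state
                // above handles writing a usable alpha into the render target.
                return tex2D(_MainTex, i.uv) * _Color;
            }
            ENDCG
        }
    }
}

You’d put a material using something like this on the objects the render texture camera sees, not on the quad.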
I have tried various premultiplied shaders on both the objects being rendered to the texture and on the quad displaying the texture, but unfortunately that doesn’t seem to fix the problem. Maybe I am misunderstanding something.
I tried using both ‘Sprites/Default’ and ‘Universal Render Pipeline/Unlit’ with blending mode set to ‘Premultiply’. The results looked the same as the screenshots in the previous post. I also tried a custom shader that used the blend mode that you mentioned.
To explain my previous comment: for me, a “premultiplied alpha shader” needs to do two things. It needs to use Blend One OneMinusSrcAlpha, and it needs to take a premultiplied texture as its input. All of the built in “premultiplied” shaders multiply the color by the alpha in the shader before output, meaning they expect a traditional (i.e. not premultiplied) alpha texture as input. The whole reason you’re getting the extra darkening in your use case is that a render texture is a premultiplied image, and all of the built in shaders multiply the color by the alpha a second time, either in the shader itself or via the first argument of the Blend function (the SrcAlpha in Blend SrcAlpha OneMinusSrcAlpha means “multiply the color by the alpha”).
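To put some numbers on that: say you draw a fully white object at 50% alpha over the render texture’s transparent black background. That pixel ends up stored as a color of (0.5, 0.5, 0.5) with an alpha of 0.5 (assuming the alpha channel was written correctly), so the color is already premultiplied. A shader that multiplies the color by the alpha again outputs (0.25, 0.25, 0.25) for a coverage of 0.5, which is exactly the darkening you’re seeing around the edge of the circle.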
Using Blend One OneMinusSrcAlpha in a shader that otherwise takes a traditional alpha texture as its input has the benefit of rendering the correct alpha value into the render target’s alpha channel for later use in exactly what you’re trying to do. But you still need a custom shader that doesn’t multiply the color by the alpha.
Something like the one in this specific post:
(Don’t use the later ones, those are trying to fix an issue with inspector rendering.)
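For anyone who finds this later, here’s a rough sketch of the kind of compositing shader I mean. It is not the exact shader from the linked post, the names are placeholders, and it assumes the built-in render pipeline. The two important parts are Blend One OneMinusSrcAlpha and a fragment function that returns the sampled render texture color without multiplying it by the alpha:

Shader "Unlit/PremultipliedComposite"
{
    Properties
    {
        _MainTex ("Render Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        // The render texture is already premultiplied, so use One as the
        // source factor instead of SrcAlpha to avoid multiplying by alpha twice.
        Blend One OneMinusSrcAlpha
        ZWrite Off

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            v2f vert (appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // Do NOT multiply rgb by alpha here; the texture is already premultiplied.
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}

A material using this goes on the quad that displays the render texture.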