Blending two cameras with Image Effects and a RenderTexture

Hi!

I’m trying to figure out a way of putting my UI on a second camera so that it is excluded from some image effects while still being affected by others. The whole thing works, but there are issues where the alpha of some elements “overrides” the alpha of the objects behind them.

This is how it should look (screenshot from scene view):

This is how it looks in Game View:

Here is the script (to be placed on the Main Camera):

using UnityEngine;
using System.Collections;

namespace ImageEffects
{
    [ExecuteInEditMode]
    [RequireComponent(typeof(Camera))]
    public class WorldUiBlend : MonoBehaviour
    {
        [Range(0, 1)]
        [SerializeField]
        float alpha = 1.0f;
        [SerializeField]
        RenderTextureFormat renderTextureFormat = RenderTextureFormat.Default;
        [SerializeField]
        LayerMask mask;
        [SerializeField]
        Shader shader;
        protected Material _material;
        Camera worldUiCamera;
        RenderTexture renderTexture;

        protected virtual void Start()
        {
            if (!SystemInfo.supportsImageEffects || shader == null || !shader.isSupported)
            {
                enabled = false;
                return;
            }

            if (CameraExists() && Application.isPlaying)
            {
                UpdateCamera();
            }
            else if (!CameraExists())
            {
                CreateCamera();
            }
        }

        void MakeNewRenderTexture(int width, int height)
        {
            if (renderTexture != null)
            {
                if (Application.isPlaying)
                {
                    Destroy(renderTexture);
                }

                #if UNITY_EDITOR
                else
                {
                    DestroyImmediate(renderTexture);
                    Resources.UnloadUnusedAssets();
                }
                #endif
            }

         
            renderTexture = new RenderTexture(width, height, 32, renderTextureFormat);
           
            worldUiCamera.targetTexture = renderTexture;
        }

        void OnDisable()
        {
            if (renderTexture != null)
            {
                DestroyImmediate(renderTexture);
            }
        }

        bool CameraExists()
        {

            bool exist = false;
            foreach (Transform child in transform)
            {
                if (child.gameObject.name == "WorldUiCamera")
                {
                    worldUiCamera = child.gameObject.GetComponent<Camera>();
                    exist = true;
                }
            }

            return exist;
        }

        void CreateCamera()
        {
            var masterCamera = GetComponent<Camera>();
            var go = new GameObject("WorldUiCamera", typeof(Camera));
            worldUiCamera = go.GetComponent<Camera>();
            worldUiCamera.CopyFrom(masterCamera);
            worldUiCamera.farClipPlane = 1000;
            worldUiCamera.nearClipPlane = 0.19f;
            worldUiCamera.cullingMask = mask;
            worldUiCamera.hdr = false;
            worldUiCamera.depth = masterCamera.depth - 1;
            worldUiCamera.clearFlags = CameraClearFlags.SolidColor;
            worldUiCamera.backgroundColor = new Color(0, 0, 0, 0);
            worldUiCamera.transform.parent = transform;

            MakeNewRenderTexture(Screen.width, Screen.height);
        }

        void UpdateCamera()
        {
            var masterCamera = this.GetComponent<Camera>();

            worldUiCamera.CopyFrom (masterCamera);
            worldUiCamera.farClipPlane = 10000;
            worldUiCamera.nearClipPlane = 0.19f;
            worldUiCamera.cullingMask = mask;
            worldUiCamera.hdr = false;
            worldUiCamera.depth = masterCamera.depth - 1;
            worldUiCamera.clearFlags = CameraClearFlags.SolidColor;
            worldUiCamera.backgroundColor = new Color(0, 0, 0, 0);
            worldUiCamera.transform.parent = transform;
           
            MakeNewRenderTexture(Screen.width, Screen.height);
        }

        protected Material Material
        {
            get
            {
                if (_material == null)
                {
                    _material = new Material(shader);
                    _material.hideFlags = HideFlags.HideAndDontSave;
                }

                return _material;
            }
        }



        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            Material.SetFloat("_Intensity", alpha);
            Material.SetTexture("_Overlay", renderTexture);
            Graphics.Blit(source, destination, Material);

            // Recreate the RenderTexture whenever the source size changes
            // (and every frame in edit mode, where the source size can vary).
            if (!Application.isPlaying || renderTexture.width != source.width || renderTexture.height != source.height)
            {
                MakeNewRenderTexture(source.width, source.height);
            }
        }

    }
}

And here is the shader I currently use:

Shader "Hidden/Parabole/ScreenBlend" {
    Properties {
        _MainTex ("Screen Blended", 2D) = "white" {}
        _Overlay ("Overlay", 2D) = "white" {}
        _Intensity ("Amount", Range(0.0, 1.0)) = 1.0
    }
   
SubShader {
        Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
        Cull Off
        ZWrite Off
        ZTest Always
        Blend SrcAlpha OneMinusSrcAlpha

        Pass {
            CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #pragma fragmentoption ARB_fog_exp2
                #pragma fragmentoption ARB_precision_hint_fastest
                #include "UnityCG.cginc"
              
                struct appdata_tiny {
                    float4 vertex : POSITION;
                    float4 texcoord : TEXCOORD0;
                };
              
                struct v2f {
                    float4 pos : SV_POSITION;
                    float2 uv : TEXCOORD0;
                };
              
                uniform float4  _MainTex_ST;
                uniform float4 _Overlay_ST;
              
                v2f vert (appdata_tiny v)
                {
                    v2f o;
                    o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                    o.uv = TRANSFORM_TEX(v.texcoord,_MainTex);
                    return o;
                }
              
                uniform float _Intensity;
                uniform sampler2D  _MainTex;
                uniform sampler2D  _Overlay;
              
                fixed4 frag (v2f i) : COLOR
                {
                    half4   tA = tex2D(_MainTex, i.uv);
                    half4   tB = tex2D(_Overlay, i.uv);
                   
                    fixed3 result  = lerp (tA.rgb, tB.rgb, tB.a * _Intensity);
                    //fixed3 result =  tB.rgb + (tA.rgb * (1 - tB.a));
                    //fixed3 r2 = (tB.rgb * tB.a) + (tA.rgb * (1 - tB.a));
                    return fixed4(result, 1);
                }
            ENDCG
        }
    }

Fallback off
   
}

And for those interested, attached is a simple scene with another shader variant and the whole problem exposed.

I have tried many alpha blending methods and each of them has the “alpha overriding” problem. As soon as something is transparent, everything behind it becomes transparent.

Does anybody know what the problem could be?

2088998–136611–WorldUiProblem.unitypackage (817 KB)

After I try to give a potential solution like what you’re asking for, I’ll give a suggestion on a different way of doing it. :slight_smile:
A guess as to why this happens: the alpha channels are blended with the same equation as the colors. With Blend SrcAlpha OneMinusSrcAlpha, drawing a sprite with alpha 0.5 over an opaque pixel (alpha 1) writes 0.5 × 0.5 + 1 × (1 − 0.5) = 0.75 into the RenderTexture’s alpha channel, so the opaque pixel behind the sprite now reads as partly transparent during the merge.
Try using the blend mode below for the GUI shaders. (NOT the merge shader!)
(I’m having some trouble opening your package, most likely due to being on 5.0.0 and not 5.0.1, but I’m upgrading as I write.)
Try using

Blend SrcAlpha OneMinusSrcAlpha, SrcAlpha One

as the blend mode for transparent stuff; the pair after the comma controls how the alpha channel is blended, separately from the colors.

Then my suggestion: use two cameras instead!
Set the second one to clear depth only, so it composites its layers directly over the first camera’s output.
Any post-process that should affect both the GUI and the normal geometry goes on that second camera.
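That setup might look something like this (a rough sketch against the Unity 5 API of the time; the script name, fields, and layer handling are my assumptions, not code from the attached package):

```csharp
using UnityEngine;

// Sketch: a second camera that renders only the UI layers and clears
// depth only, so its output composites directly over the main camera.
[RequireComponent(typeof(Camera))]
public class UiOverlayCamera : MonoBehaviour
{
    [SerializeField] Camera mainCamera; // the world camera
    [SerializeField] LayerMask uiMask;  // e.g. just the "UI" layer

    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.CopyFrom(mainCamera);                 // match projection settings
        cam.cullingMask = uiMask;                 // draw only the UI layers
        cam.clearFlags = CameraClearFlags.Depth;  // keep the main camera's colors
        cam.depth = mainCamera.depth + 1;         // render after the world

        // Keep the main camera from drawing the UI twice.
        mainCamera.cullingMask &= ~uiMask.value;
    }
}
```

Image effects that should skip the UI stay on the main camera; effects that should hit everything go on this one.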
I got your project to open; sadly, I’m not sure the blend mode fix I suggested can be used without overriding the sprite shader. I’m testing whether I can do that now. But still, two cameras seems more reasonable.
Now I got that to work. It won’t look IDENTICAL because of how blending works: sequential blending depends on the undermost color, so my solution is in theory more correct.
The problem with my solution:
With MSAA on, Unity has a tendency to flip the screen upside down. So some post-process you have is most likely missing the y-flip define handling that all post-processes seem to need…
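For reference, the “y-flip define stuff” usually means the standard anti-aliasing UV flip that Unity image-effect shaders carry in their vertex function. A sketch of the common pattern (it assumes the shader declares float4 _MainTex_TexelSize):

```hlsl
// When rendering into a RenderTexture with MSAA on a platform where
// UVs start at the top, the source image can arrive upside down;
// Unity signals this with a negative texel height.
#if UNITY_UV_STARTS_AT_TOP
if (_MainTex_TexelSize.y < 0)
    o.uv.y = 1 - o.uv.y;
#endif
```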

2089130–136619–Fixed.unitypackage (819 KB)


Well, I now feel like a total noob for not trying the two-camera setup in my test scene. It does cause issues in my real project, though, where I’m using HDR: combining two cameras with image effects makes the fog and a lot of other things disappear.

I think we can close this thread; I’ll pinpoint the HDR issue and start a new one.

Thanks Zicandar :slight_smile:

Here is the new thread: Blending two perspective cameras makes depth-based effects and particles disappear (in deferred). - Unity Engine - Unity Discussions