Transparent RenderTexture with PostProcessing

Yes, his screenshot does show it for some reason, but when you actually try it, it doesn’t show. Might be the version I’m using that’s causing it (2022.2.18)?


I’m not on 2022 yet, so I’m not sure if that is an issue, but it’s not impossible that the UberPost shader has been updated. The key part of the shader we’ve updated is this (lines 238/239 in the code I posted before):

half alpha = SAMPLE_TEXTURE2D_X(_SourceTex, sampler_LinearClamp, uvDistorted).w;
return half4(color, alpha);

So what you probably want to do is replace this bit in whatever the current version of the shader is… The important thing is that we’re asking the shader to pass the alpha value through rather than returning a hard-coded ‘1’ for the alpha value.


Thanks to @robrab2000-aa! This worked for me in URP 14.0.8 using the following change at the end of the shader (there have been changes to UberPost.shader since URP 12):

half alpha = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, uvDistorted).w;
return half4(color, alpha);

Full Alpha UberPost.shader (URP 14.0.8)

Shader "Custom/AlphaUberPost"
{
    HLSLINCLUDE
        #pragma exclude_renderers gles
        #pragma multi_compile_local_fragment _ _DISTORTION
        #pragma multi_compile_local_fragment _ _CHROMATIC_ABERRATION
        #pragma multi_compile_local_fragment _ _BLOOM_LQ _BLOOM_HQ _BLOOM_LQ_DIRT _BLOOM_HQ_DIRT
        #pragma multi_compile_local_fragment _ _HDR_GRADING _TONEMAP_ACES _TONEMAP_NEUTRAL
        #pragma multi_compile_local_fragment _ _FILM_GRAIN
        #pragma multi_compile_local_fragment _ _DITHERING
        #pragma multi_compile_local_fragment _ _GAMMA_20 _LINEAR_TO_SRGB_CONVERSION
        #pragma multi_compile_local_fragment _ _USE_FAST_SRGB_LINEAR_CONVERSION
        #pragma multi_compile_fragment _ _FOVEATED_RENDERING_NON_UNIFORM_RASTER
        // Foveated rendering currently not supported in dxc on metal
        #pragma never_use_dxc metal
        #pragma multi_compile_fragment _ DEBUG_DISPLAY
        #pragma multi_compile_fragment _ SCREEN_COORD_OVERRIDE
        #pragma multi_compile_local_fragment _ HDR_INPUT HDR_ENCODING

        #ifdef HDR_ENCODING
        #define HDR_INPUT 1 // this should be defined when HDR_ENCODING is defined
        #endif

        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Filtering.hlsl"
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/ScreenCoordOverride.hlsl"
#if defined(HDR_ENCODING)
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Color.hlsl"
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/HDROutput.hlsl"
#endif

        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
        #include "Packages/com.unity.render-pipelines.universal/Shaders/PostProcessing/Common.hlsl"
        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Debug/DebuggingFullscreen.hlsl"
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/FoveatedRendering.hlsl"

        // Hardcoded dependencies to reduce the number of variants
        #if _BLOOM_LQ || _BLOOM_HQ || _BLOOM_LQ_DIRT || _BLOOM_HQ_DIRT
            #define BLOOM
            #if _BLOOM_LQ_DIRT || _BLOOM_HQ_DIRT
                #define BLOOM_DIRT
            #endif
        #endif

        TEXTURE2D_X(_Bloom_Texture);
        TEXTURE2D(_LensDirt_Texture);
        TEXTURE2D(_Grain_Texture);
        TEXTURE2D(_InternalLut);
        TEXTURE2D(_UserLut);
        TEXTURE2D(_BlueNoise_Texture);
        TEXTURE2D_X(_OverlayUITexture);

        float4 _Lut_Params;
        float4 _UserLut_Params;
        float4 _Bloom_Params;
        float _Bloom_RGBM;
        float4 _LensDirt_Params;
        float _LensDirt_Intensity;
        float4 _Distortion_Params1;
        float4 _Distortion_Params2;
        float _Chroma_Params;
        half4 _Vignette_Params1;
        float4 _Vignette_Params2;
    #ifdef USING_STEREO_MATRICES
        float4 _Vignette_ParamsXR;
    #endif
        float2 _Grain_Params;
        float4 _Grain_TilingParams;
        float4 _Bloom_Texture_TexelSize;
        float4 _Dithering_Params;
        float4 _HDROutputLuminanceParams;

        #define DistCenter              _Distortion_Params1.xy
        #define DistAxis                _Distortion_Params1.zw
        #define DistTheta               _Distortion_Params2.x
        #define DistSigma               _Distortion_Params2.y
        #define DistScale               _Distortion_Params2.z
        #define DistIntensity           _Distortion_Params2.w

        #define ChromaAmount            _Chroma_Params.x

        #define BloomIntensity          _Bloom_Params.x
        #define BloomTint               _Bloom_Params.yzw
        #define BloomRGBM               _Bloom_RGBM.x
        #define LensDirtScale           _LensDirt_Params.xy
        #define LensDirtOffset          _LensDirt_Params.zw
        #define LensDirtIntensity       _LensDirt_Intensity.x

        #define VignetteColor           _Vignette_Params1.xyz
    #ifdef USING_STEREO_MATRICES
        #define VignetteCenterEye0      _Vignette_ParamsXR.xy
        #define VignetteCenterEye1      _Vignette_ParamsXR.zw
    #else
        #define VignetteCenter          _Vignette_Params2.xy
    #endif
        #define VignetteIntensity       _Vignette_Params2.z
        #define VignetteSmoothness      _Vignette_Params2.w
        #define VignetteRoundness       _Vignette_Params1.w

        #define LutParams               _Lut_Params.xyz
        #define PostExposure            _Lut_Params.w
        #define UserLutParams           _UserLut_Params.xyz
        #define UserLutContribution     _UserLut_Params.w

        #define GrainIntensity          _Grain_Params.x
        #define GrainResponse           _Grain_Params.y
        #define GrainScale              _Grain_TilingParams.xy
        #define GrainOffset             _Grain_TilingParams.zw

        #define DitheringScale          _Dithering_Params.xy
        #define DitheringOffset         _Dithering_Params.zw

        #define MinNits                 _HDROutputLuminanceParams.x
        #define MaxNits                 _HDROutputLuminanceParams.y
        #define PaperWhite              _HDROutputLuminanceParams.z
        #define OneOverPaperWhite       _HDROutputLuminanceParams.w

        float2 DistortUV(float2 uv)
        {
            // Note: this variant should never be set with XR
            #if _DISTORTION
            {
                uv = (uv - 0.5) * DistScale + 0.5;
                float2 ruv = DistAxis * (uv - 0.5 - DistCenter);
                float ru = length(float2(ruv));

                UNITY_BRANCH
                if (DistIntensity > 0.0)
                {
                    float wu = ru * DistTheta;
                    ru = tan(wu) * (rcp(ru * DistSigma));
                    uv = uv + ruv * (ru - 1.0);
                }
                else
                {
                    ru = rcp(ru) * DistTheta * atan(ru * DistSigma);
                    uv = uv + ruv * (ru - 1.0);
                }
            }
            #endif

            return uv;
        }

        half4 FragUberPost(Varyings input) : SV_Target
        {
            UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

            float2 uv = SCREEN_COORD_APPLY_SCALEBIAS(UnityStereoTransformScreenSpaceTex(input.texcoord));
            float2 uvDistorted = DistortUV(uv);

            half3 color = (0.0).xxx;

            #if _CHROMATIC_ABERRATION
            {
                // Very fast version of chromatic aberration from HDRP using 3 samples and hardcoded
                // spectral lut. Performs significantly better on lower end GPUs.
                float2 coords = 2.0 * uv - 1.0;
                float2 end = uv - coords * dot(coords, coords) * ChromaAmount;
                float2 delta = (end - uv) / 3.0;

                half r = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, SCREEN_COORD_REMOVE_SCALEBIAS(uvDistorted)                ).x;
                half g = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, SCREEN_COORD_REMOVE_SCALEBIAS(DistortUV(delta + uv)      )).y;
                half b = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, SCREEN_COORD_REMOVE_SCALEBIAS(DistortUV(delta * 2.0 + uv))).z;

                color = half3(r, g, b);
            }
            #else
            {
                color = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, SCREEN_COORD_REMOVE_SCALEBIAS(uvDistorted)).xyz;
            }
            #endif

            // Gamma space... Just do the rest of Uber in linear and convert back to sRGB at the end
            #if UNITY_COLORSPACE_GAMMA
            {
                color = GetSRGBToLinear(color);
            }
            #endif

            #if defined(BLOOM)
            {
                float2 uvBloom = uvDistorted;
                #if defined(_FOVEATED_RENDERING_NON_UNIFORM_RASTER)
                    uvBloom = RemapFoveatedRenderingDistort(uvBloom);
                #endif

                #if _BLOOM_HQ && !defined(SHADER_API_GLES)
                half4 bloom = SampleTexture2DBicubic(TEXTURE2D_X_ARGS(_Bloom_Texture, sampler_LinearClamp), SCREEN_COORD_REMOVE_SCALEBIAS(uvBloom), _Bloom_Texture_TexelSize.zwxy, (1.0).xx, unity_StereoEyeIndex);
                #else
                half4 bloom = SAMPLE_TEXTURE2D_X(_Bloom_Texture, sampler_LinearClamp, SCREEN_COORD_REMOVE_SCALEBIAS(uvBloom));
                #endif

                #if UNITY_COLORSPACE_GAMMA
                bloom.xyz *= bloom.xyz; // γ to linear
                #endif

                UNITY_BRANCH
                if (BloomRGBM > 0)
                {
                    bloom.xyz = DecodeRGBM(bloom);
                }

                bloom.xyz *= BloomIntensity;
                color += bloom.xyz * BloomTint;

                #if defined(BLOOM_DIRT)
                {
                    // UVs for the dirt texture should be DistortUV(uv * DirtScale + DirtOffset) but
                    // considering we use a cover-style scale on the dirt texture the difference
                    // isn't massive so we chose to save a few ALUs here instead in case lens
                    // distortion is active.
                    half3 dirt = SAMPLE_TEXTURE2D(_LensDirt_Texture, sampler_LinearClamp, uvDistorted * LensDirtScale + LensDirtOffset).xyz;
                    dirt *= LensDirtIntensity;
                    color += dirt * bloom.xyz;
                }
                #endif
            }
            #endif

            // To save on variants we'll use an uniform branch for vignette. Lower end platforms
            // don't like these but if we're running Uber it means we're running more expensive
            // effects anyway. Lower-end devices would limit themselves to on-tile compatible effect
            // and thus this shouldn't too much of a problem (famous last words).
            UNITY_BRANCH
            if (VignetteIntensity > 0)
            {
            #ifdef USING_STEREO_MATRICES
                // With XR, the views can use asymmetric FOV which will have the center of each
                // view be at a different location.
                const float2 VignetteCenter = unity_StereoEyeIndex == 0 ? VignetteCenterEye0 : VignetteCenterEye1;
            #endif

                color = ApplyVignette(color, uvDistorted, VignetteCenter, VignetteIntensity, VignetteRoundness, VignetteSmoothness, VignetteColor);
            }

            // Color grading is always enabled when post-processing/uber is active
            {
                color = ApplyColorGrading(color, PostExposure, TEXTURE2D_ARGS(_InternalLut, sampler_LinearClamp), LutParams, TEXTURE2D_ARGS(_UserLut, sampler_LinearClamp), UserLutParams, UserLutContribution);
            }

            #if _FILM_GRAIN
            {
                color = ApplyGrain(color, uv, TEXTURE2D_ARGS(_Grain_Texture, sampler_LinearRepeat), GrainIntensity, GrainResponse, GrainScale, GrainOffset, OneOverPaperWhite);
            }
            #endif

            // When Unity is configured to use gamma color encoding, we ignore the request to convert to gamma 2.0 and instead fall back to sRGB encoding
            #if _GAMMA_20 && !UNITY_COLORSPACE_GAMMA
            {
                color = LinearToGamma20(color);
            }
            // Back to sRGB
            #elif UNITY_COLORSPACE_GAMMA || _LINEAR_TO_SRGB_CONVERSION
            {
                color = GetLinearToSRGB(color);
            }
            #endif

            #if _DITHERING
            {
                color = ApplyDithering(color, uv, TEXTURE2D_ARGS(_BlueNoise_Texture, sampler_PointRepeat), DitheringScale, DitheringOffset, PaperWhite, OneOverPaperWhite);
                // Assume color > 0 and prevent 0 - ditherNoise.
                // Negative colors can cause problems if fed back to the postprocess via render to FP16 texture.
                color = max(color, 0);
            }
            #endif

            #ifdef HDR_ENCODING
            {
                float4 uiSample = SAMPLE_TEXTURE2D_X(_OverlayUITexture, sampler_PointClamp, input.texcoord);
                color.rgb = SceneUIComposition(uiSample, color.rgb, PaperWhite, MaxNits);
                color.rgb = OETF(color.rgb);
            }
            #endif

            #if defined(DEBUG_DISPLAY)
            half4 debugColor = 0;

            if(CanDebugOverrideOutputColor(half4(color, 1), uv, debugColor))
            {
                return debugColor;
            }
            #endif

            half alpha = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, uvDistorted).w;
            return half4(color, alpha);
        }
    ENDHLSL

    SubShader
    {
        Tags
        {
            "RenderType" = "Opaque" "RenderPipeline" = "UniversalPipeline"
        }
        LOD 100
        ZTest Always ZWrite Off Cull Off

        Pass
        {
            Name "UberPost"

            HLSLPROGRAM
                #pragma vertex Vert
                #pragma fragment FragUberPost
            ENDHLSL
        }
    }
}

Not sure if I’m doing something wrong, but any changes made to the Forward Renderer shaders either break post-processing completely, or they get reverted to the original UberPost shader with the error:
“The package cache was invalidated and rebuilt because the following immutable asset(s) were unexpectedly altered:
Packages/com.unity.render-pipelines.universal/Runtime/Data/PostProcessData.asset”. Has anybody found a fix for this?

Have you tried turning URP into a local (embedded) package in your project’s Packages folder? If you don’t, your changes will get overwritten by the package manager.

I don’t really know how to do that, but I did everything else just like you did. It’s very strange; sometimes it doesn’t revert but still has no effect when I change the alpha value in script. Have you changed anything else other than “return half4(color, alpha);” in the UberPost shader?

have you assigned it like this? [screenshot](https://forum.unity.com/attachments/screenshot-2022-07-15-at-10-43-30-png.1085277/)

Also make sure you are working from a duplicate shader, not the original, or it will get overwritten every time: “I’ve done this by duplicating the UberPost.shader file into my project, referencing the duplicated shader in my Forward Renderer Data object and changing the return of this function to this:
return half4(color, 0.25);”

Yep, I did it all right, even moved the UniversalRP folder out of PackageCache so it wouldn’t get overwritten, but it still has no effect. The only thing that does have any effect is changing the alpha value in the FinalPost shader, but that makes the whole image transparent and not just the background.
[screenshot: comparison of the two results]
On the left there is no post-processing, on the right it’s on. The camera for the render texture is set to Solid Color and alpha is at 0. Everything else I did just like you said. Could it be a version problem? I am using 2021.3.22.

Hmm, I don’t think it’s a version thing as that’s the same version we’re on…

I am only guessing: does the shader that renders the post-processed texture (with its alpha values) into the camera view use the correct alpha blending modes, so that transparency is actually applied? If yes, double-check that the post-processed texture really contains the alpha values. When both are true, it should be rendered fine.

From this post I found the feature request for preserving the alpha channel when using post-processing in URP.

Please vote for it there so they consider fixing it sooner.


Thanks for your piece of code! Unfortunately, it doesn’t seem to work in my case, since this alpha addition reduces the bloom effect. I was thinking we’d be better off carrying the alpha channel through all the transformations; that way the actual bloom effect can still work, since its alpha would be transformed as well.

But I’m not sure how to edit the shader code to make it work properly.


I have implemented a shader to blit the post-processed texture while respecting the alpha values, especially for bloom. I only have limited knowledge of HLSL, so it requires two passes for appropriate blending (which costs more performance than one pass). The shader works in conjunction with the customized UberPost.shader; at least I tested it in URP 14. Here is the shader code:
Shader

Shader "Custom/SH_EffectsRenderPass_Default"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        LOD 100

        Pass
        {
            Blend SrcAlpha OneMinusSrcAlpha
            ZWrite Off
            Cull Back
           
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            half4 frag (v2f i) : SV_Target
            {
                half4 col = tex2D(_MainTex, i.uv);
                return col;    
            }           
            ENDCG
        }
       
        Pass
        {
            Blend SrcAlpha One
            ZWrite Off
            Cull Back
           
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            half4 frag (v2f i) : SV_Target
            {
                half4 col = tex2D(_MainTex, i.uv);
                col.a = col.a < 1 ? 1.0 : 0.0;
               
                return col;        
            }           
            ENDCG
        }
    }
}

When blitting the post-processed texture, you have to use a material created from this shader for the blit operation.
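For reference, here is a minimal sketch of how such a blit could be issued from script, assuming a material created from the "Custom/SH_EffectsRenderPass_Default" shader above (the field and texture names are placeholders):

using UnityEngine;

public class AlphaBlitExample : MonoBehaviour
{
    // Material created from the "Custom/SH_EffectsRenderPass_Default" shader posted above.
    [SerializeField] private Material effectsBlitMaterial;

    // Copies the post-processed source (with alpha) into the destination.
    public void BlitWithAlpha(RenderTexture source, RenderTexture destination)
    {
        // The default pass index of -1 draws all passes of the material,
        // so both blend passes of the shader above are applied.
        Graphics.Blit(source, destination, effectsBlitMaterial);
    }
}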


Hi,
I encountered a problem while attempting to make the RenderTexture transparent. Despite following the steps provided, I still couldn’t achieve the desired result. Here’s a summary of the steps I took:

  1. Copied the UberPost shader and added two lines of code to it.
    UberPost_Alpha
Shader "Custom/Universal Render Pipeline/UberPost_Alpha"
{
    HLSLINCLUDE
        #pragma exclude_renderers gles
        #pragma multi_compile_local_fragment _ _DISTORTION
        #pragma multi_compile_local_fragment _ _CHROMATIC_ABERRATION
        #pragma multi_compile_local_fragment _ _BLOOM_LQ _BLOOM_HQ _BLOOM_LQ_DIRT _BLOOM_HQ_DIRT
        #pragma multi_compile_local_fragment _ _HDR_GRADING _TONEMAP_ACES _TONEMAP_NEUTRAL
        #pragma multi_compile_local_fragment _ _FILM_GRAIN
        #pragma multi_compile_local_fragment _ _DITHERING
        #pragma multi_compile_local_fragment _ _GAMMA_20 _LINEAR_TO_SRGB_CONVERSION
        #pragma multi_compile_local_fragment _ _USE_FAST_SRGB_LINEAR_CONVERSION
        #pragma multi_compile _ _USE_DRAW_PROCEDURAL
        #pragma multi_compile_fragment _ DEBUG_DISPLAY

        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Filtering.hlsl"
        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
        #include "Packages/com.unity.render-pipelines.universal/Shaders/PostProcessing/Common.hlsl"
        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Debug/DebuggingFullscreen.hlsl"

        // Hardcoded dependencies to reduce the number of variants
        #if _BLOOM_LQ || _BLOOM_HQ || _BLOOM_LQ_DIRT || _BLOOM_HQ_DIRT
            #define BLOOM
            #if _BLOOM_LQ_DIRT || _BLOOM_HQ_DIRT
                #define BLOOM_DIRT
            #endif
        #endif

        TEXTURE2D_X(_SourceTex);
        TEXTURE2D_X(_Bloom_Texture);
        TEXTURE2D(_LensDirt_Texture);
        TEXTURE2D(_Grain_Texture);
        TEXTURE2D(_InternalLut);
        TEXTURE2D(_UserLut);
        TEXTURE2D(_BlueNoise_Texture);

        float4 _Lut_Params;
        float4 _UserLut_Params;
        float4 _Bloom_Params;
        float _Bloom_RGBM;
        float4 _LensDirt_Params;
        float _LensDirt_Intensity;
        float4 _Distortion_Params1;
        float4 _Distortion_Params2;
        float _Chroma_Params;
        half4 _Vignette_Params1;
        float4 _Vignette_Params2;
        float2 _Grain_Params;
        float4 _Grain_TilingParams;
        float4 _Bloom_Texture_TexelSize;
        float4 _Dithering_Params;

        #define DistCenter              _Distortion_Params1.xy
        #define DistAxis                _Distortion_Params1.zw
        #define DistTheta               _Distortion_Params2.x
        #define DistSigma               _Distortion_Params2.y
        #define DistScale               _Distortion_Params2.z
        #define DistIntensity           _Distortion_Params2.w

        #define ChromaAmount            _Chroma_Params.x

        #define BloomIntensity          _Bloom_Params.x
        #define BloomTint               _Bloom_Params.yzw
        #define BloomRGBM               _Bloom_RGBM.x
        #define LensDirtScale           _LensDirt_Params.xy
        #define LensDirtOffset          _LensDirt_Params.zw
        #define LensDirtIntensity       _LensDirt_Intensity.x

        #define VignetteColor           _Vignette_Params1.xyz
        #define VignetteCenter          _Vignette_Params2.xy
        #define VignetteIntensity       _Vignette_Params2.z
        #define VignetteSmoothness      _Vignette_Params2.w
        #define VignetteRoundness       _Vignette_Params1.w

        #define LutParams               _Lut_Params.xyz
        #define PostExposure            _Lut_Params.w
        #define UserLutParams           _UserLut_Params.xyz
        #define UserLutContribution     _UserLut_Params.w

        #define GrainIntensity          _Grain_Params.x
        #define GrainResponse           _Grain_Params.y
        #define GrainScale              _Grain_TilingParams.xy
        #define GrainOffset             _Grain_TilingParams.zw

        #define DitheringScale          _Dithering_Params.xy
        #define DitheringOffset         _Dithering_Params.zw

        float2 DistortUV(float2 uv)
        {
            // Note: this variant should never be set with XR
            #if _DISTORTION
            {
                uv = (uv - 0.5) * DistScale + 0.5;
                float2 ruv = DistAxis * (uv - 0.5 - DistCenter);
                float ru = length(float2(ruv));

                UNITY_BRANCH
                if (DistIntensity > 0.0)
                {
                    float wu = ru * DistTheta;
                    ru = tan(wu) * (rcp(ru * DistSigma));
                    uv = uv + ruv * (ru - 1.0);
                }
                else
                {
                    ru = rcp(ru) * DistTheta * atan(ru * DistSigma);
                    uv = uv + ruv * (ru - 1.0);
                }
            }
            #endif

            return uv;
        }

        half4 Frag(Varyings input) : SV_Target
        {
            UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

            float2 uv = UnityStereoTransformScreenSpaceTex(input.uv);
            float2 uvDistorted = DistortUV(uv);

            half3 color = (0.0).xxx;

            #if _CHROMATIC_ABERRATION
            {
                // Very fast version of chromatic aberration from HDRP using 3 samples and hardcoded
                // spectral lut. Performs significantly better on lower end GPUs.
                float2 coords = 2.0 * uv - 1.0;
                float2 end = uv - coords * dot(coords, coords) * ChromaAmount;
                float2 delta = (end - uv) / 3.0;

                half r = SAMPLE_TEXTURE2D_X(_SourceTex, sampler_LinearClamp, uvDistorted                ).x;
                half g = SAMPLE_TEXTURE2D_X(_SourceTex, sampler_LinearClamp, DistortUV(delta + uv)      ).y;
                half b = SAMPLE_TEXTURE2D_X(_SourceTex, sampler_LinearClamp, DistortUV(delta * 2.0 + uv)).z;

                color = half3(r, g, b);
            }
            #else
            {
                color = SAMPLE_TEXTURE2D_X(_SourceTex, sampler_LinearClamp, uvDistorted).xyz;
            }
            #endif

            // Gamma space... Just do the rest of Uber in linear and convert back to sRGB at the end
            #if UNITY_COLORSPACE_GAMMA
            {
                color = GetSRGBToLinear(color);
            }
            #endif

            #if defined(BLOOM)
            {
                #if _BLOOM_HQ && !defined(SHADER_API_GLES)
                half4 bloom = SampleTexture2DBicubic(TEXTURE2D_X_ARGS(_Bloom_Texture, sampler_LinearClamp), uvDistorted, _Bloom_Texture_TexelSize.zwxy, (1.0).xx, unity_StereoEyeIndex);
                #else
                half4 bloom = SAMPLE_TEXTURE2D_X(_Bloom_Texture, sampler_LinearClamp, uvDistorted);
                #endif

                #if UNITY_COLORSPACE_GAMMA
                bloom.xyz *= bloom.xyz; // γ to linear
                #endif

                UNITY_BRANCH
                if (BloomRGBM > 0)
                {
                    bloom.xyz = DecodeRGBM(bloom);
                }

                bloom.xyz *= BloomIntensity;
                color += bloom.xyz * BloomTint;

                #if defined(BLOOM_DIRT)
                {
                    // UVs for the dirt texture should be DistortUV(uv * DirtScale + DirtOffset) but
                    // considering we use a cover-style scale on the dirt texture the difference
                    // isn't massive so we chose to save a few ALUs here instead in case lens
                    // distortion is active.
                    half3 dirt = SAMPLE_TEXTURE2D(_LensDirt_Texture, sampler_LinearClamp, uvDistorted * LensDirtScale + LensDirtOffset).xyz;
                    dirt *= LensDirtIntensity;
                    color += dirt * bloom.xyz;
                }
                #endif
            }
            #endif

            // To save on variants we'll use an uniform branch for vignette. Lower end platforms
            // don't like these but if we're running Uber it means we're running more expensive
            // effects anyway. Lower-end devices would limit themselves to on-tile compatible effect
            // and thus this shouldn't too much of a problem (famous last words).
            UNITY_BRANCH
            if (VignetteIntensity > 0)
            {
                color = ApplyVignette(color, uvDistorted, VignetteCenter, VignetteIntensity, VignetteRoundness, VignetteSmoothness, VignetteColor);
            }

            // Color grading is always enabled when post-processing/uber is active
            {
                color = ApplyColorGrading(color, PostExposure, TEXTURE2D_ARGS(_InternalLut, sampler_LinearClamp), LutParams, TEXTURE2D_ARGS(_UserLut, sampler_LinearClamp), UserLutParams, UserLutContribution);
            }

            #if _FILM_GRAIN
            {
                color = ApplyGrain(color, uv, TEXTURE2D_ARGS(_Grain_Texture, sampler_LinearRepeat), GrainIntensity, GrainResponse, GrainScale, GrainOffset);
            }
            #endif

            // When Unity is configured to use gamma color encoding, we ignore the request to convert to gamma 2.0 and instead fall back to sRGB encoding
            #if _GAMMA_20 && !UNITY_COLORSPACE_GAMMA
            {
                color = LinearToGamma20(color);
            }
            // Back to sRGB
            #elif UNITY_COLORSPACE_GAMMA || _LINEAR_TO_SRGB_CONVERSION
            {
                color = GetLinearToSRGB(color);
            }
            #endif

            #if _DITHERING
            {
                color = ApplyDithering(color, uv, TEXTURE2D_ARGS(_BlueNoise_Texture, sampler_PointRepeat), DitheringScale, DitheringOffset);
                // Assume color > 0 and prevent 0 - ditherNoise.
                // Negative colors can cause problems if fed back to the postprocess via render to FP16 texture.
                color = max(color, 0);
            }
            #endif

            #if defined(DEBUG_DISPLAY)
            half4 debugColor = 0;

            if(CanDebugOverrideOutputColor(half4(color, 1), uv, debugColor))
            {
                return debugColor;
            }
            #endif

            half alpha = SAMPLE_TEXTURE2D_X(_SourceTex, sampler_LinearClamp, uvDistorted).w;
            return half4(color, alpha);
        }

    ENDHLSL

    SubShader
    {
        Tags { "RenderType" = "Opaque" "RenderPipeline" = "UniversalPipeline"}
        LOD 100
        ZTest Always ZWrite Off Cull Off

        Pass
        {
            Name "UberPost"

            HLSLPROGRAM
                #pragma vertex FullscreenVert
                #pragma fragment Frag
            ENDHLSL
        }
    }
}
  2. Changed UberPost to the alpha version by customizing the PostProcessData and assigning it to the Universal Renderer Data referenced by the URP Asset.

To give you an idea, my camera is set to Solid Color with an alpha value of 0, and the RenderTexture is R8G8B8A8_UNORM + D24_UNORM (the only one that works on my target machine, unfortunately).

However, despite following these steps, the RenderTexture is still not transparent as expected. Am I missing something crucial here?
If you’ve got any ideas or solutions, I’d be more than grateful. Thanks!

Yes, you should use the clear color (https://docs.unity3d.com/ScriptReference/Color-clear.html) with Background Type “Solid Color” on the camera that renders the post-processing effects and uses the customized UberPost.shader. In particular, ensure the customized UberPost.shader is used via the Renderer assigned to the camera:
[screenshot: camera settings with the custom Renderer assigned]
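If you prefer to set this up from script rather than in the inspector, here is a minimal sketch of the camera configuration described above (the camera and RenderTexture fields are placeholders):

using UnityEngine;

public class TransparentCaptureSetup : MonoBehaviour
{
    [SerializeField] private Camera effectsCamera;         // camera using the Renderer with the customized UberPost.shader
    [SerializeField] private RenderTexture targetTexture;  // RenderTexture with an alpha channel

    private void Awake()
    {
        // Background Type "Solid Color" with a fully transparent clear color (RGBA 0,0,0,0).
        effectsCamera.clearFlags = CameraClearFlags.SolidColor;
        effectsCamera.backgroundColor = Color.clear;
        effectsCamera.targetTexture = targetTexture;
    }
}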
In the settings of the Renderer, ensure your customized Post Process Data is used, which contains the customized UberPost.shader:
[screenshot: Renderer settings with the customized Post Process Data assigned]
When checking the preview of the effects camera’s RenderTexture, you will always see black pixels instead of transparent ones (at least in Unity 2022.3 and URP 14), so it looks something like this:
[screenshot: RenderTexture preview showing black where the texture is transparent]
Try exporting the RenderTexture to PNG to see the transparency. Here is a code snippet for a function that exports a RenderTexture to PNG:
ExportToPng

#nullable enable

using System.IO;

using UnityEditor;

using UnityEngine;

using File = UnityEngine.Windows.File;

namespace My.Common.Scripts.Editor
{
  /// <summary>
  /// Class for exporting PNG from textures.
  /// </summary>
  public static class TexturePngExport
  {
    /// <summary>
    /// Exports the selected <see cref="RenderTexture" /> to PNG file.
    /// </summary>
    [MenuItem("Tools/Export/Texture To PNG")]
    public static void ExportToPng()
    {
      if (Selection.activeObject is RenderTexture renderTexture)
      {
        var assetPath = AssetDatabase.GetAssetPath(renderTexture);
        var fullPath = Path.Combine(Application.dataPath, "..", assetPath);
        fullPath = Path.ChangeExtension(Path.GetFullPath(fullPath), "png");

        if (!File.Exists(fullPath) || EditorUtility.DisplayDialog("Confirmation", $"File for export '{fullPath}' already exists. Overwrite it?", "Yes", "No"))
        {
          var activeRenderTexture = RenderTexture.active;
          RenderTexture.active = renderTexture;
          var texture2D = new Texture2D(renderTexture.width, renderTexture.height, TextureFormat.RGBA32, false);
          texture2D.ReadPixels(new Rect(0, 0, renderTexture.width, renderTexture.height), 0, 0);
          texture2D.Apply(); // This line is necessary to make the pixel changes take effect
          RenderTexture.active = activeRenderTexture;

          var bytes = texture2D.EncodeToPNG();
          File.WriteAllBytes(fullPath, bytes);

          // Refresh AssetDatabase
          AssetDatabase.Refresh();

          Debug.Log($"Exported render texture as PNG using path '{fullPath}'");
        }
        else
        {
          Debug.Log("Use cancelled exported of the render texture as PNG");
        }
      }
      else
      {
        Debug.LogError("Not a render texture selected");
      }
    }
  }
}

Select the RenderTexture in the Project window of the Unity Editor and use Tools → Export → Texture To PNG. A PNG file is then created. That’s how it looks; the transparency can be seen when all steps have been done correctly:
[screenshot: exported PNG showing the transparency]

I also tested with the graphics format you mentioned and it works just like the settings I used. Be aware that your graphics format causes restrictions when using the bloom post-processing effect, because the RenderTexture then has no HDR color range.
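If the target hardware allows it, a half-float color format keeps the HDR range that bloom needs while still providing an alpha channel. Here is a minimal sketch of creating such a RenderTexture from script (the sizes are placeholders):

using UnityEngine;

public static class HdrRenderTextureFactory
{
    // ARGBHalf keeps an HDR color range and an alpha channel, which the bloom effect benefits from.
    public static RenderTexture CreateHdrTarget(int width, int height)
    {
        var rt = new RenderTexture(width, height, 24, RenderTextureFormat.ARGBHalf);
        rt.Create();
        return rt;
    }
}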

@FaithlessOne Thank you very much for your response!

Despite making the changes as per your advice, I am still facing difficulties in generating a transparent render texture.
I have attached the settings I used for your reference.

The exported PNG file:

However, the exported PNG file retains a red background, which is not the desired outcome.

I have been working on this issue for two days without any progress.
I would greatly appreciate any further assistance or possible solutions you may have.
Thank you in advance for your help!

@TCYeh
I tested using your camera settings and I can reproduce your result: the RenderTexture no longer contains the alpha values. The reason is the use of anti-aliasing. The post-processing AA techniques, including FXAA, SMAA and TAA (but not MSAA, which is geometry-based and applied earlier), run after the other post-processing effects. These techniques are designed to work on opaque textures, not on textures with transparency, so it is not surprising that they do not preserve alpha values in general. Interestingly, in URP 14 on the Windows platform SMAA seems to respect the alpha values, but why that is, and whether it holds on other platforms, is beyond my knowledge. So a safe solution is to avoid the post-processing AA techniques, or to use your own customized AA shader that respects transparency.
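If you want to enforce this from script rather than relying on the camera inspector, here is a minimal sketch of turning post-processing AA off on the effects camera (assuming URP’s camera extension API; the camera reference is a placeholder):

using UnityEngine;
using UnityEngine.Rendering.Universal;

public class DisablePostProcessAA : MonoBehaviour
{
    [SerializeField] private Camera effectsCamera;

    private void Awake()
    {
        // FXAA/SMAA/TAA run after the other post effects and generally discard alpha,
        // so turn them off on the camera that renders into the transparent RenderTexture.
        var cameraData = effectsCamera.GetUniversalAdditionalCameraData();
        cameraData.antialiasing = AntialiasingMode.None;
    }
}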


Would anyone know how to fix the issue of transparent materials, or opaque ones that don’t write alpha, when rendering to a texture? My result ends up see-through, whereas ideally an effect with transparency should be blended with the previous color while keeping the alpha value intact, and opaques should output alpha 1, but they output 0.

If I return this alpha from the UberPost shader

half alpha = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, uvDistorted).w;

I get this result, where anything transparent, or opaque with alpha 0, is see-through.

If I then add

alpha = alpha > 0.0 ? 1.0 : 0.0;

I get this result, but that does not work on the edges, where transparency between 0 and 1 is needed.

I understand that it could be due to individual shaders not writing the right alpha, but maybe there is something that could be patched in UberPost or the other shaders there to make this work regardless, like the way each blit is blended?

I had to disable Motion Blur in the post-processing to get the desired result in the Editor while not in Play Mode. It does not seem to play well with transparency for render textures.
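For anyone who wants to toggle that from script, here is a minimal sketch assuming a Volume whose profile contains a Motion Blur override (the volume reference is a placeholder):

using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class DisableMotionBlur : MonoBehaviour
{
    [SerializeField] private Volume postProcessVolume;

    private void Awake()
    {
        // Disable the Motion Blur override so it cannot flatten the alpha of the render texture.
        if (postProcessVolume.profile.TryGet(out MotionBlur motionBlur))
            motionBlur.active = false;
    }
}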

Nevertheless: as soon as I start the game, the problem persists. I followed all the advice here…

I made a 5-minute tutorial on YouTube for everyone looking for how to solve this:
