Introduction of Render Graph in the Universal Render Pipeline (URP)

Hello Unity community,

We are ready to share the new RenderGraph-based version of URP with you! You might have seen it on our roadmap over the last year, along with the many PRs landing in the Graphics repo. We are close to shipping it, and you can start trying it out in the latest 2023.3 alpha release! You can expect some changes during the alpha, especially based on your feedback. Currently it is still hidden behind a scripting define (see the documentation below).

With this post, we aim to share our work early and discuss with you the next steps. So let us know what you think!

Why RenderGraph?
Render Graph is a foundational system that automatically optimizes runtime rendering resources. It simplifies the development of render features in our render pipelines while improving performance across a wide range of pipeline configurations, and it reduces the likelihood of the bugs that manual optimization tends to introduce.

URP is highly extensible, and with RenderGraph, performance can now be automatically optimized for ScriptableRenderPasses added to your project. This leads to better GPU performance when you extend URP. As part of this project, we’ve improved RenderGraph to use the NativeRenderPass API, which optimizes GPU bandwidth on tile-based (mobile) GPUs.

The benefits for you are:

  • Enhanced Extensibility and Customization: The Render Graph API lets you access more frame resources in your custom passes and share data between passes. For example, you can now access the G-buffer for your effects.

  • Stricter and Safer API: The new APIs help ensure that your Renderer Features and custom passes are both robust across many platforms and optimized automatically. This prevents mistakes that would lead to rendering issues or performance problems.

  • Optimized GPU Performance: While this release focuses on the foundation, and we see potential to improve performance even further in future releases, current results already show an average improvement of 1 ms of GPU time per frame, significantly reducing bandwidth waste and improving both device thermals and battery life. You can also customize URP more easily to get more performance out of it.

What Changes?
All URP features have been converted to use RenderGraph under the hood. Apart from a slight difference in performance, nothing changes in your project if you haven’t extended URP.

The main difference is your access to RenderGraph in the modified ScriptableRenderPass class. This allows you to benefit from the automatic performance optimization that RenderGraph offers when extending URP. However, the new API is tightly coupled to the new foundation, so you’ll need to upgrade your Renderer Features and ScriptableRenderPass classes. The previous API will not work with RenderGraph.
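
In practice, upgrading means moving your pass logic into the new RecordRenderGraph override. Below is a minimal sketch of the new pass shape, assuming the alpha API shown in the samples later in this thread; the class and pass names are illustrative, not part of the URP API:

```csharp
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Minimal sketch of a RenderGraph-compatible pass; "MySketchPass" and
// "PassData" are illustrative names.
class MySketchPass : ScriptableRenderPass
{
    // Holds the data the render function needs at execution time
    private class PassData { }

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        using (var builder = renderGraph.AddRasterRenderPass<PassData>("My Sketch Pass", out var passData))
        {
            // Declare the resources the pass reads/writes here, e.g.
            // builder.UseTexture(...) and builder.UseTextureFragment(...)

            builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
            {
                // Issue rasterization commands via context.cmd here
            });
        }
    }
}
```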

You can find details on how to get started here:

  • Render Graph documentation
  • Code Samples can be found in the Package Manager samples (see reply #224)
  • There is a new Custom Post-Processing template that you can access in the assets window through Create > Rendering > URP Post-processing Effect (Renderer Feature with Volume) that highlights how to support RG and Non-RG at the same time (see reply #191)
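
The template mentioned above supports both paths by implementing both entry points on the same pass: when Render Graph is enabled Unity calls RecordRenderGraph, and otherwise it falls back to the old Execute override. A minimal sketch, assuming the current alpha API (the class name is illustrative):

```csharp
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

// Illustrative sketch: one pass supporting both RG and non-RG paths.
class DualPathPass : ScriptableRenderPass
{
    // Old path: called when Render Graph is disabled
    public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
    {
        // Non-RenderGraph implementation goes here
    }

    // New path: called when Render Graph is enabled
    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        // RenderGraph implementation goes here
    }
}
```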

We’d love to hear from you!
You can test RenderGraph in the 2023.3 alpha release (Unity 2023.3.0a18 or later; see the documentation above) and see how it works. We’d love to hear how it can benefit your project.

We encourage thoughts, questions, and constructive feedback as we progress towards the final stages of this feature. Your input is vital to us!

Stay tuned for upcoming details, updates, and insights related to this feature.

The render pipeline team


Updates:
09-Oct 2023: Edited min version number to 2023.3.0a8, since this reflects changes shown in the alpha documentation. Added a link to the “Perform a full screen blit in URP” file to the documentation.

15-Dec 2023: Updated documentation reflecting changes in 2023.3.0a18

  • For new projects, Render Graph in URP is now enabled by default in Unity 2023.3.0a18 and later.
  • Added Compatibility Mode (RenderGraph disabled)
  • New API to Set Global Textures
  • Renamed API UseTextureFragment to SetRenderAttachment in the RenderGraphBuilder
  • Introduction of Unsafe Passes
  • Updates to the Render Graph Viewer for debugging
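
As an illustration of the UseTextureFragment rename listed above, upgrading an existing pass is a one-line change on the render graph builder (a sketch; `destination` stands for any TextureHandle used as a color attachment):

```csharp
// Before (earlier 2023.3 alphas):
builder.UseTextureFragment(destination, 0);

// After (2023.3.0a18 and later):
builder.SetRenderAttachment(destination, 0);
```
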

I’m excited for the RenderGraph changes, but also nervous about the work involved. Are there any examples/tutorials that show how to modify an existing simple render feature, like a full screen blit? Also, do you have a very rough ETA of when this feature will be ‘on by default’, so that we can plan support for it?

Cheers,
Elliot

We are working on adding code examples for common use case scenarios. Here is an example render feature for a simple blit:

using UnityEngine;
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

public class CopyRenderFeature : ScriptableRendererFeature
{
    class CopyRenderPass : ScriptableRenderPass
    {
        // This class stores the data needed by the pass, passed as parameter to the delegate function that executes the pass
        private class PassData
        {
            internal TextureHandle src;
        }

        // This static method is used to execute the pass and passed as the RenderFunc delegate to the RenderGraph render pass
        static void ExecutePass(PassData data, RasterGraphContext context)
        {
            Blitter.BlitTexture(context.cmd, data.src, new Vector4(1,1,0,0), 0, false);
        }
      
        // This is where the renderGraph handle can be accessed.
        // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            string passName = "Copy To Debug Texture";
          
            // This simple pass copies the active color texture to a new texture. This sample is for API demonstrative purposes,
            // so the new texture is not used anywhere else in the frame, you can use the frame debugger to verify its contents.

            // add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
            using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
            {
                // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
                // The active color and depth textures are the main color and depth buffers that the camera renders into
                UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
              
                // Fill up the passData with the data needed by the pass
              
                // Get the active color texture through the frame data, and set it as the source texture for the blit
                passData.src = resourceData.activeColorTexture;
              
                // The destination texture is created here,
                // the texture is created with the same dimensions as the active color texture, but with no depth buffer, being a copy of the color texture
                // we also disable MSAA as we don't need multisampled textures for this sample
              
                UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
                RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
                desc.msaaSamples = 1;
                desc.depthBufferBits = 0;
              
                TextureHandle destination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "CopyTexture", false);
              
                // We declare the src texture as an input dependency to this pass, via UseTexture()
                builder.UseTexture(passData.src);

                // Setup as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget
                builder.UseTextureFragment(destination, 0);
              
                // We disable culling for this pass for the demonstrative purpose of this sample, as normally this pass would be culled,
                // since the destination texture is not used anywhere else
                builder.AllowPassCulling(false);

                // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
            }
        }
    }

    CopyRenderPass m_CopyRenderPass;

    /// <inheritdoc/>
    public override void Create()
    {
        m_CopyRenderPass = new CopyRenderPass();

        // Configures where the render pass should be injected.
        m_CopyRenderPass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
    }

    // Here you can inject one or multiple render passes in the renderer.
    // This method is called when setting up the renderer once per-camera.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_CopyRenderPass);
    }
}

And this one is a Blit pass using a custom material/shader:

Render Feature:

using UnityEngine;
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using UnityEngine.Serialization;

public class BlitWithMaterialRenderFeature : ScriptableRendererFeature
{
    class BlitWithMaterialPass : ScriptableRenderPass
    {
        private Material m_BlitMaterial;
        
        public BlitWithMaterialPass(Material blitMaterial)
        {
            m_BlitMaterial = blitMaterial;
        }
        
        // This class stores the data needed by the pass, passed as parameter to the delegate function that executes the pass
        private class PassData
        {
            internal TextureHandle src;
            internal TextureHandle dst;
            internal Material blitMaterial;
        }

        // This static method is used to execute the pass and passed as the RenderFunc delegate to the RenderGraph render pass
        static void ExecutePass(PassData data, RasterGraphContext context)
        {
            Blitter.BlitTexture(context.cmd, data.src, new Vector4(1, 1, 0, 0), data.blitMaterial, 0);
        }

        private void InitPassData(RenderGraph renderGraph, ContextContainer frameData, ref PassData passData)
        {
            // Fill up the passData with the data needed by the passes
            
            // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
            // The active color and depth textures are the main color and depth buffers that the camera renders into
            UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
            
            // The destination texture is created here, 
            // the texture is created with the same dimensions as the active color texture, but with no depth buffer, being a copy of the color texture
            // we also disable MSAA as we don't need multisampled textures for this sample
                
            UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
            RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
            desc.msaaSamples = 1;
            desc.depthBufferBits = 0;
                
            TextureHandle destination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "BlitMaterialTexture", false);
            
            passData.src = resourceData.activeColorTexture;
            passData.dst = destination;
            passData.blitMaterial = m_BlitMaterial;
        }
        
        // This is where the renderGraph handle can be accessed.
        // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            string passName = "Blit With Material";
            
            // This simple pass copies the active color texture to a new texture using a custom material. This sample is for API demonstrative purposes,
            // so the new texture is not used anywhere else in the frame, you can use the frame debugger to verify its contents.

            // add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
            using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
            {
                // Initialize the pass data
                InitPassData(renderGraph, frameData, ref passData);

                // We declare the src texture as an input dependency to this pass, via UseTexture()
                builder.UseTexture(passData.src);

                // Setup as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget
                builder.UseTextureFragment(passData.dst, 0);
                
                // We disable culling for this pass for the demonstrative purpose of this sample, as normally this pass would be culled,
                // since the destination texture is not used anywhere else
                builder.AllowPassCulling(false);

                // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
            }
        }
    }

    BlitWithMaterialPass m_BlitWithMaterialPass;
    
    public Material m_BlitColorMaterial;

    /// <inheritdoc/>
    public override void Create()
    {
        m_BlitWithMaterialPass = new BlitWithMaterialPass(m_BlitColorMaterial);

        // Configures where the render pass should be injected.
        m_BlitWithMaterialPass.renderPassEvent = RenderPassEvent.BeforeRenderingTransparents;
    }

    // Here you can inject one or multiple render passes in the renderer.
    // This method is called when setting up the renderer once per-camera.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_BlitWithMaterialPass);
    }
}

Shader:

Shader "BlitWithMaterial"
{
   SubShader
   {
       Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
       ZWrite Off Cull Off
       Pass
       {
           Name "BlitWithMaterialPass"

           HLSLPROGRAM
           #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
           #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

           #pragma vertex Vert
           #pragma fragment Frag

           // Our Frag function takes as input a struct that contains the screen space coordinate we are going to use to sample our texture.
           // It also writes to SV_Target0; this has to match the index set in the UseTextureFragment(sourceTexture, 0, …) we defined in our render pass script.
           float4 Frag(Varyings input) : SV_Target0
           {
               // This is needed so we account for XR platform differences in how they handle texture arrays
               UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

               // Sample the texture using the SAMPLE_TEXTURE2D_X_LOD macro
               float2 uv = input.texcoord.xy;
               half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);
            
               // Modify the sampled color
               return half4(0, 1, 0, 1) * color;
           }

           ENDHLSL
       }
   }
}

Is there going to be a period where both routes are supported, or is the intention to move URP wholly to RenderGraph when the time comes? I’d seen the GitHub repo previously, where it had elements of both, but it wasn’t clear if that was just while getting things running (there was a fair bit of duplication as a result).

You can also try out the KeepFrame sample in the URP package samples. It’s upgraded to RenderGraph.

The idea is to have this on by default in 23.3. We’re still building confidence before making that decision, though.

It’s indeed the intention to move URP wholly to RenderGraph when the time comes.

I hope “ScriptableRendererFeature” will be removed? In HDRP I can use a simple Volume feature at runtime, without manually adding countless features through the editor.
P.S. Right now the only way is to use “UniversalAdditionalCameraData.scriptableRenderer.EnqueuePass”.

I hope that with render graph I can use the same universal custom pass API for URP and HDRP, or will there be two different versions again?

If you plan to completely break the old URP API, I would be glad if it were a single API for URP and HDRP. I’m begging.


Hello everyone, here’s a link to an example of how to blit using the Render Graph API and the Blitter API. Let us know about any API feedback and we will update the API and docs accordingly.

9385658–1312931–DrawFullscreen pass with URP and Render Graph [Public].pdf (1.69 MB)


The thing that worries me most about API changes to URP is if it causes a loss of functionality. If we identify things that you can’t do with the new API that you could do with the old, will the team be receptive to those changes? Historically it feels like most URP suggestions are ignored - like the pipeline is going wherever it’s been decided to go, regardless of what users expect from it.

The render pass interface RecordRenderGraph(RenderGraph, ContextContainer frameData) has been designed so it can be adopted by HDRP. HDRP currently doesn’t expose RenderGraph in its CustomPass. HDRP will not adopt it in 23 yet, but for the next version we plan to unify the extension APIs using this new interface.


Yes, very much so. Our goal is to have no functional regressions; you should be able to do more with the new API, not less, and it’s a top priority to fix any regression you find. However, the old API was not as thoroughly designed and offered far fewer guardrails, so some things might have worked by accident (on some platforms) that the stricter API now prevents.


You should be able to do more things with the new API that were not possible before, e.g. accessing the actual RTHandle of every single resource, or using framebuffer fetch and native render passes enabled by default on TBDR devices.

As Aljosha said, there might have been “undefined behaviour” hacks that worked out of luck before, being undefined or technically incorrect. In those cases you will need to find a proper way to implement them, since the API is now much safer and, as a consequence, stricter.

Of course, if there is any missing valid functionality, our priority is to fix it ASAP, and that’s why we are asking for feedback ahead of time.


Adding a few more preview samples:

How to draw geometry using RendererLists + RenderGraph (replacing the old cmd.DrawRenderers)

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.RendererUtils;
using UnityEngine.Rendering.Universal;

public class RenderListRenderFeature : ScriptableRendererFeature
{
    class RendererListPass : ScriptableRenderPass
    {
        // Layer mask used to filter objects to put in the renderer list
        private LayerMask m_LayerMask;
       
        // List of shader tags used to build the renderer list
        private List<ShaderTagId> m_ShaderTagIdList = new List<ShaderTagId>();

        public RendererListPass(LayerMask layerMask)
        {
            m_LayerMask = layerMask;
        }
       
        // This class stores the data needed by the pass, passed as parameter to the delegate function that executes the pass
        private class PassData
        {
            public RendererListHandle rendererListHandle;
        }

        // Sample utility method that showcases how to create a renderer list via the RenderGraph API
        private void InitRendererLists(ContextContainer frameData, ref PassData passData, RenderGraph renderGraph)
        {
            // Access the relevant frame data from the Universal Render Pipeline
            UniversalRenderingData universalRenderingData = frameData.Get<UniversalRenderingData>();
            UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
            UniversalLightData lightData = frameData.Get<UniversalLightData>();
           
            var sortFlags = cameraData.defaultOpaqueSortFlags;
            RenderQueueRange renderQueueRange = RenderQueueRange.opaque;
            FilteringSettings filterSettings = new FilteringSettings(renderQueueRange, m_LayerMask);
           
            ShaderTagId[] forwardOnlyShaderTagIds = new ShaderTagId[]
            {
                new ShaderTagId("UniversalForwardOnly"),
                new ShaderTagId("UniversalForward"),
                new ShaderTagId("SRPDefaultUnlit"), // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
                new ShaderTagId("LightweightForward") // Legacy shaders (do not have a gbuffer pass) are considered forward-only for backward compatibility
            };
           
            m_ShaderTagIdList.Clear();
           
            foreach (ShaderTagId sid in forwardOnlyShaderTagIds)
                m_ShaderTagIdList.Add(sid);
           
            DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(m_ShaderTagIdList, universalRenderingData, cameraData, lightData, sortFlags);

            var param = new RendererListParams(universalRenderingData.cullResults, drawSettings, filterSettings);
            passData.rendererListHandle = renderGraph.CreateRendererList(param);
        }

        // This static method is used to execute the pass and passed as the RenderFunc delegate to the RenderGraph render pass
        static void ExecutePass(PassData data, RasterGraphContext context)
        {
            context.cmd.ClearRenderTarget(RTClearFlags.Color, Color.green, 1,0);
           
            context.cmd.DrawRendererList(data.rendererListHandle);
        }
       
        // This is where the renderGraph handle can be accessed.
        // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            string passName = "RenderList Render Pass";
           
            // This simple pass clears the current active color texture, then renders the scene geometry associated to the m_LayerMask layer.
            // Add scene geometry to your own custom layers and experiment switching the layer mask in the render feature UI.
            // You can use the frame debugger to inspect the pass output

            // add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
            using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
            {
                // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
                // The active color and depth textures are the main color and depth buffers that the camera renders into
                UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
               
                // Fill up the passData with the data needed by the pass
                InitRendererLists(frameData, ref passData, renderGraph);
               
                // Make sure the renderer list is valid
                if (!passData.rendererListHandle.IsValid())
                    return;
               
                // We declare the RendererList we just created as an input dependency to this pass, via UseRendererList()
                builder.UseRendererList(passData.rendererListHandle);
               
                // Setup as a render target via UseTextureFragment and UseTextureFragmentDepth, which are the equivalent of using the old cmd.SetRenderTarget(color,depth)
                builder.UseTextureFragment(resourceData.activeColorTexture, 0);
                builder.UseTextureFragmentDepth(resourceData.activeDepthTexture, IBaseRenderGraphBuilder.AccessFlags.Write);

                // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecutePass(data, context));
            }
        }
    }

    RendererListPass m_ScriptablePass;

    public LayerMask m_LayerMask;

    /// <inheritdoc/>
    public override void Create()
    {
        m_ScriptablePass = new RendererListPass(m_LayerMask);

        // Configures where the render pass should be injected.
        m_ScriptablePass.renderPassEvent = RenderPassEvent.AfterRenderingOpaques;
    }

    // Here you can inject one or multiple render passes in the renderer.
    // This method is called when setting up the renderer once per-camera.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_ScriptablePass);
    }
}

Framebuffer fetch sample:

Feature:

using UnityEngine;
using UnityEngine.Experimental.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;
using UnityEngine.Serialization;

public class FrameBufferFetchRenderFeature : ScriptableRendererFeature
{
    class FrameBufferFetchPass : ScriptableRenderPass
    {
        private Material m_BlitMaterial;
        private Material m_FBFetchMaterial;
        
        public FrameBufferFetchPass(Material blitMaterial, Material fbFetchMaterial)
        {
            m_BlitMaterial = blitMaterial;
            m_FBFetchMaterial = fbFetchMaterial;
        }
        
        // This class stores the data needed by the pass, passed as parameter to the delegate function that executes the pass
        private class PassData
        {
            internal TextureHandle src;
            internal Material material;
        }

        // This static method is used to execute the pass and passed as the RenderFunc delegate to the RenderGraph render pass
        static void ExecuteBlitPass(PassData data, RasterGraphContext context)
        {
            Blitter.BlitTexture(context.cmd, data.src, new Vector4(1, 1, 0, 0), data.material, 0);
        }
        
        // This static method is used to execute the pass and passed as the RenderFunc delegate to the RenderGraph render pass
        static void ExecuteFBFetchPass(PassData data, RasterGraphContext context)
        {
            context.cmd.DrawProcedural(Matrix4x4.identity, data.material, 1, MeshTopology.Triangles, 3, 1, null);
            
            // other ways to draw a fullscreen triangle/quad:
            //CoreUtils.DrawFullScreen(context.cmd, data.material, null, 1);
            //Blitter.BlitTexture(context.cmd, new Vector4(1, 1, 0, 0), data.material, 1);
        }

        private void BlitPass(RenderGraph renderGraph, ContextContainer frameData, TextureHandle destination)
        {
            string passName = "InitialBlitPass";
            
            // This simple pass copies the active color texture to a new texture using a custom material. This sample is for API demonstrative purposes,
            // so the new texture is not used anywhere else in the frame, you can use the frame debugger to verify its contents.

            // add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
            using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
            {
                // UniversalResourceData contains all the texture handles used by the renderer, including the active color and depth textures
                // The active color and depth textures are the main color and depth buffers that the camera renders into
                UniversalResourceData resourceData = frameData.Get<UniversalResourceData>();
                
                // Get the active color texture through the frame data, and set it as the source texture for the blit
                passData.src = resourceData.activeColorTexture;
                passData.material = m_BlitMaterial;
                
                // We declare the src texture as an input dependency to this pass, via UseTexture()
                builder.UseTexture(passData.src);

                // Setup as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget
                builder.UseTextureFragment(destination, 0);
                
                // We disable culling for this pass for the demonstrative purpose of this sample, as normally this pass would be culled,
                // since the destination texture is not used anywhere else
                builder.AllowPassCulling(false);

                // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecuteBlitPass(data, context));
            }
        }
        
        private void FBFetchPass(RenderGraph renderGraph, ContextContainer frameData, TextureHandle source, TextureHandle destination)
        {
            string passName = "FrameBufferFetchPass";
            
            // This simple pass copies the target of the previous pass to a new texture using a custom material and framebuffer fetch. This sample is for API demonstrative purposes,
            // so the new texture is not used anywhere else in the frame, you can use the frame debugger to verify its contents.

            // add a raster render pass to the render graph, specifying the name and the data type that will be passed to the ExecutePass function
            using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
            {
                // Fill the pass data
                passData.material = m_FBFetchMaterial;
                
                // We declare the source texture as a framebuffer input attachment via UseTextureFragmentInput(),
                // instead of as a regular input via UseTexture()
                builder.UseTextureFragmentInput(source, 0, IBaseRenderGraphBuilder.AccessFlags.Read);

                // Setup as a render target via UseTextureFragment, which is the equivalent of using the old cmd.SetRenderTarget
                builder.UseTextureFragment(destination, 0);
                
                // We disable culling for this pass for the demonstrative purpose of this sample, as normally this pass would be culled,
                // since the destination texture is not used anywhere else
                builder.AllowPassCulling(false);

                // Assign the ExecutePass function to the render pass delegate, which will be called by the render graph when executing the pass
                builder.SetRenderFunc((PassData data, RasterGraphContext context) => ExecuteFBFetchPass(data, context));
            }
        }
        
        // This is where the renderGraph handle can be accessed.
        // Each ScriptableRenderPass can use the RenderGraph handle to add multiple render passes to the render graph
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            // This pass showcases how to implement framebuffer fetch: this is an advanced TBDR GPU optimization
            // that allows subpasses to read the output of previous subpasses directly from the framebuffer, greatly reducing bandwidth usage.
            // The first pass BlitPass simply copies the Camera Color in a temporary render target, the second pass FBFetchPass copies the temporary render target
            // to another render target using framebuffer fetch.
            // As a result, the passes are merged (you can verify in the RenderGraph Visualizer) and the bandwidth usage is reduced, since we can discard the temporary render target.

            // The destination textures are created here.
            // They have the same dimensions as the active color texture, but with no depth buffer, since they are copies of the color texture.
            // We also disable MSAA, as we don't need multisampled textures for this sample.
                
            UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
            RenderTextureDescriptor desc = cameraData.cameraTargetDescriptor;
            desc.msaaSamples = 1;
            desc.depthBufferBits = 0;
                
            TextureHandle blitDestination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "BlitDestTexture", false);
            TextureHandle fbFetchDestination = UniversalRenderer.CreateRenderGraphTexture(renderGraph, desc, "FBFetchDestTexture", false);
            
            BlitPass(renderGraph, frameData, blitDestination);
            
            FBFetchPass(renderGraph, frameData, blitDestination, fbFetchDestination);
        }
    }

    FrameBufferFetchPass m_FbFetchPass;
    
    public Material m_BlitColorMaterial;
    public Material m_FBFetchMaterial;

    /// <inheritdoc/>
    public override void Create()
    {
        m_FbFetchPass = new FrameBufferFetchPass(m_BlitColorMaterial, m_FBFetchMaterial);

        // Configures where the render pass should be injected.
        m_FbFetchPass.renderPassEvent = RenderPassEvent.BeforeRenderingTransparents;
    }

    // Here you can inject one or multiple render passes in the renderer.
    // This method is called when setting up the renderer once per-camera.
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        renderer.EnqueuePass(m_FbFetchPass);
    }
}

Shader:

Shader "FrameBufferFetch"
{
   SubShader
   {
       Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
       ZWrite Off Cull Off
       Pass
       {
           Name "InitialBlit"

           HLSLPROGRAM
           #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
           #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

           #pragma vertex Vert
           #pragma fragment Frag

           // Our frag function takes as input a struct that contains the screen space coordinate we are going to use to sample our texture. It writes to SV_Target0, which has to match the index set in the UseTextureFragment(sourceTexture, 0, …) we defined in our render pass script.
           float4 Frag(Varyings input) : SV_Target0
           {
               // this is needed so we account for XR platform differences in how they handle texture arrays
               UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

               // sample the texture using the SAMPLE_TEXTURE2D_X_LOD macro
               float2 uv = input.texcoord.xy;
               half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);
              
               // Modify the sampled color
               return color;
           }

           ENDHLSL
       }
      
       Pass
       {
           Name "FrameBufferFetch"

           HLSLPROGRAM
           #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
           #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

           #pragma vertex Vert
           #pragma fragment Frag

           FRAMEBUFFER_INPUT_X_HALF(0);

           // Our frag function reads the output of the previous pass directly via framebuffer fetch. It writes to SV_Target0, which has to match the index set in the UseTextureFragment(destination, 0) we defined in our render pass script.
           float4 Frag(Varyings input) : SV_Target0
           {
               // this is needed so we account for XR platform differences in how they handle texture arrays
               UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);

               // read the current pixel directly from the framebuffer input, without a texture sample
               half4 color = LOAD_FRAMEBUFFER_X_INPUT(0, input.positionCS.xy);
              
               // Modify the sampled color
               return half4(0,0,1,1) * color;
           }

           ENDHLSL
       }
   }
}

How do we draw geometry without hardcoded culling? How do we supply our own list of renderers/meshes/submeshes?

cmd.DrawMesh can be used as usual:
https://docs.unity3d.com/ScriptReference/Rendering.CommandBuffer.DrawMesh.html

Note that it will not add any lighting or anything else from the scene; it is not a replacement for drawing renderers.
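
To make that concrete, here is a rough, hedged sketch of drawing a single mesh with cmd.DrawMesh inside a RenderGraph raster pass. The class and method names (DrawMeshPassData, AddDrawMeshPass) are illustrative placeholders, not part of the sample above, and the API may shift during the alpha:

```csharp
// Sketch: a RenderGraph pass that draws one mesh with cmd.DrawMesh.
// No scene lighting or culling is applied; the mesh/material come from the feature itself.
class DrawMeshPassData
{
    public Mesh mesh;
    public Material material;
    public Matrix4x4 matrix;
}

void AddDrawMeshPass(RenderGraph renderGraph, TextureHandle target, Mesh mesh, Material material)
{
    using (var builder = renderGraph.AddRasterRenderPass<DrawMeshPassData>("DrawMeshPass", out var passData))
    {
        passData.mesh = mesh;
        passData.material = material;
        passData.matrix = Matrix4x4.identity;

        // Bind the color attachment, then draw the mesh directly in the render func
        builder.UseTextureFragment(target, 0);
        builder.SetRenderFunc((DrawMeshPassData data, RasterGraphContext context) =>
            context.cmd.DrawMesh(data.mesh, data.matrix, data.material, 0, 0));
    }
}
```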

RendererList is the new API used for this by both URP and HDRP, and it gives you the same functionality as DrawRenderers:

https://docs.unity3d.com/ScriptReference/Rendering.RendererList.html

Note that this is not a RenderGraph-related change; URP has been using RendererLists since the 22 release.
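
For reference, a rough sketch of driving a RendererList from a RenderGraph pass follows. It assumes the frameData types shown earlier in this thread (UniversalRenderingData, UniversalCameraData, UniversalLightData) and a CreateDrawingSettings overload taking those types; exact signatures may change during the alpha:

```csharp
class RendererListPassData
{
    public RendererListHandle rendererList;
}

void AddDrawObjectsPass(RenderGraph renderGraph, ContextContainer frameData, TextureHandle color, TextureHandle depth)
{
    using (var builder = renderGraph.AddRasterRenderPass<RendererListPassData>("DrawObjectsPass", out var passData))
    {
        UniversalRenderingData renderingData = frameData.Get<UniversalRenderingData>();
        UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
        UniversalLightData lightData = frameData.Get<UniversalLightData>();

        // Build a renderer list from the camera's culling results, filtered by shader tag and render queue
        ShaderTagId shaderTag = new ShaderTagId("UniversalForward");
        DrawingSettings drawSettings = RenderingUtils.CreateDrawingSettings(shaderTag, renderingData, cameraData, lightData, SortingCriteria.CommonOpaque);
        FilteringSettings filterSettings = new FilteringSettings(RenderQueueRange.opaque);
        RendererListParams listParams = new RendererListParams(renderingData.cullResults, drawSettings, filterSettings);
        passData.rendererList = renderGraph.CreateRendererList(listParams);

        // Declare the list and the attachments as dependencies of this pass
        builder.UseRendererList(passData.rendererList);
        builder.UseTextureFragment(color, 0);

        builder.SetRenderFunc((RendererListPassData data, RasterGraphContext context) =>
            context.cmd.DrawRendererList(data.rendererList));
    }
}
```

Unlike cmd.DrawMesh, this path goes through the scene's culling results, so only visible renderers matching the filter are drawn.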

Is there any plan to add API alternatives that are a bit less boilerplate-heavy? I guess I could just copy-paste your example, but it feels like a lot to require over 50 lines of code (excluding whitespace and comments!) to implement “get a named asset that blits a material to the screen”.

All in all this looks good, but I’d love to see some higher level features. That’d achieve two things:

  • It makes simple versions of the feature easier to find, so using this is achievable without deep knowledge of a pretty low-level API
  • It puts the onus on you to maintain the high-level feature, so we don’t have to rewrite our code with every Unity version update if we just want a simple blit to screen.

Yeah, as you can see, most of the RG setup code across the different samples I posted is 90% the same. We plan to add as many high-level wrappers for the most common operations as possible, so eventually the average user's render feature should become a few lines of code, e.g. Blit(rg, source, target, material). This way, “high level” non-advanced users ideally shouldn’t be exposed to the RG itself at all.

The low-level API is more verbose but more powerful and allows for much more customization; the next step on our side will be about making it more user-friendly.

Since this is a call for early feedback, we just want users to start using the low-level API and give feedback on that.
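
To make the discussion concrete, a hypothetical Blit(rg, source, target, material) helper like the one mentioned above could wrap the boilerplate roughly as follows. This wrapper does not exist in the current alpha; the names and the implicit TextureHandle-to-RTHandle conversion inside the render func are assumptions based on the samples in this thread:

```csharp
class BlitPassData
{
    public Material material;
    public TextureHandle source;
}

// Hypothetical convenience wrapper: adds a fullscreen blit pass to the graph
static void Blit(RenderGraph renderGraph, TextureHandle source, TextureHandle target, Material material)
{
    using (var builder = renderGraph.AddRasterRenderPass<BlitPassData>("BlitPass", out var passData))
    {
        passData.material = material;
        passData.source = source;

        // Declare the read dependency and the color attachment
        builder.UseTexture(source, IBaseRenderGraphBuilder.AccessFlags.Read);
        builder.UseTextureFragment(target, 0);

        builder.SetRenderFunc((BlitPassData data, RasterGraphContext context) =>
            Blitter.BlitTexture(context.cmd, data.source, new Vector4(1, 1, 0, 0), data.material, 0));
    }
}
```

With something like this, the 50+ line render feature collapses to a single call per frame, while still compiling down to the same graph pass.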
