How to Blit in URP 12 - Documentation Needed

Please see: URP 14 Blitter Overview

Edit #2: A lot has changed since I posted this. See this post and others for more recent details about changes for Blitting, the SRP Blitter Class, and other miscellaneous details about RTHandles.

Edit #1: Jump to @ManueleB 's message for a summary of URP blitting as of Unity 2021/URP 12.

I don’t quite understand why there is so little communication or documentation about blitting in SRP.

Recently there was a new section added to the documentation – “How To” – which includes a blit tutorial with example code: How to perform a full screen blit in Single Pass Instanced rendering in XR

Funnily enough, the example code doesn’t use any of Unity’s many different included blit functions.

I recently talked about this on another forum post.

I know there are currently issues with cmd.Blit() and XR, but that is no reason for there to be so few examples of using blit outside of internal SRP code and the latest Photomode package.

In the Photomode package, the ScriptableRenderPass BlitRenderPass uses Blit(CommandBuffer, source, destination, Material, passIndex).

However, the destination texture (“_AfterPostProcessTexture”) is an existing texture from the SRP pipeline, not a user-created one. Anecdotally, I consistently see posts about custom shaders/materials failing with blit, resulting in gray or black textures being blitted to the screen.
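For context, a minimal ScriptableRenderPass using that helper looks roughly like this (a sketch, not the actual package code; m_Destination, m_Material, and m_PassIndex are hypothetical fields assumed to be set up by the renderer feature):

//Script (C#) - rough sketch of the Blit(cmd, source, destination, material, passIndex) pattern
public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
    CommandBuffer cmd = CommandBufferPool.Get("BlitRenderPass");
    // ScriptableRenderPass.Blit sets the render target and draws with the material's pass
    Blit(cmd, renderingData.cameraData.renderer.cameraColorTarget, m_Destination, m_Material, m_PassIndex);
    context.ExecuteCommandBuffer(cmd);
    CommandBufferPool.Release(cmd);
}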

There are very few examples where the blit destination is a user-created texture.

One example of this can be found in this HDRP post. In that specific case, the solution was to use the XR macro SAMPLE_TEXTURE2D_X, since HDRP uses XR macros for render targets to support stereo rendering.*

*Those macros are defined differently (in Core.hlsl) based on the texture dimension set when the RTHandle system is initialized.
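For illustration, an XR-safe sampling pattern looks roughly like this in HLSL (a sketch; the _SourceTex name and Varyings layout are assumptions, the macros are the ones mentioned above):

//Shader (HLSL) - rough sketch of XR-safe texture sampling
TEXTURE2D_X(_SourceTex);
SAMPLER(sampler_SourceTex);

half4 Frag(Varyings input) : SV_Target
{
    // Select the correct eye slice when single-pass instanced stereo is active
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
    // SAMPLE_TEXTURE2D_X resolves to a texture-array sample in stereo, a plain 2D sample otherwise
    return SAMPLE_TEXTURE2D_X(_SourceTex, sampler_SourceTex, input.uv);
}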

8 Likes

I made some examples about this a while ago. I put the old-style cmd.Blit approach there first, and you can then follow the commits to see what kind of changes are required or possible to upgrade it to the current setup: Commits · 0lento/URP_RendererFeature_SPI_Example · GitHub

If you don’t want to use OpaqueTexture, just ignore the last commit there. Also worth noting that 2022.1+ URP will have an HDRP-style RT handling scheme, so that will change things slightly (it doesn’t require huge changes to a simplified sample like this).

1 Like

I agree we need some official blog post or something. I’ve seen a lot of Unity internal code being inconsistent and doing stuff like this:

Camera camera = rd.cameraData.camera;
cmd.SetViewProjectionMatrices(Matrix4x4.identity, Matrix4x4.identity);
cmd.DrawMesh(RenderingUtils.fullscreenMesh, Matrix4x4.identity, material, 0, pass);
cmd.SetViewProjectionMatrices(camera.worldToCameraMatrix, camera.projectionMatrix);

This is blit-like, but won’t set all the same states, so it might behave differently with MSAA on vs. off (easy to fix in the shader, but a real WTF moment if it comes up two months after you wrote the original code).

(EDIT: that’s what the page linked by OP suggests – drawing the mesh directly instead of using Blit)

1 Like

I forgot to mention how many different blit shaders and HLSL files are in use as well. It makes sense to have so many blit options considering how versatile and ubiquitous blitting is throughout the render pipeline, but that doesn’t make it any less confusing. Many of the internal passes have their own ways of blitting, including drawing the mesh directly.

Here is an assortment of many internal files I could remember and those that appear when you search for blit in the Graphics repository. All files listed are hyperlinks.
SRP Core

Core RP:
/Runtime/Utilities/
Blit.hlsl
BlitColorAndDepth.hlsl
Blitter.cs - mentioned previously.
CoreUtils.cs

URP

Universal RP:
/Shaders
/Utils
Blit.shader
CopyDepth.shader
CopyDepthPass.hlsl
CoreBlit.shader
CoreBlitColorAndDepth.shader - currently unused.
Fullscreen.hlsl
/PostProcessing
• many files in this folder contain TEXTURE2D_X(_SourceTex) which appears in other URP Blit implementations.
/Runtime
/Passes
CopyColorPass.cs
CopyDepthPass.cs
FinalBlitPass.cs
PostProcessPass.cs
/RendererFeatures
ScreenSpaceShadows.cs - uses cmd.Blit();
RenderingUtils.cs

HDRP

High Definition RP:
/Runtime
/Compositor/Shaders
CustomClear.shader
/Core/CoreResources
BlitCubeTextureFace.shader
/Lighting/Shadow/
ShadowBlit.shader
/Debug
DebugBlitQuad.shader
DebugVTBlit.shader
/RenderPipeline/Utility/
HDUtils.cs
/ShaderLibrary
Blit.shader
BlitColorAndDepth.shader
BlitCubemap.shader
CopyDepthBuffer.shader

Post Processing

Post Processing:
/PostProcessing
/Shaders/Builtins/
CopyStd.shader
CopyStdFromDoubleWide.shader
CopyStdFromTexArray.shader
/Runtime/Effects
Bloom.cs - uses cmd.BlitFullScreenTriangle();
DepthOfField.cs - uses cmd.BlitFullScreenTriangle();

and many more.

(As of December 14, 2021)

April 13, 2023 Edit:
Fixed URLs to redirect to correct directories in Unity-Technologies/Graphics.
Since posting this, Unity reorganized the Graphics repo and added a new Packages/ subfolder.

3 Likes

For historical reasons URP doesn’t currently have a proper standard in terms of “how to Blit”.
I agree that this is very confusing, and it is something we are trying to document better as we speak.
This is an important ongoing discussion at the moment, since reviewing the state of our blits is also an important preparation step for some core changes in URP: better NativeRenderPass and RenderGraph support. Following these best practices early on existing projects should also make your life easier when upgrading to the next URP releases.

I’ll try to do a quick summary here, but you should expect documentation coming soon:

  1. cmd.Blit() should not be used. The main reason is that it is a bit of a “black box”: it contains a lot of built-in logic for changing states, binding textures, and setting render targets. All of this happens under the hood, not transparently from an SRP point of view, which can cause issues. Other big problems with cmd.Blit(): it “breaks” NativeRenderPass and RenderGraph compatibility, so any pass using cmd.Blit will not be able to take advantage of these, and it doesn’t work well in XR. Its usage might also be deprecated in future URP versions.

  2. The same obviously applies to any utilities/wrappers relying on cmd.Blit() internally, so for example RenderingUtils.Blit should be avoided as well.

  3. The current How to perform a full screen blit in Single Pass Instanced rendering in XR page is a good example to follow. Under the hood, cmd.Blit() does pretty much the same, except here everything is handled at the SRP level, which is the way to go. I think the fact that the page mentions XR is a bit confusing, since this is a perfectly valid way of doing a blit on all platforms, so I am looking at updating that page. It also needs some sample code changes*, since in its current state it no longer works on 22.1 because of the recently introduced RTHandles support.

  4. The SRP Blit API “to use” is the Core Blitter: it is already used by other pipelines, and refactoring our existing passes to use it instead of cmd.Blit is on our short-term roadmap. This might also include modifying and improving the Blitter API to accommodate any URP requirements, so expect some changes soon in this area.
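To make point 4 concrete, a minimal use of the Core Blitter on an RTHandles-based URP (2022.1+) might look roughly like this (a sketch under those assumptions; m_TempTarget and m_Material are hypothetical fields, and the material’s shader is expected to sample the _BlitTexture that the Blitter binds):

//Script (C#) - rough sketch of Core Blitter usage on 2022.1+
public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
    CommandBuffer cmd = CommandBufferPool.Get("CoreBlitterPass");
    RTHandle source = renderingData.cameraData.renderer.cameraColorTargetHandle;
    // Blit the camera color into a temporary RTHandle through a custom material
    Blitter.BlitCameraTexture(cmd, source, m_TempTarget, m_Material, 0);
    context.ExecuteCommandBuffer(cmd);
    CommandBufferPool.Release(cmd);
}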

*a quick preview of the sample code changes needed for RTHandles support:

Replace the content of AddRenderPasses and override the new SetupRenderPasses callback (more info in the RTHandles upgrade guide)

public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
    if (renderingData.cameraData.cameraType == CameraType.Game)
        renderer.EnqueuePass(m_RenderPass);
}

public override void SetupRenderPasses(ScriptableRenderer renderer, in RenderingData renderingData)
{
    if (renderingData.cameraData.cameraType == CameraType.Game)
    {
        // Calling ConfigureInput with the ScriptableRenderPassInput.Color argument
        // ensures that the opaque texture is available to the render pass
        m_RenderPass.ConfigureInput(ScriptableRenderPassInput.Color);
        m_RenderPass.SetTarget(renderer.cameraColorTarget, m_Intensity);
    }
}
10 Likes

Thank you so much for the transparency and all the information! It’s exciting to see URP going through such big changes in the last two major releases.

I look forward to seeing Core Blitter integrated with URP passes. Expanding support for NativeRenderPass and RenderGraph, as well as the shift to RTHandles, seems like it might make writing our own passes much easier.

I’m not convinced this will make actual usage simpler. Unity could provide more helper functions to reduce the current boilerplate, sure, but the changes so far haven’t reduced much of our own code complexity; it’s just structured in a slightly different way.

IMHO, the biggest thing to take the pain away from engine users would be to give more varied examples of how to use this. That XR doc page is a good start, but it doesn’t explain everything. It would be nice to have a set of renderer feature samples as part of the URP examples, specifically ones that handle more intermediate processing, which you typically see in more advanced effects.

I’m also personally hoping that the RG change will happen sooner rather than later, so we can finally get closer to a more mature code base with URP.

3 Likes

I was looking at this page from the PR for the doc update:
https://github.com/Unity-Technologies/Graphics/blob/urp/blit_doc_update/com.unity.render-pipelines.universal/Documentation~/renderer-features/blit-best-practices.md
I’m actually now more confused than enlightened. According to your comment here and what was written on that PR’s page, the SRP Blitter API would be the way to go, yet the only practical example is for cmd.DrawMesh.

Why even have a cmd.DrawMesh example if we are not supposed to use it? Why not have an example for the recommended route?

Where is it recommended not to use cmd.DrawMesh? I thought the recommendation was not to use cmd.Blit() or any functions that wrap or rely on it.

1 Like

They instead recommended the SRP Blitter:

I just realized that the SRP Blitter uses RTHandles, so it wouldn’t even have worked prior to 2022.1’s URP, so I guess this all makes more sense. My assumption is that once this version matures further, the SRP Blitter will get adopted more widely in URP (and I do get that the mentioned doc page is not something actually in use yet).

I would expect the blitting example to use the Blitter API if it’s being recommended, but I get these things don’t happen overnight :)

On top of the RTHandles availability issue, which makes the SRP Blitter unusable on versions earlier than 22.1, there are still scenarios where you might be better off writing your “custom blit” using DrawMesh, as the how-to page shows.

The SRP Blitter will be improved in the future to facilitate URP adoption, but for now, for example, you are unable to explicitly set load/store actions if you are optimizing for mobile.
With DrawMesh you can also easily write generic full-screen quad renderers, which don’t necessarily need to blit a source texture to a destination one: a very common pattern currently for post-processing effects doing similar work (e.g. ColorGradingLUT) is to call cmd.Blit(null, RT). This is not the best use of blit, and it works much better as a simple draw to a full-screen quad mesh.
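The cmd.Blit(null, RT) replacement described above can be sketched like this, reusing the fullscreen-mesh pattern quoted earlier in the thread (lutTarget and lutMaterial are hypothetical names for illustration):

//Script (C#) - rough sketch of generating into an RT with no source texture
cmd.SetRenderTarget(lutTarget, RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store);
cmd.SetViewProjectionMatrices(Matrix4x4.identity, Matrix4x4.identity);
cmd.DrawMesh(RenderingUtils.fullscreenMesh, Matrix4x4.identity, lutMaterial, 0, 0);
cmd.SetViewProjectionMatrices(camera.worldToCameraMatrix, camera.projectionMatrix);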

Other examples of useful custom blit implementations include something like the URP CopyDepth pass, where you need to add some extra MSAA resolve logic so your shader can use Tex2DMS samples depending on whether the source texture is MSAA’d or not.

The SRP Blitter will be extended and improved to cover more use cases over time, but DrawMesh will always give you full flexibility and customization if needed.

TL;DR: use the SRP Blitter if what you need is available and implemented in that API, use the DrawMesh approach for any other cases, and avoid cmd.Blit() :)

5 Likes

Forgot to add: we plan to add more samples using the Blitter once we start converting all URP passes to use it.

5 Likes

My game absolutely relies on Graphics.Blit, which works in 2021.1.28f1 and URP 11.0.0
I am blitting a Material to a Texture (not in a pass!!! Just to bake a VERY heavy dynamic material to a texture ONCE and then GetPixels() to an array), that’s it.

Graphics.Blit(null, renderTexture, material); no longer works.

What do I use now?

1 Like

I’m using DrawProcedural for blit, with GetFullScreenTriangleTexCoord and GetFullScreenTriangleVertexPosition.
If you blit with DrawProcedural, you must cancel the flip implied by UNITY_UV_STARTS_AT_TOP depending on whether you blit from RT to RT, but that’s it.
The flip for a blit with DrawMesh looks more complicated; you must consider the current camera matrix…

I know it’s a lot to ask but do you have a code snippet?

//Script
// Bind the source texture, set the destination, and draw a single fullscreen triangle (3 verts, 1 instance)
_BlitMaterial.SetTexture(Shader.PropertyToID("_BlitSrcTex"), _SrcTexture);
_CmdBuffer.SetRenderTarget(new RenderTargetIdentifier(_DstTexture), RenderBufferLoadAction.DontCare, RenderBufferStoreAction.Store);
_CmdBuffer.DrawProcedural(Matrix4x4.identity, _BlitMaterial, 0, MeshTopology.Triangles, 3, 1, null);

//VertexShader
output.positionCS = GetFullScreenTriangleVertexPosition(input.vertexID);
float2 uv = GetFullScreenTriangleTexCoord(input.vertexID);
#if defined _CANCEL_FLIP
uv.y = 1.0 - uv.y;
#endif
output.xyAndUv = uv.xyxy * float4(DST_TEXTURE_SIZE, 1.0, 1.0);

//FragmentShader
float4 color = LOAD_TEXTURE2D(_BlitSrcTex, input.xyAndUv.xy);
// ..and/or..
float4 color = SAMPLE_TEXTURE2D_LOD(_BlitSrcTex, s_linear_clamp_sampler, input.xyAndUv.zw, 0.0);

And I set the _CANCEL_FLIP flag manually from script if the source texture is a regular texture (not a RenderTexture) or the destination is a backbuffer, when running on a non-OpenGL API.
Sample works just fine, but Load is useful in some cases, like resolving an MSAA buffer.

URP has good samples using a DrawProcedural quad for blit: script side, shader side.
HDRP has good samples using a DrawProcedural triangle for blit: script side, shader side.

2 Likes

I’m afraid this is out of my depth. I am using Shader Graph to run millions of distance calculations to fill a voxel array. It used to be that I just did Graphics.Blit(null, renderTexture, material), then ReadPixels(), then GetPixels() to fill the array. Doing these millions of calculations linearly on the CPU instead (with all of the other calcs that go into it; this is simplified) takes 8–20 seconds depending on the volume. A 3-million volume on the GPU takes 0.8s.

I do not have a source texture; I just want to copy the current state of the procedural material to a Texture2D. It was as easy as Blit(null, renderTexture, material), done. Now I have no idea what to do with the above code, as it won’t do anything without a source texture. Also, I’m using Shader Graph, and the output shader code is thousands of lines long (as I said, the explanation above is simplified; there’s a LOT that goes into determining the values of the distance calculation).

The link to the blit best practices page is missing; do you know which branch/commit it can be found at? I am trying to find out how to correctly use the Blitter API and don’t understand it yet. I corrected the example to work with DrawMesh and sent it to the Unity team via the feedback form, but I can’t see any example of Blitter usage.

There is just an outdated example with XR and a fullscreen-rect cmd.DrawMesh in the official documentation:
https://docs.unity3d.com/Packages/com.unity.render-pipelines.universal@13.1/manual/how-to.html?q=blit

1 Like

@ManueleB Sorry for bumping, but could you please point me in the right direction to find any examples of Blitter API usage? Does Unity have any public repository with examples, or maybe a special branch inside the Graphics repo?