Introduction of Render Graph in the Universal Render Pipeline (URP)

Hi, thanks a lot for the insight

I tried the change but unfortunately did not get the same result.

This is with the temporal AA as I had it; it hides the artifacts correctly.

These are the artifacts without using the temporal AA.

This is what I get if I don’t do the 3 blits and do the direct assignment instead.

This is the code in the pass I use to copy the textures:

    Pass // 2: BLIT BACKGROUND
    {
        Name "ColorBlitPass"
        HLSLPROGRAM
        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
        #include "BlitTAA.hlsl" // v0.2
        #pragma vertex Vert
        #pragma fragment Frag

        float4 Frag(VaryingsB input) : SV_Target0
        {
            // Needed to account for XR platform differences in how texture arrays are handled.
            UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
            // Sample the texture using SAMPLE_TEXTURE2D_X_LOD.
            float2 uv = input.texcoord.xy;
            half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);
            return color;
        }
        ENDHLSL
    }

Note that the code below worked in the pre-RenderGraph system:

//Ping pong
RenderTexture temp2 = temp;
temp = temp1;
temp1 = temp2;

Hm, doesn’t the native TAA or STP help here?

It actually does, but there are a few cases where it does not work, or where I need to control the event it is injected at. The issue is the jitter, which may conflict with other per-module temporal AA implementations.

The temporal AA I use is also much more stable than STP, so it is more suitable; it is the same as Unity’s camera temporal AA, and a little better.

After some testing I realized that the system works even if I use only the blit below, instead of all three.

I am not sure why, as it deviates from the non-RenderGraph version, but it seems to work and largely resolves the performance loss as well.

    passName = "SAVE TEMP";
    using (var builder = renderGraph.AddRasterRenderPass<PassData>(passName, out var passData))
    {
        passData.src = resourceData.activeColorTexture; // SOURCE TEXTURE
        desc.msaaSamples = 1;
        desc.depthBufferBits = 0;
        builder.SetRenderAttachment(_handleTAA, 0, AccessFlags.Write);
        passData.BlitMaterial = m_BlitMaterial;
        builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
            ExecuteBlitPass(data, context, 1, passData.src));
    }

I have now managed to get it all working fully natively in RenderGraph, and it seems it can be super fast after all the fixes :slight_smile:

Many thanks for the help and insight on the random write methods through RenderGraph; it is very appreciated.

6 Likes

This is great to hear! Thanks for keeping at it and for the feedback. It is encouraging that this thread could help and that the transition has had positive results.

Have a good week everyone!

1 Like

Congratulations @nasos_333! Glad to hear that you managed to make it work in the end :slight_smile:

1 Like

I have to say I was surprised by how it all worked directly once I realized what had to be done to incorporate the 3D texture rendering into the RenderGraph. It is really very powerful, and much easier as well, because there is no need to handle the texture clearing.

And it is just as performant, or more, now that all the work is refined, which is amazing :slight_smile:

Thanks again for the guidance on this; it really helped a lot in finding the right way to address the issue :slight_smile:

5 Likes

I’m trying to understand how RenderGraph works by reading code from the URP samples (like the DLC in packages), and something weird happened.
In the RenderGraph sample FrameBufferFetch, which shows how to get the B channel of activeColorTexture and copy it back, to introduce how fetch works:
the first step works well, but in the copy pass the _UnityFBInput0 seems to change to the depth texture rather than our result…
I tried changing the copy pass to a blit pass and, apart from losing the fetch, it works fine. (It would be better to write another fetch pass like the sample does, but isn’t that what the copy pass already does?)
The same problem occurred with the sample CopyRenderFeature, which uses AddCopyPass; sometimes it changes to depth, sometimes it works well…
My version is 6000.0.23 and I use the URP template project.


How does the copy pass actually work? Is the depth texture an error fallback?
It’s really frustrating to have an error in the sample (because I kept wondering whether I had done something wrong until I found out why).

What graphics API are you using? There is currently an issue with the framebuffer fetch fallback on DirectX. On graphics APIs where FBF is not natively supported, we automatically handle it with a regular texture sample, but there are currently some issues with that. Does it work if you switch to Vulkan? The fix should land this week and should be available in a patch release in a few weeks.

I’m using the default DX11, Unity 6000.0.23, URP 17.0.3, the URP template project, the PC RP asset, and an RTX 2060. The code is FrameBufferFetchRenderFeature in the RenderGraph folder of the URP samples.

After taking your advice I switched the graphics API to DX12 and Vulkan, which are also wrong, so it doesn’t look like the graphics API is causing the problem.
I tried to tweak some settings and found some issues:
First, in the RP asset, if you set Opaque Downsampling to None (or just turn off the Opaque Texture setting), the effect is correct.
Then I tried to modify the injection event for the pass, which in the original code was BeforeRenderingTransparents (450 in the enum).
Environment: DX11, DX12, and Vulkan cause the same error (not your guys’ fault lol), PC RP asset.

First, the fallback with the red depth is not affected by turning the pipeline’s Depth Texture off.
At 300–399, regardless of whether Opaque Texture is on or not, the Scene view is correct, while the Game view (MainCamera) still shows depth.
At 450–499, the effect is correct with Opaque Texture off, but wrong when it is on, in both the Scene and Game views.
It works fine in the other ranges.

I noticed that when I turn on Opaque Texture there is an extra Copy Color pass; is the problem related to that pass? If it’s not related to the graphics API, could it be a problem somewhere in the code? Hope this helps.
Anyway, I can get it working correctly now, thanks for the suggestion~ :slight_smile:

Ah, I missed that you were using a depth texture. Framebuffer fetch (which is used in the copy pass) doesn’t work with a depth format. In most cases the depth copy is a color format, but not always (for example when using a prepass). You need to use a blit pass instead.
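For reference, a plain blit through the RenderGraphUtils helper avoids framebuffer fetch entirely. A minimal sketch, assuming Unity 6 / URP 17 namespaces; the class name, texture name, and pass name are my own placeholders, not part of the samples:

```csharp
using UnityEngine;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.RenderGraphModule.Util;
using UnityEngine.Rendering.Universal;

// Hypothetical pass that copies the active color target with a blit
// instead of the framebuffer-fetch based copy pass.
class CopyColorBlitPass : ScriptableRenderPass
{
    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        var resourceData = frameData.Get<UniversalResourceData>();

        // Destination matching the camera color target (color format, no depth).
        var desc = renderGraph.GetTextureDesc(resourceData.activeColorTexture);
        desc.name = "_CopyOfColor"; // placeholder name
        desc.clearBuffer = false;
        TextureHandle destination = renderGraph.CreateTexture(desc);

        // A regular blit: works regardless of native FBF support.
        renderGraph.AddBlitPass(resourceData.activeColorTexture, destination,
            Vector2.one, Vector2.zero, passName: "Copy Color (Blit)");
    }
}
```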

Nuh, I think there may be some misunderstanding.
After I saw the depth result appearing, I tried to find out whether depth was used anywhere in the C# renderer feature code or the shader (there is actually no code for it). I turned Depth Texture off for the pipeline and checked everywhere, including the camera settings.
In my environment the CopyRenderFeature in the samples (which also uses the copy pass) has the same problem. Going from the active color to a new texture is no problem, but in reverse it is replaced with the depth texture (on DX11 you can see it below, still with no depth setting anywhere). Switching to DX12 and Vulkan shows the buffer being replaced with Default2D, but the result is exactly the same as on DX11 (a weird depth :scream:++).
Even on another PC (not mine XD) with version 6000.0.23, where the RP asset’s Opaque Texture and Depth Texture are not set, just mounting the default post-processing template (the one that inverts the colors) triggers the CopyRenderFeature bug!
I’m guessing this bug might be influenced by the passes before and after it; it shouldn’t be so unstable or subject to so many kinds of limitations. The copy pass is supposed to be a simple, easy to understand, and good to use pass.
But this is not an unsolvable problem; the copy pass can be replaced with other passes. I hope this gives you some help!

I am attempting to update this IMGUI Renderer Feature for the Render Graph: uimgui/Source/Renderer/RenderImGui.cs at main ¡ psydack/uimgui ¡ GitHub

It appears to work by injecting a CommandBuffer and executing it within the ScriptableRenderContext, which we don’t get access to anymore. Is there a simple way to inject and execute a CommandBuffer with the new Render Graph API?

using UnityEngine.Rendering;
#if HAS_URP
using UnityEngine.Rendering.Universal;
using UnityEngine;
#endif

namespace UImGui.Renderer
{
#if HAS_URP
	public class RenderImGui : ScriptableRendererFeature
	{
		private class CommandBufferPass : ScriptableRenderPass
		{
			public CommandBuffer commandBuffer;

			public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
			{
				context.ExecuteCommandBuffer(commandBuffer);
			}
		}

		[HideInInspector]
		public Camera Camera;
		public CommandBuffer CommandBuffer;
		public RenderPassEvent RenderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;

		private CommandBufferPass _commandBufferPass;

		public override void Create()
		{
			_commandBufferPass = new CommandBufferPass()
			{
				commandBuffer = CommandBuffer,
				renderPassEvent = RenderPassEvent,
			};
		}

		public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
		{
			if (CommandBuffer == null) return;
			if (Camera != renderingData.cameraData.camera) return;

			_commandBufferPass.renderPassEvent = RenderPassEvent;
			_commandBufferPass.commandBuffer = CommandBuffer;

			renderer.EnqueuePass(_commandBufferPass);
		}
	}
#else
	public class RenderImGui : UnityEngine.ScriptableObject
	{
		public CommandBuffer CommandBuffer;
	}
#endif
}
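For what it’s worth, the closest Render Graph equivalent appears to be an unsafe pass, which exposes a native CommandBuffer inside its render function. A prerecorded CommandBuffer cannot be nested into it, so the recording code would need to target the pass’s own buffer instead. A minimal sketch under those assumptions (the class and pass names are my own, not part of uimgui):

```csharp
using UnityEngine.Rendering;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.Universal;

// Hypothetical replacement pass: instead of replaying a prerecorded
// CommandBuffer, the commands are issued directly in the render function
// through the unsafe pass's buffer.
class UnsafeImGuiPass : ScriptableRenderPass
{
    class PassData { }

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        using (var builder = renderGraph.AddUnsafePass<PassData>("IMGUI", out var passData))
        {
            // The graph cannot analyze what the pass touches, so keep it alive.
            builder.AllowPassCulling(false);
            builder.SetRenderFunc((PassData data, UnsafeGraphContext context) =>
            {
                // GetNativeCommandBuffer exposes the regular CommandBuffer API,
                // so existing recording code can be pointed at it.
                CommandBuffer cmd = CommandBufferHelpers.GetNativeCommandBuffer(context.cmd);
                // ... record the IMGUI draw commands into cmd here ...
            });
        }
    }
}
```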

Hi,

I get an issue when Unity first loads, which then goes away after showing two errors.
This error can lead to a crash in some cases if I don’t load a new scene to reset it, so it can be a bit critical, even though it generally doesn’t affect anything afterwards if handled properly.

In another project this leads to a new scene that has no skybox and only shows a blurred background; this is the case where, if I then load a pipeline etc., it crashes.

Thanks

A possible solution: recreate the scene’s camera.

Indeed, or just starting a new scene before doing anything else solves it. But since it can lead to crashes, it would be great if Unity handled this so it does not happen at all.

As far as I know, issues like this can come from upgrading regularly. For me it happened especially after major upgrades.

1 Like

Hi.

I’m porting some old render passes to the new RenderGraph API.
I have a render pass that sets the camera projection matrices by calling RenderingUtils.SetViewAndProjectionMatrices(CommandBuffer), then draws some meshes by calling CommandBuffer.DrawMesh, using Unity’s default materials (Lit, SimpleLit, Unlit).

I see that all samples use raster render passes (RenderGraph.AddRasterRenderPass) instead of regular passes (RenderGraph.AddRenderPass). I couldn’t find any specific info about the difference between the two, but ChatGPT says that AddRasterRenderPass is the preferred and optimal way for regular drawing.

But a raster render pass uses a RasterCommandBuffer instead of a regular CommandBuffer. A RenderingUtils.SetViewAndProjectionMatrices(RasterCommandBuffer) overload exists, but it is inaccessible.
This is what it does:

            Matrix4x4 viewAndProjectionMatrix = projectionMatrix * viewMatrix;
            cmd.SetGlobalMatrix(ShaderPropertyId.viewMatrix, viewMatrix);
            cmd.SetGlobalMatrix(ShaderPropertyId.projectionMatrix, projectionMatrix);
            cmd.SetGlobalMatrix(ShaderPropertyId.viewAndProjectionMatrix, viewAndProjectionMatrix);

ShaderPropertyId is also inaccessible. I can clone the implementation of that function, hardcoding the property IDs, but I fear it is prone to breaking silently when something changes in Unity’s rendering pipeline.

I wonder what the best way forward is. Should I use a regular render pass, or reimplement RenderingUtils.SetViewAndProjectionMatrices(RasterCommandBuffer) myself? Maybe Unity should make that method public.
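For illustration, cloning the overload might look like the sketch below. The hardcoded shader property names are an assumption based on URP internals at the time of writing; they are not a public contract and could change in a future version, which is exactly the silent-breakage risk mentioned above:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical clone of the inaccessible overload, with the property
// names hardcoded (assumed to match URP's internal ShaderPropertyId).
static class MatrixUtils
{
    static readonly int s_ViewMatrix = Shader.PropertyToID("unity_MatrixV");
    static readonly int s_ProjectionMatrix = Shader.PropertyToID("glstate_matrix_projection");
    static readonly int s_ViewAndProjectionMatrix = Shader.PropertyToID("unity_MatrixVP");

    public static void SetViewAndProjectionMatrices(RasterCommandBuffer cmd,
        Matrix4x4 viewMatrix, Matrix4x4 projectionMatrix)
    {
        Matrix4x4 viewAndProjectionMatrix = projectionMatrix * viewMatrix;
        cmd.SetGlobalMatrix(s_ViewMatrix, viewMatrix);
        cmd.SetGlobalMatrix(s_ProjectionMatrix, projectionMatrix);
        cmd.SetGlobalMatrix(s_ViewAndProjectionMatrix, viewAndProjectionMatrix);
    }
}
```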

Hey @unity_9FD771E615CEC8514F84,

Can you have a look at this discussion: URP RenderGraph RenderPass - Render from Different Position/View Matrix - #6 by AMoulin.

TL;DR: We made RenderingUtils.SetViewAndProjectionMatrices(RasterCommandBuffer) accessible in 6000.0.30f1.

Regarding your question about the API: RenderGraph.AddRenderPass is an older API used only internally by HDRP; it is not supported in URP and shouldn’t be used in any URP ScriptableRendererFeature.

But I understand the confusion; we are planning to do a pass on the API documentation to make it more explicit. For now, just focus on RenderGraph.AddRasterRenderPass/AddComputePass/AddUnsafePass and the helpers RenderGraphUtils.AddBlitPass/AddCopyPass.

2 Likes