This is the code in the pass I use to copy the textures:
Pass //2 BLIT BACKGROUND
{
Name "ColorBlitPass"
HLSLPROGRAM
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
//#include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
#include "BlitTAA.hlsl"//v0.2
#pragma vertex Vert
#pragma fragment Frag
//
float4 Frag(VaryingsB input) : SV_Target0
{
// this is needed so we account for XR platform differences in how they handle texture arrays
UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
// sample the texture using the SAMPLE_TEXTURE2D_X_LOD
float2 uv = input.texcoord.xy;
half4 color = SAMPLE_TEXTURE2D_X_LOD(_BlitTexture, sampler_LinearRepeat, uv, _BlitMipLevel);
// Inverts the sampled color
//return half4(1, 1, 1, 1) - color;
return color;
}
ENDHLSL
}
Note that the code below worked in the pre-RenderGraph system.
It actually does work, but there are a few cases where it does not, or where I need to control the event it is injected into. The issue is the jitter, which may conflict with other per-module temporal AA implementations.
The Temporal AA I use is also much more stable than STP, so it is more suitable; it is the same as Unity's camera-level Temporal AA, and a little better.
This is great to hear! Thanks for keeping at it and for the feedback. It is encouraging that this thread could help and that the transition has had positive results.
I have to say I was surprised by how it all worked directly once I realized what I had to do to incorporate the 3D texture rendering into the RenderGraph. It is really very powerful, and much easier as well because there is no need to handle the texture clearing.
And it is just as performant or better now that all the work is polished, which is amazing.
Thanks again for the guidance on this; it really helped a lot in finding the right way to address the issue.
I'm trying to understand how RenderGraph works by reading code from the URP Samples (like the DLC in packages), and something weird happened.
In the RenderGraph sample FramebufferFetch, which shows how to get the active color texture's B channel and copy it back (to introduce how fetch works), the first step works well, but in the copy pass, _UnityFBInput0 seems to be changed to the depth texture instead of our result…
I tried changing the copy pass to a blit pass; apart from losing fetch, it works fine. (It would be better to write another fetch pass like the sample does, but isn't that what the copy pass does?)
The same problem occurred with the sample CopyRenderFeature, which uses AddCopyPass: sometimes it changes to depth, sometimes it works well…
My version is 6000.0.23 and I use the URP template project.
How does the copy pass actually work? Is the depth texture an error fallback?
It's really frustrating to have an error in the sample (because I kept wondering if I had gotten something wrong until I found out why).
What graphics API are you using? There is currently an issue with the framebuffer fetch fallback on DirectX. On graphics APIs where FBF is not natively supported, we automatically handle it using a regular texture sample, but there are some issues with that currently. Does it work if you switch to Vulkan? The fix should land this week and can be available in a patch release in a few weeks.
I'm using the default DX11, Unity 6000.0.23, URP 17.0.3, the URP template project, the PC RP Asset, and an RTX 2060. The code is FrameBufferFetchRenderFeature in the RenderGraph folder of the URP Samples.
After taking your advice I switched graphics APIs; DX12 and Vulkan are wrong as well, so it doesn't look like the graphics API is causing the problem.
I tried to tweak some settings and found some issues:
The first is that in the RP Asset, if you set Opaque Downsampling to None (or just turn off the Opaque Texture setting), the effect is correct.
Then I tried to modify the injection queue for the pass, whose original event was BeforeRenderingTransparents (450 in the enum).
Environment: DX11, DX12, and Vulkan cause the same error (not you guys' fault lol), PC RP Asset.
First, the fallback with the red depth is not affected by the pipeline turning off the Depth Texture.
At 300–399, regardless of whether Opaque Texture is on or not, the Scene view is correct, and the Game view (MainCamera) still shows depth.
At 450–499, the effect is correct with Opaque Texture off, but wrong when it is on, in both the Scene and Game views.
It's working fine in the other ranges.
I noticed that when I turn on Opaque Texture there is an extra CopyColor pass; is the problem related to this pass? If it's not related to the graphics API, could it be a problem somewhere in the code? Hope this helps.
Anyway, I can get it working correctly now, thanks for the suggestion~
Ah, I missed that you were using a depth texture. Framebuffer fetch (which is used in the copy pass) doesn't work with a depth format. In most cases the depth copy is a color format, but not always (for example when using a prepass). You need to use a blit pass instead.
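For anyone hitting the same thing: a blit-based round trip can be written against the RenderGraph helpers. This is a minimal sketch, assuming a Unity 6 URP ScriptableRenderPass; the pass name and `_TempCopy` texture name are placeholders, not code from the samples:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.RenderGraphModule.Util;
using UnityEngine.Rendering.Universal;

// Hypothetical pass that copies the active color target through an
// intermediate texture using blits instead of the copy pass / FBF path.
class BlitCopyPass : ScriptableRenderPass
{
    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        var resourceData = frameData.Get<UniversalResourceData>();
        TextureHandle source = resourceData.activeColorTexture;

        // Create an intermediate texture matching the camera color target.
        var desc = renderGraph.GetTextureDesc(source);
        desc.name = "_TempCopy";
        desc.clearBuffer = false;
        TextureHandle temp = renderGraph.CreateTexture(desc);

        // Color -> temp, then temp -> color. AddBlitPass does a regular
        // texture sample, so it never hits the framebuffer-fetch fallback.
        renderGraph.AddBlitPass(source, temp, Vector2.one, Vector2.zero, passName: "Copy To Temp");
        renderGraph.AddBlitPass(temp, source, Vector2.one, Vector2.zero, passName: "Copy Back");
    }
}
```

The trade-off, as noted above, is losing the single-pass framebuffer fetch; this version always goes through an extra texture.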
Nah, I think there may be some misunderstanding.
After I saw the depth result appearing, I tried to find out whether depth was used anywhere in the C# render feature code or the shader (there is actually no code for it). I turned Depth Texture off for the pipeline and checked everywhere, including the camera settings.
In my environment, the CopyRenderFeature in the samples (which also uses the copy pass) has the same problem. Going from the active color to a new texture is no problem, but in reverse it's replaced with the depth texture (DX11; you can see it below, still without any depth setting). Switching to DX12 and Vulkan shows that the buffer is replaced with Default2D, but the result is exactly the same as DX11 (a weird depth again).
Even on another PC (not mine XD) with version 6000.0.23, where the RP Asset's Opaque Texture and Depth Texture are not set, just mounting the default post-processing template (the one that inverts the colors) triggers the CopyRenderFeature bug!
I'm guessing this bug might be influenced by the passes before and after it. It shouldn't be so unstable or subject to so many limitations; the copy pass is supposed to be a simple, easy-to-understand, good-to-use pass.
But this is not a problem that can't be solved; the copy pass's problems can be worked around with other passes. I hope this gives you some help!
It appears to work by injecting a CommandBuffer and executing it within the ScriptableRenderContext, which we don't get access to anymore. Is there a simple way to inject and execute a CommandBuffer with the new Render Graph API?
using UnityEngine;
using UnityEngine.Rendering;
#if HAS_URP
using UnityEngine.Rendering.Universal;
#endif
namespace UImGui.Renderer
{
#if HAS_URP
public class RenderImGui : ScriptableRendererFeature
{
private class CommandBufferPass : ScriptableRenderPass
{
public CommandBuffer commandBuffer;
public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
{
context.ExecuteCommandBuffer(commandBuffer);
}
}
[HideInInspector]
public Camera Camera;
public CommandBuffer CommandBuffer;
public RenderPassEvent RenderPassEvent = RenderPassEvent.AfterRenderingPostProcessing;
private CommandBufferPass _commandBufferPass;
public override void Create()
{
_commandBufferPass = new CommandBufferPass()
{
commandBuffer = CommandBuffer,
renderPassEvent = RenderPassEvent,
};
}
public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
{
if (CommandBuffer == null) return;
if (Camera != renderingData.cameraData.camera) return;
_commandBufferPass.renderPassEvent = RenderPassEvent;
_commandBufferPass.commandBuffer = CommandBuffer;
renderer.EnqueuePass(_commandBufferPass);
}
}
#else
public class RenderImGui : UnityEngine.ScriptableObject
{
public CommandBuffer CommandBuffer;
}
#endif
}
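For what it's worth, under RenderGraph one way to get at a native CommandBuffer is an unsafe pass: the unsafe context's buffer can be unwrapped with CommandBufferHelpers.GetNativeCommandBuffer and recorded into directly. A minimal sketch, assuming Unity 6 URP; the pass class, pass name, and `record` callback are placeholders, not the actual UImGui code:

```csharp
using UnityEngine.Rendering;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.Universal;

// Hypothetical RenderGraph-era replacement for CommandBufferPass:
// records commands into the graph's native CommandBuffer each frame.
class UnsafeCommandPass : ScriptableRenderPass
{
    // Callback that records the per-frame commands (e.g. the ImGui draw).
    public System.Action<CommandBuffer> record;

    private class PassData
    {
        public System.Action<CommandBuffer> record;
    }

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        using (var builder = renderGraph.AddUnsafePass<PassData>("Execute Commands", out var passData))
        {
            passData.record = record;

            // No graph resources are declared, so prevent the pass being culled.
            builder.AllowPassCulling(false);

            builder.SetRenderFunc((PassData data, UnsafeGraphContext context) =>
            {
                // Unwrap the native CommandBuffer and record into it directly.
                // Note there is no API to replay one pre-recorded CommandBuffer
                // into another, so recording here replaces the old injection model.
                CommandBuffer cmd = CommandBufferHelpers.GetNativeCommandBuffer(context.cmd);
                data.record?.Invoke(cmd);
            });
        }
    }
}
```

The design change to note: instead of handing the feature a finished CommandBuffer, the caller hands it a recording callback, since the graph owns the buffer that actually executes.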
I get an issue when I first load Unity; it then goes away after showing two errors.
This error can lead to a crash in some cases if I don't load a new scene to reset it, so
it can be a bit critical, even though it generally does not affect anything afterwards if handled properly.
In another project this leads to a new scene that has no skybox and only shows a blurred background;
that is the case where, if I load a pipeline etc., it crashes.
Indeed, or just starting a new scene before doing any other action solves it. But since it can lead to crashes, it would be great if Unity handled it so it doesn't happen at all.
I'm porting some old render passes to the new RenderGraph API.
I have a render pass that sets the camera projection matrices by calling RenderingUtils.SetViewAndProjectionMatrices(CommandBuffer), then draws some meshes by calling CommandBuffer.DrawMesh, using Unityâs default materials (Lit, SimpleLit, Unlit).
I see that all samples use raster render passes (RenderGraph.AddRasterRenderPass) instead of regular passes (RenderGraph.AddRenderPass). I couldn't find any specific info about the difference between the two, but ChatGPT says that AddRasterRenderPass is the preferred and optimal way to do regular drawing.
But a raster render pass uses RasterCommandBuffer instead of a regular CommandBuffer. RenderingUtils.SetViewAndProjectionMatrices(RasterCommandBuffer) exists, but it is inaccessible.
The ShaderPropertyId it uses is also inaccessible. I can clone the implementation of that function, hardcoding the property IDs, but I fear it is prone to breaking silently when something changes in Unity's rendering pipeline.
I wonder what the best way forward is. Should I use a regular render pass, or reimplement RenderingUtils.SetViewAndProjectionMatrices(RasterCommandBuffer) myself? Maybe Unity should make that method public.
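For reference, a hand-rolled equivalent might look like the sketch below. This is a guess at the helper's behavior, not the actual URP implementation; the property names are Unity's standard built-in shader variable names, but verify them against your Unity version before relying on this:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

static class MatrixUtils
{
    // Built-in shader property names for the view/projection globals
    // (assumed to match what RenderingUtils sets internally).
    static readonly int s_ViewMatrixId = Shader.PropertyToID("unity_MatrixV");
    static readonly int s_ProjectionMatrixId = Shader.PropertyToID("glstate_matrix_projection");
    static readonly int s_ViewAndProjectionMatrixId = Shader.PropertyToID("unity_MatrixVP");

    // Approximation of RenderingUtils.SetViewAndProjectionMatrices for a
    // RasterCommandBuffer: sets the global matrices Unity shaders consume.
    public static void SetViewAndProjectionMatrices(RasterCommandBuffer cmd, Matrix4x4 view, Matrix4x4 proj)
    {
        cmd.SetGlobalMatrix(s_ViewMatrixId, view);
        cmd.SetGlobalMatrix(s_ProjectionMatrixId, proj);
        cmd.SetGlobalMatrix(s_ViewAndProjectionMatrixId, proj * view);
    }
}
```

As the fear above says, hardcoding these names couples the code to internals that Unity is free to change between versions.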
TL;DR: We made RenderingUtils.SetViewAndProjectionMatrices(RasterCommandBuffer) accessible in 6000.0.30f1.
Regarding your question about the API: RenderGraph.AddRenderPass is an older API only used internally by HDRP; it is not supported in URP and shouldn't be used in any URP ScriptableRendererFeature.
But I understand the confusion; we are planning to do a pass on the API documentation to make it more explicit. For now, just focus on RenderGraph.AddRasterRenderPass/AddComputePass/AddUnsafePass and the helpers RenderGraphUtils.AddBlitPass/AddCopyPass.
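For anyone landing here, a minimal pass under the recommended AddRasterRenderPass API looks roughly like this. A sketch assuming Unity 6 URP; the pass name, PassData layout, and fullscreen-triangle draw are illustrative choices, not the only option:

```csharp
using UnityEngine;
using UnityEngine.Rendering;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering.Universal;

// Hypothetical minimal raster pass drawing a fullscreen triangle
// into the camera's active color target.
class ExampleRasterPass : ScriptableRenderPass
{
    public Material material;

    private class PassData
    {
        public Material material;
    }

    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        var resourceData = frameData.Get<UniversalResourceData>();

        using (var builder = renderGraph.AddRasterRenderPass<PassData>("Example Raster Pass", out var passData))
        {
            passData.material = material;

            // Declare the color attachment this pass renders into; the graph
            // uses these declarations to schedule and merge passes.
            builder.SetRenderAttachment(resourceData.activeColorTexture, 0);

            builder.SetRenderFunc((PassData data, RasterGraphContext context) =>
            {
                // context.cmd is a RasterCommandBuffer: only rasterization
                // work is allowed here (no dispatches, no resource copies).
                context.cmd.DrawProcedural(Matrix4x4.identity, data.material, 0, MeshTopology.Triangles, 3);
            });
        }
    }
}
```

The RasterCommandBuffer restriction is the point of the split: compute work goes in AddComputePass, and anything that needs a full native CommandBuffer goes in AddUnsafePass.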