Support custom post-processing for HDRP + URP with Shader Graph

Hey,

Do you plan to support custom post-processing for HDRP and URP with Shader Graph? Amplify Shader Editor is planning to do so.

Custom post-processing is a big deal for HDRP. Since you don’t have custom lighting in it, a very good way to create advanced stylised aesthetics is to use a custom post process, exactly like you would in Unreal Engine:

Video

I mean, there is huge potential with HDRP to create more stylized stuff. Just take a look at what this person made with UE4’s visual scripting post-processing graph:

Thanks :slight_smile:


Oh well, it’s on the roadmap, which is cool :slight_smile: https://portal.productboard.com/8ufdwj59ehtmsvxenjumxo82/tabs/7-shader-graph

Could someone confirm this will be for both HDRP and URP, though?

I feel you

Custom post-processing is possible in Shader Graph in both HDRP and URP. Here’s a series of videos that I created showing how to do it:


@BenCloward I have a question about this (just started watching). In 2023.3 we have WebGPU support, which doesn’t have synchronous readback (i.e. no ReadPixels). In your video you have one pass writing and one pass reading; is this doing the same thing behind the scenes, or is it unrelated? In other words, will this still work in WebGPU? Thanks! :slight_smile:

Okay, so the approach in 2023.3/Unity 6 is a bit different, but I got it working (using URP; I haven’t tried this with any other render pipeline) and here’s how. This is intended as supplementary info to Ben’s video above, because everything except a few key points is the same.

Firstly, it’s now done as a Render Feature. So you open the Inspector for your Renderer and create a new Full Screen Pass. Make sure it’s the renderer you’re actually using, because projects by default have multiple renderers included in the Quality settings. Either add your Render Feature to all three renderers, or, to simplify a test project, go to Project Settings → Quality, delete all levels except Balanced by clicking the trash-can icon, and then add your Render Feature in the Inspector of the “URP-Balanced-Renderer” asset in the Project window.
[Image: adding_the_render_feature.png]

Then, if you hover over “Fetch Color Buffer”, the tooltip tells you that the result will be in _BlitTexture:
[Image: fetch.png]

In your Shader Graph, add a Texture 2D property, call it _BlitTexture, and set its Scope to Global (see the top right of the image below). Sample this instead of the “HD Scene Color” shown in the video, and you’re set!
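In case it helps anyone, here’s a rough HLSL sketch of what that Global _BlitTexture sample boils down to, e.g. if you’d rather do it in a Custom Function node. This assumes the usual core SRP macros (TEXTURE2D_X, SAMPLER) are in scope, as they are in Shader Graph generated code, and SampleBlitColor is just a name I made up for illustration:

```hlsl
// Minimal sketch, not the actual Shader Graph generated code.
// Declarations mirror what URP's blit setup (Blit.hlsl in the core SRP
// package) provides when the Full Screen Pass runs.
TEXTURE2D_X(_BlitTexture);
SAMPLER(sampler_LinearClamp);

// Hypothetical helper: read the color that "Fetch Color Buffer" copied for us.
float4 SampleBlitColor(float2 screenUV)
{
    return SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, screenUV);
}
```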

Note that the above gets you to roughly 11:19 in the video, and that is already a functional full-screen shader graph; from there you can follow the rest of the video without issue.

I also confirmed that it works in WebGPU! :slight_smile:

@BenCloward can you clarify about 2023.3 URP? In your video at 1:05 it shows this,

but in the editor there is no Custom Pass under Volume for me,

There is also now apparently no “Custom Color Buffer” node in Shader Graph? I feel like I’m missing something or doing something wrong, or things really have changed quite a bit since the version you were using…

Anyway, in the process of figuring out how to get it working without the two-pass approach, I ticked the “Fetch Color Buffer” checkbox and used _BlitTexture. But URP also has a Scene Color node (I guess the URP version of the HD Scene Color in your video… or was it renamed since then?), and I’ve previously assumed this means the Opaque Texture of the URP Asset. So without using Fetch Color Buffer, I got it working like this, which is closer to your video:
[Image: c.png]

In the editor I don’t seem to need the Opaque Texture checkbox ticked in the URP Asset, but IIRC it needs to be enabled when this runs in a build (it’s been a while since I tinkered with anything related). Does the Scene Color node just sample “_CameraOpaqueTexture”?
[Image: d.png]
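For what it’s worth, poking around the URP package, I believe the Scene Color node resolves to SampleSceneColor() from DeclareOpaqueTexture.hlsl, which does read _CameraOpaqueTexture. A rough sketch of the equivalent HLSL (the wrapper name is mine, not anything generated):

```hlsl
// Sketch of what I believe the URP Scene Color node amounts to.
// DeclareOpaqueTexture.hlsl declares _CameraOpaqueTexture and SampleSceneColor().
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareOpaqueTexture.hlsl"

// Hypothetical wrapper, named for illustration only.
float3 MySceneColor(float2 screenUV)
{
    // Only valid when the Opaque Texture option is enabled on the URP Asset;
    // otherwise _CameraOpaqueTexture has no meaningful contents.
    return SampleSceneColor(screenUV);
}
```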

Anyway, I’m assuming Scene Color is better for performance, but I’m wondering: is the two-pass approach something different that I’m not clued into, and should I be using Fetch Color Buffer and _BlitTexture instead? Can you offer any clarification on this? Thanks.

The reason you’re not seeing those features is that they’re specific to HDRP. I’m showing everything in HDRP, and there are several differences in the way it’s set up vs. URP. Documentation for the URP-specific way of setting it up can be found here: How to create a low-code custom post-processing effect | Universal RP | 17.0.3

Ah okay, my mistake: I was thrown off by the “both HDRP and URP” and thought the video was for URP too (I’m guessing you clarified it wasn’t somewhere and I skipped over it without noticing). Anyway, I still got it working, and as I’ve mentioned previously elsewhere, I’ve learned a lot over time from your videos, so thanks again for doing those :slight_smile:


How did the editor get to the point where this is even possible…? You would think the Shader Graph team would be required to make all new features work with all render pipelines…

As @BenCloward mentioned, post-processing is set up differently in URP and HDRP, but Shader Graph allows you to author custom post-processing effects for both.

To your point, @funkyCoty, improving pipeline-agnostic content creation is a high priority for us, and Shader Graph is already key to enabling cross-pipeline content.
In some areas, like post-processing, the implementation differences between URP and HDRP don’t yet allow Shader Graph to let you author all effects in a “one size fits all” way. Rest assured, though, we’re working across the board to improve these workflows.

Good video!
I’ve been using Unreal exclusively for the past two weeks and find all shader work to be much simpler. For example, a post process is just any shader whose output node goes to the screen buffer. Now I’m wondering: why does Unity need two shaders, and why is there a separate read-screen node per buffer? Unreal’s read-scene node, with a popup menu that gives you a list of all available buffers, seems better, as it allows for swap-and-try.

Then take URP’s, make it the standard in the base Scriptable Render Pipeline. Make HDRP use it. Have Shader Graph only care about that ONE implementation. You guys are shooting yourselves in the foot with different implementations for all of these core features.
