Hi. So I’m still just very bad at shaders. Long story short, I need a shader for a shockwave effect over the top of the entire scene. I’m trying to follow this tutorial here, which led to this video here, which pointed to this link. I tried to follow along as best as I could, but I feel like the information is either way over my head or outdated or something. I can’t seem to get a full screen shader to work following these instructions. I think my main problem comes after I’m supposed to add the blit. When I do, I get an error in the console that reads:
“You can only call cameraColorTarget inside the scope of a ScriptableRenderPass. Otherwise the pipeline camera target texture might have not been created or might have already been disposed.”
I don’t know what this means. I don’t see anyone else who needs full screen shaders, or anyone following this tutorial, getting this error and having this issue, so I must be doing something wrong. I tried to find some information outside of this guy’s tutorial. The most intriguing thing I found was this GitHub repository, which, if you scroll near the bottom of it, has a graph called Impact that is very similar to the shockwave shader graph I want. I was eager to try it out, but it turns out these shaders don’t work anymore, since they were written for an old version of the URP or something like that. So I’m wondering, after all this mindless running around and not accomplishing anything, is there a better way to get full screen shaders working? Are there some other resources or tutorials out there, preferably from somewhere other than this gamedevbill guy, that cover how to get full screen shaders in the URP working, with information that is up to date and ACTUALLY works?
Are you using the 2D Renderer? If so, there’s this example here
Look for the Heat Haze Overlay scene. It showcases a shockwave-like effect (heat distortion). It uses the _CameraSortingLayerTexture.
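If you want to poke at the idea in code, the core of a heat-haze distortion is just sampling that texture with nudged UVs. A minimal sketch, assuming the 2D Renderer’s Camera Sorting Layer Texture is enabled (the function name and offset math here are illustrative, not from the sample):

TEXTURE2D(_CameraSortingLayerTexture);
SAMPLER(sampler_CameraSortingLayerTexture);

float4 SampleDistorted(float2 screenUV, float2 noiseOffset)
{
    // Nudge the screen UVs by a small animated offset to fake refraction
    float2 distortedUV = screenUV + noiseOffset * 0.01;
    return SAMPLE_TEXTURE2D(_CameraSortingLayerTexture, sampler_CameraSortingLayerTexture, distortedUV);
}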
See this post for some details on how to get it up and running:
I was able to get full screen shaders by setting my project up in a similar way to theirs. Thanks!
Just in case anyone else trips over this as well: I ran into the same issue following the same tutorial, i.e., “You can only call cameraColorTarget inside the scope of a ScriptableRenderPass. Otherwise the pipeline camera target texture might have not been created or might have already been disposed.”
The issue was that I had multiple cameras in the scene using the same URP ForwardRenderer: one was the main Base camera, and another was a stacked Overlay camera for the UI.
The solution was to make a copy of the ForwardRenderer, so that you have one ForwardRenderer_Main and another ForwardRenderer_UI. Add both to the pipeline asset’s Renderer List; you will see a ‘+’ to add another ForwardRenderer.
Then you can add the Blit feature to the ForwardRenderer of your choice. Set the main camera to use ForwardRenderer_Main and the UI camera to use ForwardRenderer_UI. This means the shockwave will only appear on the main scene and the UI will be unaffected. Hope this helps.
Since I was directed here via a web search engine, I’m going to do a favor for everyone who took the same route looking for an answer to the same question:
This is Unity’s URP 16 documentation for custom post-processing effects
A short summary of the steps involved in case the link above breaks:
Create your Shader
- Create > Shader Graph > URP > Fullscreen Shader Graph.
- Add a URP Sample Buffer node (this grabs the screen texture to be processed by your shader).
- In the URP Sample Buffer node’s Source Buffer dropdown menu, select BlitSource.
- Do whatever you want to the input and send it to the fragment Base Color output (image from Unity’s docs; a sketch of typical distortion math follows this list).
- Save your graph.
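For intuition only (this isn’t one of the steps): the math a shockwave graph typically computes before feeding UVs into the URP Sample Buffer node looks something like this in HLSL. Every name and the falloff formula here are made up for illustration:

// Push sample UVs radially outward inside an expanding ring.
// center, radius, thickness and strength would come from material parameters.
float2 ShockwaveUV(float2 uv, float2 center, float radius, float thickness, float strength)
{
    float2 toCenter = uv - center;
    float dist = length(toCenter);
    // 1 inside the ring band, fading to 0 at its edges
    float ring = 1.0 - smoothstep(0.0, thickness, abs(dist - radius));
    // Avoid dividing by zero at the exact center
    float2 dir = toCenter / max(dist, 1e-5);
    return uv + dir * ring * strength;
}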
Applying the Effect
- Create a Material and assign your Shader to it.
- Add a Full Screen Pass Renderer Feature to the URP Renderer you want to apply your shader to.
- Assign the material to the Full Screen Pass Renderer Feature’s “Post Process Material”.
- Set Injection Point to After Rendering Post Processing.
- Set Requirements to Color.
BONUS ROUND: for people who prefer to create their shaders directly in HLSL code
- Set up a pass-through shader like so:
- Add all the parameters you need:
- Save and select the shader graph.
- View the Generated Shader.
- Grab everything and copy-paste it into a new shader file.
- Edit that new shader.
- Rename it so that it doesn’t get lost in the Shader Graphs menu.
- Find: SurfaceDescription SurfaceDescriptionFunction(SurfaceDescriptionInputs IN)
- Replace it with the following and edit as necessary.
// =================================================================
// Frag Out
// =================================================================
SurfaceDescription SurfaceDescriptionFunction(SurfaceDescriptionInputs IN)
{
    SurfaceDescription surface;
    // Normalized screen coordinates of the current fragment
    const float2 inputUvs = IN.NDCPosition.xy;
    // Read the screen color from the BlitSource buffer
    const float4 inputColor = Unity_Universal_SampleBuffer_BlitSource_float(inputUvs);
    // Insert your own code, modifying inputColor
    float4 outputColor = float4(inputColor.r, inputColor.g, inputColor.b, 1);
    surface.BaseColor = outputColor.xyz;
    surface.Alpha = 1;
    return surface;
}
YOU AREN’T DONE YET!
There is ANOTHER instance of the SurfaceDescriptionFunction. You must replace that one as well with your edited code. In total, there are TWO instances of that function, one in the Blit pass and one in the DrawProcedural Pass. BOTH must be replaced.
If this didn’t work for you, post your updated solution below.
Bonus Addendum: _BlitTexture access
By default, shaders generated from a graph will contain the following function used to read the full-screen texture for processing.
float4 Unity_Universal_SampleBuffer_BlitSource_float(const float2 uv)
{
    uint2 pixelCoords = uint2(uv * _ScreenSize.xy);
    return LOAD_TEXTURE2D_X_LOD(_BlitTexture, pixelCoords, 0);
}
This shader will have some shortcomings in graphics pipelines that involve upscaling or downscaling of image buffers, often creating unsightly aliasing artifacts on shapes or small text.
This is because it turns the float2 uv into uint2 pixelCoords in order to satisfy LOAD_TEXTURE2D_X_LOD’s requirement for uint2 coordinates, and that conversion reduces each lookup to a single texel with no filtering. For example, at any render scale other than 1.0, uv * _ScreenSize lands between texels and the uint2() cast simply chops off the fraction.
wibbly wibbly wobbly result:
You can solve this issue by keeping the UVs as floats, using a function that reads your texture with float UVs instead:
float4 Unity_Universal_SampleBuffer_BlitSource_float(const float2 uv)
{
    return SAMPLE_TEXTURE2D(_BlitTexture, SamplerState_Trilinear_Repeat, uv);
}
Add this under the CBUFFER_END block with all your shader parameters:
SAMPLER(SamplerState_Trilinear_Repeat);
like so:
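(The screenshot didn’t survive, so here’s roughly what that placement looks like; the parameter is illustrative.)

CBUFFER_START(UnityPerMaterial)
float4 _ExampleParameter; // whatever parameters your graph declares
CBUFFER_END

// Unity builds an inline sampler from the name: trilinear filtering, repeat wrapping
SAMPLER(SamplerState_Trilinear_Repeat);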
result:
“Add a Full Screen Pass Renderer Feature to the URP Renderer”
how?
thanks
In the Project window, select the URP Renderer asset your pipeline asset points to, scroll to the bottom of the Inspector, and click Add Renderer Feature > Full Screen Pass Renderer Feature.
Lemme know if I missed anything
How would I go about getting access to the complete _BlitTexture (in Shader Graph) and passing that as an input to a custom function?
Step 1: Create a node accessing the _BlitTexture in the shader graph:
- Right-click an empty spot on the canvas:
Step 2: Create your custom function node:
- Right-click an empty spot on the canvas:
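If it helps, the body of a string-mode custom function can be as simple as this. In and Out are whatever you name the node’s ports (my names, for illustration), with In wired from the URP Sample Buffer node’s output:

// Invert the blit color passed in from the URP Sample Buffer node
Out = float4(1.0 - In.rgb, In.a);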
Sorry for the late reply.
And in case you need a link to the Custom Function Node docs…
Must be some missing include:
SamplerState_Trilinear_Repeat comes up as an undeclared variable.
EDIT:
Add the line:
SAMPLER(SamplerState_Trilinear_Repeat);
before the function that uses it.
Good catch, added this to the instructions
Hi. Is there a way to add properties to the full screen shader? It seems adding a property block to the full screen shader breaks the shader graph.
What I’m showing in the “Bonus Round” is how to convert a shader graph into a plain shader: you use the shader graph’s View Generated Shader function to output code, then paste that code into a new .shader file.
Shader Graphs aren’t supposed to have property blocks like shaders are, which might be why adding a property block to a shader graph is breaking it.
If you just want to use the shader graph and don’t need direct access to the HLSL code, you can skip the Bonus Round!
Now, you can’t add properties to a shader graph because they’re actually called parameters.
On the left side, there’s a “+” button that shows a dropdown menu for adding parameters.
Just select the type of parameter you need, name it…
And then you can just drag it onto the canvas.
Connect it… mess with any settings for this parameter in the Graph Inspector…
Save your graph… Make a material… Assign the graph to the material…
And your parameter should show up on the material.
If I use View Generated Shader (you don’t have to do this step!), the custom parameter is automatically added to the properties block.
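For instance, after adding a Float parameter (I’ll call it _ExampleStrength; the name is just for illustration), the generated code picks up entries roughly like these:

Properties
{
    _ExampleStrength("Example Strength", Float) = 1
}

// ...and in the HLSL section, inside the per-material constant buffer:
CBUFFER_START(UnityPerMaterial)
float _ExampleStrength;
CBUFFER_END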
Hope that answers your question.
Are they ever gonna release actual HLSL support for Fullscreen shaders? I can’t stand these ugly Shader Graphs and trying these workarounds doesn’t feel any better…
You could always do this. Just write a render pass and blit with whatever custom shader you want. (The API for this changes a lot between URP versions, which is definitely annoying, though.)
There’s an example full screen shader (supporting the new Blitter API) on the URP documentation page.
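For reference, that example is shaped roughly like the following. Treat it as a sketch patterned after the URP docs; the shader name and the tint in the fragment are placeholders:

Shader "Hidden/Example/ColorBlit"
{
    SubShader
    {
        Tags { "RenderType" = "Opaque" "RenderPipeline" = "UniversalPipeline" }
        ZWrite Off Cull Off
        Pass
        {
            Name "ColorBlitPass"

            HLSLPROGRAM
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
            // Blit.hlsl provides the fullscreen vertex shader (Vert), the
            // Attributes/Varyings structs, _BlitTexture and common samplers
            #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"

            #pragma vertex Vert
            #pragma fragment Frag

            float4 Frag(Varyings input) : SV_Target
            {
                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
                float4 color = SAMPLE_TEXTURE2D_X(_BlitTexture, sampler_LinearClamp, input.texcoord);
                // Placeholder effect: a mild tint so you can see the blit working
                return color * float4(0.8, 1.0, 0.9, 1.0);
            }
            ENDHLSL
        }
    }
}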