One is a custom pass to draw all the glass windows. It reads HDSceneColor to create a blurred effect on the glass surface (it is in a custom pass so that it can also blur the transparent water previously drawn - long story that I don’t want to get into here).
Another is our UI, which is in camera space, so that we can use a custom shader that again draws a blurred background using HDSceneColor.
Any shader in a custom pass that samples HDSceneColor now reads bogus information for 75% of the screen. It’s as if the HDSceneColor buffer being read from is at the 50% scale (so the valid region only covers 0.5 × 0.5 = 25% of the target), but the current shader is seemingly operating at full resolution.
I can’t upload more images - but basically… same issue in the UI.
Furthermore - the UI “Tiles” shown before now also jitter like crazy:
So after a long and tedious wall of text, I guess my first question is… can DLSS play well alongside custom passes using HDSceneColor (shader graph node)?
Secondly, can screen space UI be used at all without causing this crazy jitter?
I believe I’ve seen this somewhere in the issue tracker, as a bug.
But I’m not sure - it’s been a while, and it might have been a different issue. I can only hope it’s the same one, as this sounds like a critical issue.
It’s nothing as fancy as hand-written shaders or access to parameters such as “_ScreenSize”.
Here’s what we have… a pretty simple custom pass with Before Post Process injection point:
And then a Shader Graph that simply reads from HD Scene Color using screen coordinates. The screen coordinates are the issue: they seem to be in full resolution, while the HD Scene Color buffer seems to be at 50% resolution.
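If my understanding is right, this is roughly what’s going on under the hood (just a sketch, not working code - I’m assuming the HD Scene Color node reads HDRP’s color pyramid, _ColorPyramidTexture, and that _RTHandleScale holds the current viewport scale):
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"
// With DLSS at 50%, the rendered viewport only covers UVs in [0, _RTHandleScale.xy]
// of the allocated target, but Screen Position still supplies UVs across the full
// [0, 1] range - so most samples land outside the rendered region.
float4 SampleSceneColorNaive(float2 screenUV)
{
    return SAMPLE_TEXTURE2D_X_LOD(_ColorPyramidTexture, s_linear_clamp_sampler, screenUV, 0);
}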
But the reason this is a little baffling is because:
DLSS injection is set to AFTER POST PROCESS - this is when the final upscale should be happening in my understanding.
Custom pass injection point is set to BEFORE POST PROCESS, so all buffers should be in the 50% dimensions.
So I guess “Screen Pos” is just always seen as full res… so how would one read from HD Scene Color in a manner that is agnostic to the DLSS percentage?
NOTE - even reading HD Scene Color using default UV (no input), the issue is the same. So it seems the HD Scene Color node does not have a correct understanding of DLSS scaling.
Anyway, this still leaves the insane jittering shown in the video in my original post, which I will dig into separately to try to find a solution.
When using a blurred sample you have to be even more careful, as there you will sample from lower mip levels of the color pyramid, which will contain white spill at the upper and left edges:
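Something along these lines helps (just a sketch under assumptions: _RTHandleScale is the viewport scale and _ScreenSize.zw is one over the render size - I believe Common.hlsl also ships ClampAndScaleUV-style helpers that do this more robustly):
// Scale the full-res UV into the current viewport, then clamp it away from the
// edges of the valid region by half a texel at the requested mip, so bilinear
// filtering at lower mip levels doesn't pull in the spill from outside it.
// Assumes HDRP's ShaderVariables.hlsl globals (_RTHandleScale, _ScreenSize) are in scope.
float2 ScaleAndClampForMip(float2 fullResUV, float mipLevel)
{
    float2 halfTexel = 0.5 * _ScreenSize.zw * exp2(mipLevel);
    return clamp(fullResUV * _RTHandleScale.xy, halfTexel, _RTHandleScale.xy - halfTexel);
}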
I think in v2022.3.8, the use of HD Scene Color seems to work OK in a custom pass. I will have to check again to confirm this, because I only recently switched back to a custom pass for our glass, and I didn’t notice an issue.
But I also use FSR2, and this still has the issue for sure.
So the only way I know to solve it is to manually adjust the HD Scene Color input UV by the scaling factor that is currently active. Fixing this incorrect sampling of scene color under FSR2 is on my to-do list, so I will report back here as soon as I can.
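My current plan is a tiny Custom Function that just exposes the scale, so it can be multiplied onto the Screen Position before it goes into HD Scene Color. Untested sketch - and whether _RTHandleScale or _RTHandlePostProcessScale is the correct factor probably depends on the injection point:
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"
// Outputs HDRP's current RTHandle viewport scale (DLSS/FSR/dynamic res aware) so a
// Shader Graph can multiply it onto the UV before the HD Scene Color node.
void GetRTHandleScale_float(out float2 Scale)
{
    Scale = _RTHandleScale.xy;
}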
Hi there, I think I found the problem. I have the same problem as you, but in my case I’m using a custom post process fullscreen shader for a raindrop effect. The problem is that the screen position, which comes from input.texcoord and is used by HD Scene Color or the HD Sample Buffer node (Source Buffer: PostProcessInput), is not at the screen resolution. I don’t know if it’s a Unity bug or what, because I inject DLSS Before Post Process, same as you. But after reading several docs and trying to track down the problem, I ended up with my own Custom Function node that replaces HD Scene Color / HD Sample Buffer (PostProcessInput), like this:
Here’s what you should do:
Create a text file named “DynamicRescale” on your desktop or anywhere else
Copy this code into it:
// UNITY_SHADER_NO_UPGRADE
#ifndef DYNAMICRESCALE_INCLUDED
#define DYNAMICRESCALE_INCLUDED
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/PostProcessing/Shaders/RTUpscale.hlsl"
TEXTURE2D_X(_CustomPostProcessInput);
// Input:
//   UVsIn    : float2 from a Screen Position (Default) node in Shader Graph
// Output:
//   ColorOut : float4 color sampled with the UV corrected when DLSS/FSR is active
void DynamicRescaleUV_float(float2 UVsIn, out float4 ColorOut)
{
// Use HDRP’s canonical scaling for RTHandles (DLSS/FSR/dynamic res aware)
float2 uvSample = UVsIn * _RTHandlePostProcessScale.xy;
// Use HDRP’s default linear clamp sampler already defined in includes
ColorOut = SAMPLE_TEXTURE2D_X_LOD(_CustomPostProcessInput, s_linear_clamp_sampler, uvSample, 0);
}
#endif // DYNAMICRESCALE_INCLUDED
Change the extension from “.txt” to “.hlsl”
Import this file into your project (I suggest you duplicate your shader first, for safety)
Open your Shader Graph, then search for “Custom Function”
Set Type to “File”, Source to the imported .hlsl file, and Name to “DynamicRescaleUV”
Add an Input of type Vector2 named “UVsIn” and an Output of type Vector4 named “ColorOut”
Connect a Screen Position node (Default mode) to UVsIn, then use ColorOut wherever you previously used HD Sample Buffer / HD Scene Color
I’m using Unity 6.2 with HDRP 17.2 btw, and it still has the same problem you describe. I hope this solves your problem the same way it solved mine!