HDRP Post Processing shaders - accessing GBuffers

It will probably be a long post, so for starters here is a short tl;dr:

So far I have been able to fork HDRP and plug it back into Unity, and add some code to the embedded post-processing system to do some shader work, as in FinalPass.

I’ve created a new material using:


And then, after all of the built-in post processing (but before the FinalPass call), I use it with:

HDUtils.DrawFullScreen(cmd, m_OutlineMaterial, destination);

The shader seems to work just fine. I am able to access screen depth using LoadCameraDepth(input.vertex.xy);

Here comes the part that I don’t understand at all. In my shader, which works fine in 2018.3 (HDRP 4.x), I’ve got something like this:

TEXTURE2D_SAMPLER2D(_MainTex, sampler_MainTex);
TEXTURE2D_SAMPLER2D(_GBufferTexture0, sampler_GBufferTexture0);
TEXTURE2D_SAMPLER2D(_GBufferTexture1, sampler_GBufferTexture1);
TEXTURE2D_SAMPLER2D(_GBufferTexture2, sampler_GBufferTexture2);
TEXTURE2D_SAMPLER2D(_GBufferTexture3, sampler_GBufferTexture3);
TEXTURE2D_SAMPLER2D(_CameraDepthTexture, sampler_CameraDepthTexture);
TEXTURE2D_SAMPLER2D(_NormalBufferTexture, sampler_NormalBufferTexture);

And all of those were accessible. By reading the source code of the current HDRP I’ve noticed that some of the macros have changed, and now I have to declare textures and samplers separately with TEXTURE2D_X and SAMPLER. Despite that, no matter how I try to access the gbuffer I get unexpected results - a grey output, for example.
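For reference, a minimal sketch of what the newer-style declarations look like (the macro names are taken from the current HDRP/SRP Core sources; the exact expansion can differ between package versions):

```hlsl
// Separate texture and sampler declarations, replacing the old
// combined TEXTURE2D_SAMPLER2D macro.
TEXTURE2D_X(_GBufferTexture0);          // _X expands to a texture array on XR platforms
SAMPLER(sampler_GBufferTexture0);

float4 LoadGBuffer0(uint2 positionSS)
{
    // LOAD_TEXTURE2D_X is an unfiltered texel fetch that also
    // resolves the correct array slice for the current eye.
    return LOAD_TEXTURE2D_X(_GBufferTexture0, positionSS);
}
```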

In Lit.hlsl there is something about accessing the gbuffer textures, but when I try to do the same in my shader I get compilation errors about missing functions/structs, no matter what sources I include.

I’ve also searched for answers in the Debug sources, especially in DebugFullScreen.shader. There was something like this:

float depth = LoadCameraDepth(input.positionCS.xy);
PositionInputs posInput = GetPositionInput(input.positionCS.xy, _ScreenSize.zw, depth, UNITY_MATRIX_I_VP, UNITY_MATRIX_V);

BSDFData bsdfData;
BuiltinData builtinData;
DECODE_FROM_GBUFFER(posInput.positionSS, UINT_MAX, bsdfData, builtinData);

But no matter which sources I include (even the same ones as in the shader source, in the same order), I am getting an error that BSDFData cannot be resolved.


  1. Am I even on the correct path to implement my custom post-processing effects by using the HDUtils.DrawFullScreen() method?
  2. Where can I find info about the macros in HDRP? For example, I would like to know the difference between TEXTURE2D and TEXTURE2D_X, what LOAD_TEXTURE2D_X does, and so on. Currently I am just making some not-too-good assumptions and working mainly by making mistakes and learning from them.
  3. Why can’t I use BSDFData, and what does it contain?
  4. What should I do to be able to access all of the gbuffer textures? Is there something wrong with the way I am attempting to do my post-process effect?

It would be awesome if someone could shed some light on my issue. I am very new to shaders in general, and to how HDRP even works.


HDRP is getting custom pass soon: https://github.com/Unity-Technologies/ScriptableRenderPipeline/pull/4317



It is not DebugFullScreen.shader that you should look at, but DebugViewMaterialGBuffer.shader. This is a self-contained shader that accesses the GBuffer. When accessing the GBuffer you need to use the Lit.hlsl file, as the GBuffer is only defined with our Lit shader (Lit.hlsl contains the surface and BSDF data). See the list of includes in this shader; you should include the same list.
You can also look at how DebugViewMaterialGBuffer is called from C#.

TEXTURE2D_X is used to be compatible with the single-pass instancing VR path. All of our render target textures are texture arrays to deal with multiple eyes, and _X is a macro that expands to an array if the platform supports VR. In practice, if you want your post process to be compatible with VR, you must use these macros.

Note that we don't recommend accessing the GBuffer, as people can switch to the forward renderer and then there is no GBuffer (it all depends on which features you want to support), and this doesn't support our forward-only materials (hair, fabric, stacklit...). However, you can access what we call NormalData: depth, normal and smoothness. This also works in forward and with forward materials. You have an example in the ScreenSpaceReflections.compute file.
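To illustrate, reading that NormalData from a full-screen pass boils down to something like the sketch below (the DecodeFromNormalBuffer helper and the NormalData fields are assumed from HDRP's NormalBuffer.hlsl; the exact overload can differ between package versions, so check your own sources):

```hlsl
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/NormalBuffer.hlsl"

// Works in both deferred and forward, unlike the Lit-only GBuffer path.
float3 LoadWorldNormal(uint2 positionSS)
{
    NormalData normalData;
    // Decodes the packed normal buffer at this pixel into
    // world-space normal + perceptual roughness.
    DecodeFromNormalBuffer(positionSS, normalData);
    return normalData.normalWS;
}
```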

Hope this helps.


This seems like an awesome feature, but won’t it only be available in HDRP 7.x, making it compatible only with the Unity 2019.3 cycle?

It sure helps, but following your advice I’ve run into a problem that I’ve faced some time before. When I include the same list as in DebugViewMaterialGBuffer.shader I get an error that I don’t quite understand:

Shader error in 'Hidden/ShadowOfTheRoad/Outline': undeclared identifier 'FastLog2' at /Users/xxx/Projects/Unity/_packages/ExtendedScriptableRenderPipeline/com.unity.render-pipelines.high-definition/Runtime/Material/SubsurfaceScattering/SubsurfaceScattering.hlsl(171) (on d3d11)

Compiling Vertex program

The include list in my shader is exactly the same as in DebugViewMaterialGBuffer.shader, and the order of the includes is identical as well.

My whole shader code is as follows:


Shader "Hidden/ShadowOfTheRoad/Outline"
{
    SubShader
    {
        Tags{ "RenderPipeline" = "HDRenderPipeline" }

        Pass
        {
            ZWrite On ZTest Always Blend Off Cull Off

            HLSLPROGRAM
                #pragma vertex Vert
                #pragma fragment Frag

                #include "Outline.hlsl"
            ENDHLSL
        }
    }
    Fallback Off
}


// Outline.hlsl
#pragma target 4.5
#pragma only_renderers d3d11 ps4 xboxone vulkan metal switch

#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"

#include "Packages/com.unity.render-pipelines.high-definition/Runtime/ShaderLibrary/ShaderVariables.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/Material.hlsl"

#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Debug/DebugDisplay.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/Lit/Lit.hlsl"

float4 _outlineColor;

struct Attributes
{
    uint vertexID : SV_VertexID;
};

struct Varyings
{
    float4 positionCS : SV_POSITION;
    float2 texcoord : TEXCOORD0;
};

Varyings Vert(Attributes input)
{
    Varyings output;
    output.positionCS = GetFullScreenTriangleVertexPosition(input.vertexID);
    output.texcoord = GetFullScreenTriangleTexCoord(input.vertexID);
    return output;
}

float3 Frag(Varyings input) : SV_Target
{
    float depth = LoadCameraDepth(input.positionCS.xy);
    PositionInputs posInput = GetPositionInput(input.positionCS.xy, _ScreenSize.zw, depth, UNITY_MATRIX_I_VP, UNITY_MATRIX_V);

    BSDFData bsdfData;
    BuiltinData builtinData;
    DECODE_FROM_GBUFFER(posInput.positionSS, UINT_MAX, bsdfData, builtinData);

    float3 color = _outlineColor.rgb * depth;

    return color;
}

Am I missing something obvious?



Just a random guess, but IIRC the #pragma directives don’t work in includes; you need to move this part into the .shader file:

  • #pragma target 4.5
  • #pragma only_renderers d3d11 ps4 xboxone vulkan metal switch

I guess it is a good candidate for your failure, as the code that fails is:

#if (SHADER_TARGET >= 45)
uint FastLog2(uint x)
{
    return firstbithigh(x);
}
#endif

So I guess the shader compiler doesn’t find any #pragma target 4.5 directive because it is in an include.
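In other words, the fix would be to move the two pragmas from Outline.hlsl up into the HLSLPROGRAM block of the .shader file, roughly like this (a sketch based on the shader posted above):

```hlsl
// In the .shader file, next to #pragma vertex/fragment, so the
// compiler sees the target directive before compiling the program:
HLSLPROGRAM
    #pragma target 4.5
    #pragma only_renderers d3d11 ps4 xboxone vulkan metal switch

    #pragma vertex Vert
    #pragma fragment Frag

    #include "Outline.hlsl"   // with the pragmas removed from the include
ENDHLSL
```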


Awesome, you are exactly right. I would have struggled much more if it wasn’t for you :slight_smile: Thanks - now I will try to access the data I need and hopefully recreate my effects.

But I won’t hesitate to ask if I encounter other problems :slight_smile:

What if I make a custom post process for a non-VR game? Do I have to use 2D arrays for single textures?

I’ll try asking here, since it’s probably the closest thread I found to what I’m looking for. I really need to be able to read the diffuse buffer for the post-process effect I’m working on - screen-space GI (+AO). I need to be able to bounce light even off surfaces that are fully shadowed/black, and to light them up properly I need to know their (diffuse) color. Sadly I have had no luck accessing it; the buffer is fully black and I have no idea why, what else I should include, or how else to sample it.
I’ve had some luck with this: "TEXTURE2D_X(_GBufferTexture0); + LOAD_TEXTURE2D_X(_GBufferTexture0, positionSS);", but only when reading _GBufferTexture1.

I have also tried using macros I found in the DebugViewMaterialGBuffer.shader and/or Lit.hlsl you talked about here

float4 GetDiffuseColor(float2 positionSS) {
    BSDFData bsdfData;
    BuiltinData builtinData;
    DECODE_FROM_GBUFFER(positionSS, UINT_MAX, bsdfData, builtinData);
    return float4(GetDiffuseOrDefaultColor(bsdfData, 1).rgb, 1);
}
But same result → all black. So I’d like to know: how can I actually sample the diffuse buffer in an HDRP post process when using deferred shading? Currently I can only work with the colors from the source RT, which is fine for well-lit scenes but not when trying to light up dark areas - there you lose a lot of color info, making the effect very dull and ineffective.

Test scene (dark areas are severely lacking):

Currently I can only achieve the middle image, which does not really make it look like the light is actually bouncing and lighting other surfaces. The 3rd image is “artificially boosted” to resemble what I think it should approximately look like, but for now I’m just adding a bit of white to any sampled color so that even black surfaces show up. That wouldn’t make sense for, say, a fully shadowed green surface - it would look like the original color is dark grey :confused:

I’d like the middle variant to possibly be a fallback in the forward rendering case - why not, just sample the source. But when available (= deferred), I really need to know the diffuse color that should be affected. Any help would be really appreciated; I’ve been stuck for 3 days just on getting this to work.
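For completeness, the direct read I’m attempting amounts to something like this sketch (assuming HDRP’s standard Lit deferred layout, where GBufferTexture0.rgb stores the base color - this layout is defined in Lit.hlsl and may differ per version, and the GBuffer may simply not be bound at the point in the frame where the post process runs):

```hlsl
// Hypothetical direct read of the albedo from the Lit deferred GBuffer.
TEXTURE2D_X(_GBufferTexture0);

float3 LoadBaseColor(uint2 positionSS)
{
    // .rgb is the base color in the assumed GBuffer0 layout;
    // .a holds specular occlusion and is ignored here.
    return LOAD_TEXTURE2D_X(_GBufferTexture0, positionSS).rgb;
}
```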