HDRP custom ImageEffect and _CameraDepthNormalsTexture??

I’m trying to implement a simple image effect shader in HDRP, but I’m having no luck accessing decoded normals from _CameraDepthNormalsTexture…

Actually, what I really need are world-space normals. I have also tried the code below with no luck!

#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/NormalBuffer.hlsl"

// Load normal and roughness from HDRP's normal buffer.
// positionSS is the pixel coordinate of the current fragment, e.g. uint2(input.positionCS.xy).
NormalData normalData;
DecodeFromNormalBuffer(positionSS, normalData);
float3 N = normalData.normalWS;
float roughness = normalData.perceptualRoughness;

Can someone please show me clean code to render world-space normals in HDRP?

HDRP seems to have messed up lots of things…

There is zero clear documentation! Very frustrating when simple stuff is broken…


Come on Unity, this is such a simple task, why is there no answer? How on earth can you get normals in HDRP?

Including #include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/NormalBuffer.hlsl"

gives an error: real3 cannot be found?

Check out the other post-processing shaders. There are a lot of changes HDRP makes in terms of how shaders are written and what needs to be included by default. If you are including a .cginc file or ever calling tex2D, you’re going to have a bad time.
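For example, where built-in pipeline code declares a sampler2D and calls tex2D, the SRP core ShaderLibrary expects the texture declaration and sampling macros instead. A rough before/after sketch, assuming Common.hlsl is included (the texture name here is just a placeholder):

// Built-in pipeline style -- this is what breaks when mixed with HDRP includes:
//   sampler2D _MainTex;
//   float4 col = tex2D(_MainTex, uv);

// SRP / HDRP style using the core ShaderLibrary macros:
TEXTURE2D(_MainTex);
SAMPLER(sampler_MainTex);

float4 SampleMainTex(float2 uv)
{
    return SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, uv);
}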

_CameraDepthNormalsTexture has likely been killed off in the HDRP in favor of something else with better accuracy. What that is and how to use it, I have no idea as I’ve not tried using the HDRP yet.

The lack of documentation is a pain right now, I’ll agree, but the SRPs still feel like they’re in a state of constant flux, so documentation is likely to be wrong by the time they write it.

I just tripped over this same problem, and googling all day hasn’t helped one bit.
Is there no porting guide or documentation for this kind of stuff over a year after the original post?
I just need to access GBuffer data in my surface shader for a transparent pass applied to meshes. This is not something done in a post effect or even a UI-type pass; it needs to happen earlier in the rendering.

I’m able to get part of my effect working because I can still get depth information from _CameraDepthTexture,
and I’m getting valid GBuffer diffuse data from _CameraGBufferTexture0, which has been written before the transparent rendering pass, so that part is OK.

However,
_CameraGBufferTexture1
_CameraGBufferTexture2
don’t seem to have any data when I try to debug-draw them (on, say, a cube using screen-space UVs) during a transparent render pass. I guess the normal data is not there because it has not been resolved yet, since transparent-pass geometry can still write normal data?

Ideally I need the equivalent of _CameraDepthNormalsTexture in HDRP, one that at least contains the normals of all the opaque geometry. Then in the transparent pass I could sample that RT and do some custom masking/blending between the normals already there and what I calculate in the shader.
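If the NormalBuffer.hlsl decode from the first post does resolve during a transparent material pass (I haven’t verified that the buffer is actually bound there, and it presumably only contains opaque geometry, which is what I want anyway), the masking/blending I have in mind could look roughly like this. BlendWithOpaqueNormal, myNormalWS and blend are just placeholder names:

// Hypothetical helper for a transparent pass: blend the normal calculated in this
// shader with the opaque-geometry normal already stored in HDRP's normal buffer.
// positionSS is the fragment's pixel coordinate, e.g. uint2(input.positionCS.xy).
float3 BlendWithOpaqueNormal(uint2 positionSS, float3 myNormalWS, float blend)
{
    NormalData normalData;
    DecodeFromNormalBuffer(positionSS, normalData); // opaque normals written before transparents
    return normalize(lerp(normalData.normalWS, myNormalWS, blend));
}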

Do I need to capture the normal data from the GBuffer with a command buffer and copy it to a separate RT myself after the opaque geometry is done? Is that even possible?

Any answer to these?

The code provided by the thread starter worked for me as soon as I included
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
to resolve the "real3 cannot be found" error.

I used it inside a custom post processing shader.
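For anyone landing here later, here is the whole thing reduced to a minimal sketch. The VisualizeWorldNormal wrapper is my own naming, positionSS is the fragment’s pixel coordinate (e.g. uint2(input.positionCS.xy) in a full-screen pass), and exact signatures can differ between HDRP versions:

// Order matters: Common.hlsl defines real3 and friends that NormalBuffer.hlsl relies on.
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/NormalBuffer.hlsl"

// Returns the world-space normal stored in HDRP's normal buffer for this pixel,
// remapped to [0,1] so it can be written out directly as a debug colour.
float3 VisualizeWorldNormal(uint2 positionSS)
{
    NormalData normalData;
    DecodeFromNormalBuffer(positionSS, normalData);
    return normalData.normalWS * 0.5 + 0.5;
}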