Check out the other post processing shaders. There are a lot of changes the HDRP makes in terms of how shaders are written and what needs to be included by default. If you are including a .cginc file or ever calling tex2D, you're going to have a bad time.
_CameraDepthNormalsTexture has likely been killed off in the HDRP in favor of something else with better accuracy. What that is and how to use it, I have no idea as I’ve not tried using the HDRP yet.
The lack of documentation is a pain right now, I’ll agree, but the SRPs still feel like they’re in a state of constant flux, so documentation is likely to be wrong by the time they write it.
I just tripped over this same problem and googling all day hasn't helped one bit.
Is there no porting guide or documentation for this kind of stuff, over a year after the original post?
I just need to be able to access gbuffer data in my surface shader, for a transparent-pass shader applied to meshes. This isn't something done in a post effect or even a UI-type pass; it needs to happen earlier in the rendering.
I'm able to get part of my effect working because I can still read depth information from _CameraDepthTexture,
and I'm getting valid gbuffer diffuse data from _CameraGBufferTexture0, which has already been written before the transparent rendering pass, so that part is OK.
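For reference, the part that does work looks roughly like this (a minimal sketch using the SRP core texture macros rather than sampler2D/tex2D; the varyings struct and screen-UV plumbing are my own assumptions, not code from the thread):

```hlsl
// Sketch only -- declared with the SRP core macros, since that's what
// the HDRP shader library expects instead of sampler2D/tex2D.
TEXTURE2D(_CameraDepthTexture);
SAMPLER(sampler_CameraDepthTexture);
TEXTURE2D(_CameraGBufferTexture0);
SAMPLER(sampler_CameraGBufferTexture0);

float4 Frag(float4 positionCS : SV_Position) : SV_Target
{
    // Screen-space UV derived from the pixel position.
    float2 screenUV = positionCS.xy / _ScreenParams.xy;

    // Raw device depth from the opaque geometry -- this works.
    float rawDepth = SAMPLE_TEXTURE2D(_CameraDepthTexture,
                                      sampler_CameraDepthTexture, screenUV).r;

    // Diffuse data written by the opaque gbuffer pass -- also works.
    float4 gbuffer0 = SAMPLE_TEXTURE2D(_CameraGBufferTexture0,
                                       sampler_CameraGBufferTexture0, screenUV);

    return gbuffer0; // debug output so the sample is visible
}
```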
However,
_CameraGBufferTexture1
_CameraGBufferTexture2
don't seem to have any data when I try to debug-draw them with, say, a cube using screen-space UVs during a transparent render pass. I guess the normal data isn't there because it hasn't been resolved yet, since transparent-pass geometry can still write normal data?
Ideally I need an equivalent of _CameraDepthNormalsTexture that works in HDRP and at least contains the normals of all the opaque geometry. Then in the transparent pass I could sample that RT and do some custom masking/blending between the normals already there and what I calculate in the shader.
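To be concrete, what I'm after in the transparent pass is something like this (entirely hypothetical: _OpaqueNormalsTexture is a stand-in name for whatever RT would end up holding the resolved opaque normals, and the [0,1] encoding is an assumption):

```hlsl
// Hypothetical RT holding the opaque geometry's normals, resolved
// before the transparent pass runs.
TEXTURE2D(_OpaqueNormalsTexture);
SAMPLER(sampler_OpaqueNormalsTexture);

float3 BlendWithOpaqueNormals(float2 screenUV, float3 shaderNormalWS, float mask)
{
    // Assume normals are stored in [0,1]; remap back to [-1,1].
    float3 opaqueNormalWS =
        SAMPLE_TEXTURE2D(_OpaqueNormalsTexture,
                         sampler_OpaqueNormalsTexture, screenUV).xyz * 2.0 - 1.0;

    // Custom masking/blending between the normals already on screen
    // and what this shader computes.
    return normalize(lerp(opaqueNormalWS, shaderNormalWS, mask));
}
```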
Do I need to capture the normal data from the gbuffer myself with a command buffer and copy it to a separate RT after the opaque geometry is done? Is that even possible?
The code provided by the thread starter worked for me as soon as I included
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
to resolve the "real3 cannot be found" error.
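In other words, the fix is just adding that include near the top of the shader, since that file is where the real/real3/real4 precision aliases come from (the example usage below is mine, not from the thread):

```hlsl
// Common.hlsl from the core SRP package defines the real/real3/real4
// precision aliases; without it the "real3 cannot be found" error appears.
#include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"

real3 tint = real3(1.0, 0.5, 0.0); // compiles once Common.hlsl is included
```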