I’d like to save the PBS deferred G-buffers to disk so I can use the resulting textures in PBS materials. Is this possible without writing a bunch of replacement shaders that have to somehow figure out how each visible shader affects albedo/spec/etc.?
This only ever has to work in the editor, so C# reflection based methods would work just fine.
I’ve started exploring methods using command buffers, but I haven’t found a way to extract the G-buffers to Texture2Ds (or external render textures) yet.
Awesome. In my testing the G-buffers are only accessible in the Overlay queue or in screen-space effects. Is there a reason they are not available sooner? I would think they would just accumulate as each material is rendered, and that any material could read the values written by previous materials, assuming no batching.
They should be accessible in the transparent queue as well. When rendering opaque objects the gbuffers are the current render targets, and you cannot read from the target texture you’re writing to (* in most graphics APIs), so those global textures aren’t set until after all opaques (or more specifically all objects that render to the gbuffer) have been rendered.
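For what it’s worth, here’s a minimal editor-only sketch of one way to do the extraction, based on the timing above: a command buffer at CameraEvent.AfterGBuffer copies the built-in G-buffers into render textures you own, and a context-menu action reads them back and writes PNGs. The component name, output paths and the LDR copy format are placeholder choices; G-buffer 3 (emission/ambient) is HDR when the camera has HDR enabled, so you’d probably want a half-float target and EXR for that one.

```csharp
using System.IO;
using UnityEngine;
using UnityEngine.Rendering;

// Editor-only sketch for the built-in deferred path. Copies the four
// G-buffer targets into our own render textures right after the G-buffer
// pass, then reads them back on demand and writes PNGs into the project.
[RequireComponent(typeof(Camera))]
public class GBufferDump : MonoBehaviour
{
    // RT0 albedo/occlusion, RT1 spec/smoothness, RT2 world normals, RT3 emission.
    static readonly BuiltinRenderTextureType[] sources =
    {
        BuiltinRenderTextureType.GBuffer0,
        BuiltinRenderTextureType.GBuffer1,
        BuiltinRenderTextureType.GBuffer2,
        BuiltinRenderTextureType.GBuffer3,
    };

    CommandBuffer cb;
    RenderTexture[] copies;

    void OnEnable()
    {
        var cam = GetComponent<Camera>();
        cb = new CommandBuffer { name = "Copy GBuffers" };
        copies = new RenderTexture[sources.Length];
        for (int i = 0; i < sources.Length; i++)
        {
            copies[i] = new RenderTexture(cam.pixelWidth, cam.pixelHeight, 0,
                RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear);
            cb.Blit(sources[i], copies[i]);
        }
        // By this point the G-buffer pass has finished, so the targets can be copied.
        cam.AddCommandBuffer(CameraEvent.AfterGBuffer, cb);
    }

    [ContextMenu("Save G-Buffers To PNG")]
    void SaveToDisk()
    {
        for (int i = 0; i < copies.Length; i++)
        {
            var prev = RenderTexture.active;
            RenderTexture.active = copies[i];
            var tex = new Texture2D(copies[i].width, copies[i].height,
                TextureFormat.RGBA32, false, true);
            tex.ReadPixels(new Rect(0, 0, copies[i].width, copies[i].height), 0, 0);
            tex.Apply();
            RenderTexture.active = prev;
            File.WriteAllBytes("Assets/GBuffer" + i + ".png", tex.EncodeToPNG());
            DestroyImmediate(tex);
        }
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterGBuffer, cb);
        cb.Release();
        foreach (var rt in copies) rt.Release();
    }
}
```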
Ok, good to know. I’ve always found it frustrating not being able to read from the framebuffer, or at the very least to plug in a custom blend function, but I guess that’s just how GPUs work.
Actually, you can do a custom blend function: Unity 5.3 introduced a surface shader modifier called finalgbuffer that lets you declare a function controlling how your outputs are written to the four deferred render targets. It works similarly to the vertex and finalcolor modifiers - you append the name of your function to the #pragma surface line, then write a function with a special signature. Which is great, because retrieving data from those render targets through samplers doesn’t work for custom blending in surface shaders (you can’t reliably sample and write to the same texture at the same time).

Surprisingly, I don’t know of a single example of its use in Unity’s stock shaders or in community shaders, but I was able to use it to make a deferred decal shader. There are some quirks that make blending inconvenient: for example, Unity packs smoothness into the alpha channel of the specular target, which makes it impossible to blend specular properly without nuking smoothness to a value of 1 - I used a separate alpha blend mode and a secondary pass to get around that. (Another example is emission containing the ambient response, which depends on albedo - problematic if you want an albedo-only or normal-only override.) But overall it works great.
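For anyone who lands here looking for the syntax: below is a bare-bones sketch of the modifier in a Standard surface shader. The shader and function names are placeholders, and the body just clamps smoothness to illustrate the hook and the packing mentioned above - it is not the decal shader described in this post.

```shaderlab
// Minimal sketch of the finalgbuffer surface shader modifier (built-in
// deferred path). Only the #pragma line and the function signature matter;
// everything else is placeholder.
Shader "Example/FinalGBufferSketch"
{
    Properties
    {
        _MainTex ("Albedo", 2D) = "white" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }

        CGPROGRAM
        // finalgbuffer:<function> hooks a function that can rewrite what is
        // written to the four deferred targets, analogous to finalcolor in
        // the forward path.
        #pragma surface surf Standard finalgbuffer:MyGBuffer exclude_path:forward
        #pragma target 3.0

        sampler2D _MainTex;

        struct Input
        {
            float2 uv_MainTex;
        };

        void surf (Input IN, inout SurfaceOutputStandard o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
            o.Smoothness = 0.9;
            o.Alpha = 1;
        }

        // diffuse        = RT0 (albedo.rgb, occlusion.a)
        // specSmoothness = RT1 (specular.rgb, smoothness.a)
        // normal         = RT2 (world-space normal)
        // emission       = RT3 (emission + ambient)
        void MyGBuffer (Input IN, SurfaceOutputStandard o,
            inout half4 diffuse, inout half4 specSmoothness,
            inout half4 normal, inout half4 emission)
        {
            // Example tweak: cap smoothness as it goes into the G-buffer.
            // Note it lives in RT1's alpha channel, which is exactly why
            // naive hardware blending of the specular target is a problem.
            specSmoothness.a = min(specSmoothness.a, 0.5);
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```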
How can I access the G-buffer textures inside Shader Graph? When I reference _CameraGBufferTexture2 I only get a black texture.
I am using HDRP, with the Lit shader set to deferred only.
The above thread is only relevant to the built-in deferred rendering path, not to any SRP.
If you’re using HDRP then the textures are called _GBufferTexture0, _GBufferTexture1, etc. But I don’t know if it’s possible to access them from Shader Graph. It’s certainly not possible if you’re trying to access those textures from an opaque shader. You can try creating an unexposed texture property with its reference name set to _GBufferTexture1 and sampling that.