URP + ShaderGraph final color node?

I was thinking of putting together an asset for a custom fog gradient. It’s something I’m using in a personal project already, in my custom URP shader setup. But I figured that for a distributed version people would appreciate having ShaderGraph support too, so I’ve been digging into it.

What I need for this to happen is something like this:

  • output → my custom ‘final color modifier’ code → final color

Is this feature available and I’m just missing it? Or did the ShaderGraph team drop the ball here? It seems like such an obvious and easy-to-add feature.

3 Likes

You can always write a custom pass (a ScriptableRenderPass) and queue it after the post-process event. I doubt they are going to add this functionality to the shader, as it doesn’t follow the graphics pipeline…

You misunderstand what I’m asking for. This method would avoid a post-process pass. It’s “post-processing” the material itself, not the kind of post-processing you’re referring to, where you draw a quad over the screen with a post-process shader. This requires no pipeline reworking. It is literally just: “OK, after the lighting pass, here’s another node, do what you want with the final color.”

2 Likes

For an example of what I mean, read the docs on surface shaders’ “Final Color Modifier”.
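For anyone who hasn’t used it, the built-in pipeline’s surface shaders exposed this via a `finalcolor` modifier registered on the `#pragma surface` line. A minimal sketch of that documented pattern (`_ColorTint` is just an illustrative property):

```hlsl
Shader "Custom/TintedFinalColor"
{
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        // finalcolor:mycolor registers a function that receives the shader's
        // final computed color, after lighting has been applied
        #pragma surface surf Lambert finalcolor:mycolor

        struct Input { float2 uv_MainTex; };
        sampler2D _MainTex;
        fixed4 _ColorTint; // illustrative tint property

        void mycolor(Input IN, SurfaceOutput o, inout fixed4 color)
        {
            color *= _ColorTint; // arbitrary final-color modification
        }

        void surf(Input IN, inout SurfaceOutput o)
        {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```

That hook is exactly the “output → custom final color modifier → final color” step being asked for here.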

You would need to rebuild the shading yourself; I’m not sure this is possible with SG.
It is possible with Amplify though (which I would recommend over the sad joke that SG is, any day).

I know it’s not possible right now, I’m requesting that it is supported as soon as possible. It seems like a very trivial thing to add, and it was a feature of the builtin pipeline’s surface shaders.

1 Like

@ThomasVFX can you chime in?

This would be awesome, but given that basic features haven’t been added to SG in years, I doubt it will happen.

Amplify has the template system, so you can create a new port for the final color and choose how to blend it with your own additional post color.
I never tried it, but I’d say it’s possible and should be relatively easy to add.

Was this ever added? (Can’t find it, so I guess not?) Seems like a silly omission, and not having it makes so many things really hard, if not impossible, to do. But I guess that just speaks to my pet peeve that “visual programming” sucks, because it only ever gets you halfway there :)

lol imagine unity giving us meaningful updates on literally anything

I second this. I’ve been asking for this for years. As-is, if you need to make final alterations to the color of the final output, you can only use an unlit graph. So to do this with lighting, you’d have to replicate all the lighting yourself in the graph. This is an important feature to enable things like custom fogging that attenuates everything, including emission, etc.
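To illustrate what’s blocked: a depth-based gradient fog that also attenuates emission is only a few lines if you can touch the final lit color. A sketch, assuming hypothetical `_FogGradient`, `_FogStart`, and `_FogEnd` properties in a URP shader:

```hlsl
// Sketch of a final-color fog that dims everything, emission included.
// _FogGradient / _FogStart / _FogEnd are hypothetical material properties.
half4 ApplyGradientFog(half4 finalColor, float3 positionWS)
{
    float dist = distance(_WorldSpaceCameraPos, positionWS);
    half t = saturate((dist - _FogStart) / (_FogEnd - _FogStart));
    half3 fogColor = SAMPLE_TEXTURE2D(_FogGradient, sampler_FogGradient,
                                      float2(t, 0.5)).rgb;
    finalColor.rgb = lerp(finalColor.rgb, fogColor, t); // attenuates emission too
    return finalColor;
}
```

Without access to the final color, this has to run as a separate full-screen pass or be rebuilt on top of hand-written lighting.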

You can define a custom subtarget for shadergraph and then freely modify includes and passes in the generated shader file.
https://discussions.unity.com/t/920123

(It would be dramatically less hacky if some of the methods were not internal)

I hadn’t seen that, and still can’t use it, as I currently have to be on 2021 (URP/ShaderGraph 12.1.12). In what version is that available?

But yeah, it would be WAY simpler if there were just a “final color” (and “final Z”?) node. We simply don’t have full control over our shaders until we have this.

Another nice solution would be to just have an in/out PBR node, so we could use ‘unlit’ nodes more easily.

3 Likes

It isn’t really ‘available’, in that you need to embed the package and expose the internals, but you could do that in 2021 also.

The ‘final color’ would just be an unlit shader. I agree with the above that a PBR node would be lovely, but I think it’s never going to happen.

Yes, basically what I’m describing is having an intermediary lighting/PBR node with a final output that feeds into an actual terminus that just accepts the final color, exactly like an unlit shader graph does. I don’t really understand why the only approach taken is for all the lighting to be done at the very end of the graph, so that you can’t do anything with the final output.

In the meantime though, if your modification is minor -

The fragment shader for the pbr pass is defined here and inserted into the generated shader through an include added by the Lit SubTarget.

You could make your own SubTarget and change the pragma to use a different name for the fragment shader, so you can use your own custom function instead. In your function you could wrap the call to frag and intercept the output. It’s far from your ideal solution, but it might be useful.
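A sketch of that wrapping idea, assuming your custom SubTarget has re-pointed the `#pragma fragment` at `CustomFrag` (a hypothetical name), and that the pass include path matches your URP version:

```hlsl
// The SG-generated pass body, which defines the stock frag() entry point.
// Path as it appears in URP 12; verify against your package version.
#include "Packages/com.unity.render-pipelines.universal/Editor/ShaderGraph/Includes/PBRForwardPass.hlsl"

// Hypothetical replacement entry point: run the original fragment shader,
// then intercept and modify its output before it hits the render target.
half4 CustomFrag(PackedVaryings packedInput) : SV_TARGET
{
    half4 color = frag(packedInput);  // full SG-generated PBR shading
    color.rgb = saturate(color.rgb);  // placeholder final-color modification
    return color;
}
```

The interesting part is that `frag` returns the fully lit color, so anything you do here behaves like the old surface shader `finalcolor` modifier.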

1 Like

You could use a Custom Function node and just call UniversalFragmentPBR (the function is called something like that) directly, get the output color, and do whatever you want with it afterwards in an unlit graph. Until Unity adds custom lighting nodes, which is on the roadmap (and likely coming with SG2), we have to use a bit of shader code for the task, unfortunately.

1 Like

I didn’t know about this. I had searched for something like this, but for some silly reason “PBR” wasn’t on the list of key search terms I thought of looking for, so I never found it. I haven’t tried messing with it yet, but this looks to be exactly what I was looking for–a node to handle the lighting calculations, but that outputs the results for further processing rather than being the terminal node.

So I tried a custom function node with the following:

```hlsl
void PBR_float(float3 positionWS, half3 normalWS, half3 normalTS, half3 viewDirectionWS,
    half3 bakedGI, half3 albedo, half metallic, half3 specular, half smoothness,
    half occlusion, half3 emission, half alpha, half clearCoatMask,
    half clearCoatSmoothness, out float3 Color)
{
#if defined(SHADERGRAPH_PREVIEW)
    Color = float3(1, 1, 1);
#else
    InputData inputData;
    inputData.positionWS = positionWS;
    inputData.normalWS = NormalizeNormalPerPixel(normalWS);
    inputData.viewDirectionWS = SafeNormalize(-viewDirectionWS);
    inputData.shadowCoord = half4(0, 0, 0, 0);
    inputData.fogCoord = 0;
    inputData.vertexLighting = half3(0, 0, 0);
    inputData.normalizedScreenSpaceUV = half2(0, 0);
    inputData.shadowMask = half4(0, 0, 0, 0);
    inputData.bakedGI = bakedGI;

    SurfaceData surfData;
    surfData.albedo = albedo;
    surfData.specular = specular;
    surfData.metallic = metallic;
    surfData.smoothness = smoothness;
    surfData.normalTS = normalTS;
    surfData.emission = emission;
    surfData.occlusion = occlusion;
    surfData.alpha = alpha;
    surfData.clearCoatMask = clearCoatMask;
    surfData.clearCoatSmoothness = clearCoatSmoothness;

    Color = UniversalFragmentPBR(inputData, surfData);
#endif
}
```

But I get weird results. The ShaderGraph preview output is all black no matter what (unless I set the emission color to something above black, so it seems emission is the only bit that’s working). And the actual object in the scene that uses it is all black with random flickering white pixels. Super weird.

If I bypass this node and route all my other inputs directly to the output fragment node, it renders correctly (as correctly as you’d expect without lighting information). That is, textured, colored, not all black, etc.

Does anyone have an example lit node they can share so I can compare?