Struggling to draw UVs to screen with test shader in scriptable render feature

EDIT: The issue is semi-resolved and the culprit mostly pinned down, but I still don’t understand why it happens. I’d appreciate it if anybody who knows could explain why it wasn’t working. The details are all in the reply I posted below.

I’ve used the shader graph a bit, but I’m completely new to writing shaders through scripting, so please bear with me if this is an obvious fix or if I’m fundamentally misunderstanding something.

I’ve gotten as far as rendering my effect to the screen properly (I already have a working setup for the render passes and everything); my problem is just getting the screen-space UVs to show up in the shader’s output. This is what it looks like in the frame debugger:


The problem is, it shouldn’t be just blue. Here’s the shader code:

Shader "Hidden/TestSimpleShader"
{
    HLSLINCLUDE
        #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
        #include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
    
        float4 Test (Varyings input) : SV_Target
        {
            return float4(input.texcoord.x, input.texcoord.y, 1, 1);
        }
    
    ENDHLSL
    
    SubShader
    {
        Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
        LOD 100
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            Name "Simple Test"

            HLSLPROGRAM
            
            #pragma vertex Vert
            #pragma fragment Test
            
            ENDHLSL
        }
    }
}

As you can see, I’d expect the red and green channels to contain the screen UVs, but no matter what brightness range I set it to, the channels are completely zeroed.

I’ve recreated this shader in the shader graph to make sure I wasn’t going insane:

And if I plug the shader graph into the shader slot on my render feature, it works exactly as I would expect:

So I really don’t understand what I’m doing wrong in the shader, and since my Google searches and attempts to find documentation kept turning up nothing, I’m a little lost. The shader is mostly a simplified version of the code here.

And to cover all my bases, here’s the render feature/pass code. This isn’t compatibility mode; I’m using the render graph.

using UnityEngine;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

[System.Serializable]
public class TestSimpleShaderRenderFeature : ScriptableRendererFeature
{
    public Shader shader;
    private TestSimpleShaderPass pass;
    public override void Create()
    {
        pass = new(name);
    }
    protected override void Dispose(bool disposing)
    {
        pass.Dispose();
    }
    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        if (renderingData.cameraData.cameraType == CameraType.Preview
            || renderingData.cameraData.cameraType == CameraType.Reflection
            || UniversalRenderer.IsOffscreenDepthTexture(ref renderingData.cameraData))
            return;

        if (shader == null) return;

        pass.renderPassEvent = RenderPassEvent.AfterRenderingTransparents;
        pass.SetupMembers(shader);
        pass.requiresIntermediateTexture = false;

        renderer.EnqueuePass(pass);
    }
    internal class TestSimpleShaderPass : ScriptableRenderPass
    {
        private Material mat;
        public TestSimpleShaderPass(string passName)
        {
            profilingSampler = new ProfilingSampler(passName);
        }
        public void SetupMembers(Shader shader)
        {
            mat = CoreUtils.CreateEngineMaterial(shader);
        }
        public void Dispose()
        {
            CoreUtils.Destroy(mat);
        }
        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            UniversalResourceData resourcesData = frameData.Get<UniversalResourceData>();

            using (var builder = renderGraph.AddRasterRenderPass<PassData>("Test Shader Pass", out var passData, profilingSampler))
            {
                passData.material = mat;

                builder.SetRenderAttachment(resourcesData.activeColorTexture, 0, AccessFlags.Write);
                builder.SetRenderAttachmentDepth(resourcesData.activeDepthTexture, AccessFlags.Write);

                builder.SetRenderFunc((PassData data, RasterGraphContext rgContext) => rgContext.cmd.DrawProcedural(Matrix4x4.identity, data.material, 0, MeshTopology.Triangles, 3, 1));
            }
        }
        private class PassData
        {
            internal Material material;
        }
    }
}

And if it’s at all relevant, this is Unity 6000.7f1 with URP 17.0.3.
Any and all help is appreciated!

Alright, thanks to help on the Unity Discord, I’ve somewhat nailed down what might be causing this.
If I modify my shader so that, instead of using those #includes, I paste in the actual code from the Blit.hlsl file (and remove the DYNAMIC_SCALING_APPLY_SCALEBIAS macro, because I don’t know where it comes from or what it does), it starts working.

Shader "Hidden/TestSimpleShader"
{
    HLSLINCLUDE
        #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Common.hlsl"
        //Attributes, Varyings and Vert were copied from Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl 
        struct Attributes
        {
            uint vertexID : SV_VertexID;
        };

        struct Varyings
        {
            float4 positionCS : SV_POSITION;
            float2 texcoord   : TEXCOORD0;
        };
        
        Varyings Vert(Attributes input)
        {
            Varyings output;
    
            float4 pos = GetFullScreenTriangleVertexPosition(input.vertexID);
            float2 uv  = GetFullScreenTriangleTexCoord(input.vertexID);

            output.positionCS = pos;
            output.texcoord   = uv; //This used to be DYNAMIC_SCALING_APPLY_SCALEBIAS(uv)

            return output;
        }
        float4 Test (Varyings input) : SV_Target
        {
            return float4(input.texcoord.x, input.texcoord.y, 1, 1);
        }
    
    ENDHLSL
    
    SubShader
    {
        Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
        LOD 100
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            Name "Simple Test"

            HLSLPROGRAM
            
            #pragma vertex Vert
            #pragma fragment Test
            
            ENDHLSL
        }
    }
}
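
For reference, the macro I removed seems to come from Packages/com.unity.render-pipelines.core/Runtime/Utilities/DynamicScalingClamping.hlsl. Paraphrasing from memory (so the exact definition may differ between URP versions), I believe it boils down to applying a scale/bias property to the UV, something like:

```hlsl
// Hypothetical paraphrase, not the exact definition: the macro applies the
// _BlitScaleBias material property (declared in Blit.hlsl) to the
// fullscreen-triangle UV. The Blitter API normally sets _BlitScaleBias
// before drawing, typically to (1, 1, 0, 0) for a plain unscaled blit.
#define DYNAMIC_SCALING_APPLY_SCALEBIAS(uv) ((uv) * _BlitScaleBias.xy + _BlitScaleBias.zw)
```

If that’s right, and nothing in my pass ever sets _BlitScaleBias (I’m calling DrawProcedural directly rather than going through Blitter), the property would presumably stay at its default of zero and zero out the UVs, which matches what I was seeing. But I’m not confident that’s the whole story.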

The part I still don’t understand is why the original version wasn’t working for me, since it’s the same method used in the official example supplied in the URP documentation. Does anybody know why it wouldn’t work here?
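
One workaround I suspect (but haven’t tested) would let the original include-based shader work: setting the _BlitScaleBias property on the material myself, since Blit.hlsl’s Vert applies that property to the UVs and I’m not going through the Blitter API that would normally set it. A sketch of what I mean, inside my existing RecordRenderGraph (the SetVector call is the only line I’m adding, and (1, 1, 0, 0) is my assumption for “no scaling, no offset”):

```csharp
// Untested sketch: manually supply the scale/bias that Blit.hlsl's Vert
// applies to the fullscreen-triangle UVs, since DrawProcedural bypasses
// the Blitter API that would normally set it.
passData.material.SetVector("_BlitScaleBias", new Vector4(1f, 1f, 0f, 0f));

builder.SetRenderAttachment(resourcesData.activeColorTexture, 0, AccessFlags.Write);
builder.SetRenderAttachmentDepth(resourcesData.activeDepthTexture, AccessFlags.Write);

builder.SetRenderFunc((PassData data, RasterGraphContext rgContext) =>
    rgContext.cmd.DrawProcedural(Matrix4x4.identity, data.material, 0, MeshTopology.Triangles, 3, 1));
```

If anyone can confirm whether that’s actually what the Blitter helpers do under the hood, I’d appreciate it.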