Problem with Custom Render Texture Shader Graph in Built-In Render Pipeline

I made a Shader Graph to dilate a custom render texture, but when I use it with the Built-in Render Pipeline it no longer works correctly.

Here’s the graph.

This is the code from the Custom Function node.

void UVDilate_float(UnityTexture2D Tex, UnitySamplerState Sampler, float2 TexelSize, float2 UV, float MaxSteps, out float4 Out)
{
    float2 texelsize = TexelSize;
    float2 offsets[8] = { float2(-1, 0), float2(1, 0), float2(0, 1), float2(0, -1), float2(-1, 1), float2(1, 1), float2(1, -1), float2(-1, -1) };

    // Start from the centre sample. (`sample` is a reserved word in HLSL, so avoid it as a variable name.)
    Out = SAMPLE_TEXTURE2D_LOD(Tex, Sampler, UV, 0);
    
    for (int i = 1; i <= MaxSteps; i++)
    {
        for (int j = 0; j < 8; j++)
        {
            float2 curUV = UV + offsets[j] * texelsize * i;
            float4 offsetsample = SAMPLE_TEXTURE2D_LOD(Tex, Sampler, curUV, 0);

            // `!offsetsample.a < 1.0` parses as `(!offsetsample.a) < 1.0` because
            // `!` binds tighter than `<`, so spell out the intended test instead.
            if (offsetsample.a >= 1.0)
            {
                Out = offsetsample;
                return;
            }
        }
    }
}

And this is what I’m doing to invoke the dilation.

/// <summary>
/// Method used to dilate a texture based on its alpha channel.
/// </summary>
/// <param name="tex">The original texture.</param>
/// <param name="steps">Dilation distance in pixels.</param>
/// <returns>A dilated version of the original texture as a CustomRenderTexture.</returns>
public CustomRenderTexture DilateTexture(RenderTexture tex, int steps = 16)
{
    var RT = new CustomRenderTexture(tex.width, tex.height, tex.graphicsFormat);
    RT.material = new Material(dilationShader);
    RT.material.SetTexture("_MainTex", tex);
    RT.material.SetFloat("_MaxSteps", steps);
    RT.updateMode = CustomRenderTextureUpdateMode.OnDemand;
    RT.Update();

    return RT;
}

I’m having trouble getting to the root of the issue, since the texture is clearly being sampled (see replies), just not dilated.

Any help is appreciated.

Expected result in URP.

Observed result in Built-In.

For future reference: after spending a bit more time debugging, I found the error. The problem wasn’t the shader but the image I fed into it. I didn’t notice at first because everything was generated at runtime with the same settings as the URP version, but somehow the image to dilate didn’t contain any alpha values when rendered with the Built-in pipeline.