Help Reconstructing a Normal Map from a Swizzled map using a compute shader

I’m trying to reconstruct a normal map texture from a swizzled normal map. These normal maps are generated at runtime (it’s UMA). I’ve attempted to write a compute shader based on Unity’s “UnpackNormalAGorRG” macro.

#pragma kernel NormalConverter

Texture2D<float4> Input;
RWTexture2D<float3> Result;

[numthreads(1, 1, 1)]
void NormalConverter(uint3 id : SV_DispatchThreadID)
{
    // Load the swizzled texel (X stored in alpha, Y in green)
    float4 packednormal = Input[id.xy];
    float3 normal;
    // This does the trick: recover X from A (R is 1 in AG-packed maps)
    packednormal.x *= packednormal.w;
    normal.xy = packednormal.xy * 2 - 1;
    // Reconstruct Z from the unit-length constraint
    normal.z = sqrt(1 - saturate(dot(normal.xy, normal.xy)));
    // Re-pack into the 0..1 range for storage
    Result[id.xy] = (normal * 0.5) + 0.5;
}
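For sanity-checking the math separately from the shader, here is the same unpack as a CPU-side C# helper (a sketch, not part of my project; `UnpackTexel` is a name I made up). You can run it over a few pixels from `GetPixels()` and compare against the compute shader’s output — note the source texture needs Read/Write enabled for that:

```csharp
using UnityEngine;

public static class NormalUnpacker
{
    // CPU reference for the AG/RG unpack. Input channels are 0..1 floats;
    // X is stored in A (with R = 1 for AG maps), Y is stored in G.
    public static Color UnpackTexel(Color packed)
    {
        float x = (packed.r * packed.a) * 2f - 1f; // packednormal.x *= packednormal.w
        float y = packed.g * 2f - 1f;
        // Reconstruct Z from the unit-length constraint
        float z = Mathf.Sqrt(Mathf.Max(0f, 1f - Mathf.Clamp01(x * x + y * y)));
        // Re-pack into 0..1, as the shader does before writing
        return new Color(x * 0.5f + 0.5f, y * 0.5f + 0.5f, z * 0.5f + 0.5f, 1f);
    }
}
```

For example, a flat-normal texel (R = 1, G = 0.5, A = 0.5) should come back as (0.5, 0.5, 1.0) — the familiar normal-map blue.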

The only problem is… it doesn’t work. It returns a normal map that is one flat shade of blue for everything — which is exactly what you’d expect if Input were reading the same constant value for every texel, so the math may not be the problem. I know the output texture is being written to (I generated a gradient in it just to verify). I’ve also verified the texture going in is correct (I’ve saved it to disk – it even works as a normal map, if you don’t set it to “normal map” in the import settings).

I’m a newbie at compute shaders, so obviously I’ve done something completely wrong.

Here’s how I’m calling it (normalMap is a standard Texture2D that contains the swizzled normal map).

    RenderTexture normalMapRenderTex = new RenderTexture(normalMap.width, normalMap.height, 24);
    normalMapRenderTex.enableRandomWrite = true;
    normalMapRenderTex.Create();
    normalMapConverter.SetTexture(kernel, "Input", normalMap);
    normalMapConverter.SetTexture(kernel, "Result", normalMapRenderTex);
    normalMapConverter.Dispatch(kernel, normalMap.width, normalMap.height, 1);

In short: this works if you pass it a RenderTexture for input instead of a Texture2D. No idea why that is required.
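A minimal sketch of that workaround (assuming the kernel and variable names from above): blit the Texture2D into a temporary RenderTexture first, then bind that as the input. `Graphics.Blit` copies the pixels on the GPU, so the source doesn’t need to be CPU-readable:

```csharp
// Copy the Texture2D into a RenderTexture the compute shader can read.
RenderTexture inputRT = RenderTexture.GetTemporary(
    normalMap.width, normalMap.height, 0, RenderTextureFormat.ARGB32);
Graphics.Blit(normalMap, inputRT);

normalMapConverter.SetTexture(kernel, "Input", inputRT);
normalMapConverter.SetTexture(kernel, "Result", normalMapRenderTex);
normalMapConverter.Dispatch(kernel, normalMap.width, normalMap.height, 1);

// Release the temporary once the dispatch has been issued.
RenderTexture.ReleaseTemporary(inputRT);
```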