Does Unity support triangleadj in geometry shaders?

Yup; that’s the problem :-). That’s why I think I need a spatial hash or something, so I can detect edges that don’t share actual vertex indices, just positions.
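To sketch what I mean by the spatial hash (illustrative Python only, not the actual implementation; all names here are made up): quantize each position into a hash key so duplicated vertices collapse to one canonical id, then key edges by pairs of canonical ids and collect the triangles touching each edge.

```python
def canonical_ids(positions, scale=1e5):
    """Map each vertex index to a canonical id shared by all vertices
    at (approximately) the same position. Note: rounding can still split
    two nearly-equal positions that land on opposite sides of a boundary."""
    seen = {}
    canon = []
    for p in positions:
        key = tuple(round(c * scale) for c in p)  # quantized position = hash key
        canon.append(seen.setdefault(key, len(seen)))
    return canon

def edge_adjacency(positions, triangles):
    """For each undirected edge (matched by position, not by index),
    list the triangles that touch it."""
    canon = canonical_ids(positions)
    edges = {}
    for t, (i0, i1, i2) in enumerate(triangles):
        for a, b in ((i0, i1), (i1, i2), (i2, i0)):
            ca, cb = canon[a], canon[b]
            edges.setdefault((min(ca, cb), max(ca, cb)), []).append(t)
    return edges

# Two triangles that share an edge by position only: vertices 3/4 duplicate
# the coordinates of vertices 1/2, so index-based adjacency would miss it.
positions = [(0,0,0), (1,0,0), (0,1,0), (1,0,0), (0,1,0), (1,1,0)]
triangles = [(0,1,2), (3,5,4)]
adj = edge_adjacency(positions, triangles)
shared = [e for e, ts in adj.items() if len(ts) == 2]
print(shared)  # the one position-shared edge
```

Edges with exactly two triangles give you the adjacency pairs you’d feed to something triangleadj-shaped.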

Just to follow up: I am generating edge adjacency in a job here, and the result can be passed as a buffer to a compute shader. It could be adapted to other purposes that need triangleadj, since the output is just the edge indices:

https://gitlab.com/burningmime/burning-rpg-framework/-/blob/master/Assets/src/graphics/ShadowEdgeJob.cs

It seems MUCH slower to fetch vertices and generate geometry in a compute shader than in a geometry shader (at least on my GPU). I think it’s because the input assembler does some sort of magic with the vertex and index buffers feeding the VS/GS, while the CS has to look up indices and then positions from within the shader code itself.

Basically, something like this introduces latency that a geometry shader would not have:

ByteAddressBuffer _vertices;   // raw vertex buffer, position at offset 0 of each vertex
ByteAddressBuffer _indices;    // raw index buffer, 32-bit indices
uniform uint _vertexStride;    // size of one vertex in bytes

[numthreads(64,1,1)]
void someComputeShader(uint3 threadIds : SV_DispatchThreadID)
{
    // one thread per triangle: 3 indices * 4 bytes = 12 bytes per triangle
    uint3 tri = _indices.Load3(threadIds.x * 12);
    float3 a = asfloat(_vertices.Load3(tri.x * _vertexStride));
    float3 b = asfloat(_vertices.Load3(tri.y * _vertexStride));
    float3 c = asfloat(_vertices.Load3(tri.z * _vertexStride));
    // ...
}

A GS with triangle input would get that part for free.