Geometry Shader Triangle Adjacency buffer

I’m working on a shader that adds an extruded shell effect to an object by manipulating the geometry in a second pass, and I’m unsure how to create/populate/access the triangle adjacency buffer. I’m specifically not using vertex += normal * extrusion in the vertex shader because I’d like to use an averaged normal as the extrusion direction, so the shell stays closed across seams.

Deployment is to a controlled environment of Win10-based systems running Titan Xs, so iOS and mobile support isn’t needed.

My shader capabilities and understanding are somewhat limited, so be gentle.

//Note: Other properties, Subshaders, and passes excluded for brevity
Properties{
    [Toggle(APPLY_SHELL)] _ApplyShell("Apply Shell", Float) = 0
    _ShellDarkenColor("Darken Color", Color) = (1,1,1,1)
    _Extrusion("_Extrusion", Range(0,0.2)) = 0.075
}

//Shell Layer 1 (would be great to be able to skip an entire pass based on a toggle)
Pass{
    Name "Darken"
    ZWrite Off
    Blend DstColor Zero
    Cull Off

    CGPROGRAM
    //Can go higher if needed
    #pragma target 4.0
    #include "UnityCG.cginc"

    #pragma fragmentoption ARB_precision_hint_fastest   
    #pragma vertex ShellDarkenVert
    #pragma geometry ShellDarkenGeo
    #pragma fragment ShellDarkenFrag

    //Shader-side declarations for the material properties used below
    float _ApplyShell;
    fixed4 _ShellDarkenColor;
    float _Extrusion;

    struct vertdata{
        float4 vertex : POSITION;
        float2 uv : TEXCOORD0;
        float3 normal : NORMAL;
    };

    //Vert to Geo
    struct v2g{
        float4 vertex : SV_POSITION;
        float2 uv : TEXCOORD0;
        float3 normal : NORMAL;
    };

    //Geo to Frag
    struct g2f{
        float2 uv : TEXCOORD0;
        float4 vertex : SV_POSITION;
    };

    v2g ShellDarkenVert(vertdata v) {
        v2g o;
        o.vertex = v.vertex; //Pass through in object space; projected in the geometry shader
        o.uv = v.uv;
        o.normal = v.normal;
        return o;
    }
               
    /* Unsure how to go about setting this up for triangle adjacency. */
    [maxvertexcount(3)]
    void ShellDarkenGeo(triangle v2g IN[3], inout TriangleStream<g2f> triStream){
        g2f o;
                   
        for(int i = 0; i < 3; i++){
            o.vertex = UnityObjectToClipPos(IN[i].vertex + float4(IN[i].normal, 0.0) * _Extrusion);
            o.uv = IN[i].uv;
            triStream.Append(o);
        }
         
        triStream.RestartStrip();
    }
               
    fixed4 ShellDarkenFrag(g2f i) : SV_Target{
        if(_ApplyShell > 0){
            return _ShellDarkenColor; //Color blending excluded for brevity
        }else{
            return float4(1,1,1,0);
        }
    }
    ENDCG
}

You can’t. Adjacency data is not supported by Unity.

More to the point, it isn’t supported by any major game engine. Adjacency data only really exists in academic papers and maybe a few hobby or custom engines. If you come across a technique that uses a geometry shader with adjacency data, ignore it and look for another solution. Most academic papers that use adjacency data don’t post any useful performance metrics. The few that do, if you actually look at them, often show tens or even hundreds of milliseconds per frame to render a single mesh that would otherwise take a fraction of a millisecond. And some of those papers are reporting times on high end GPUs, similar in performance to your Titan X.

I know these kinds of techniques are really tempting, but they’re ultimately a waste of time to pursue.

The “real” way to do this is with compute shaders.

Well, that sucks. I’m not familiar enough with compute shader implementation to even take a guess.

An alternate approach I’ve been mulling over is to Blit together three rendered layers: a background silhouette, the standard shader layer, and an overlay the same size as the silhouette that blends in color based on viewing angle and surface normal. That would require integrating a second camera and using culling masks to create the silhouettes, though. (It also comes with a whole bunch of single-pass instanced VR support issues.)

Do you know of an alternate approach that can be used to extrude the faces of the mesh while keeping it watertight?

Pre-compute an averaged normal and store it in either the color channel (if you’re not using it) or UV4 of the mesh. (Unity uses UV2/UV3 for lightmapping; if lightmapping is fully disabled you can store it in one of those instead.)
Then you can access this as your “normal” direction in the shader instead of using the actual .normal.
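
As a minimal editor-side sketch (the class name, exact-position grouping, and color-channel packing are my own assumptions, not any particular tool’s code):

using System.Collections.Generic;
using UnityEngine;

public static class SmoothedNormals{
    //Averages the normals of every vertex that shares a position and
    //packs the result into the mesh's color channel.
    public static void Bake(Mesh mesh){
        Vector3[] vertices = mesh.vertices;
        Vector3[] normals = mesh.normals;

        //Group vertex indices by position so split vertices along hard
        //edges land in the same bucket (exact match; quantize the key if
        //your mesh has near-coincident rather than identical vertices).
        var groups = new Dictionary<Vector3, List<int>>();
        for(int i = 0; i < vertices.Length; i++){
            if(!groups.TryGetValue(vertices[i], out var list))
                groups[vertices[i]] = list = new List<int>();
            list.Add(i);
        }

        var colors = new Color[vertices.Length];
        foreach(var group in groups.Values){
            Vector3 avg = Vector3.zero;
            foreach(int i in group) avg += normals[i];
            avg.Normalize();

            //Remap [-1,1] to [0,1] so the direction survives the color channel.
            Color packed = new Color(avg.x * 0.5f + 0.5f, avg.y * 0.5f + 0.5f, avg.z * 0.5f + 0.5f, 1f);
            foreach(int i in group) colors[i] = packed;
        }
        mesh.colors = colors;
    }
}

On the shader side you’d add a color (or UV4) field to your vertex input struct and unpack it with color.rgb * 2.0 - 1.0 before extruding along it.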

You can make an AssetPostprocessor script to have this automatically happen to models on import (can constrain it to a specific folder or models with a given label on them).
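
For example (a hedged sketch; the folder path and the reuse of the Bake method above are placeholders, and the script needs to live in an Editor folder):

using UnityEditor;
using UnityEngine;

//Runs automatically on model import; constrained by path here, but you
//could check asset labels instead for label-based filtering.
public class SmoothedNormalPostprocessor : AssetPostprocessor{
    void OnPostprocessModel(GameObject root){
        if(!assetPath.StartsWith("Assets/Models/Shell/"))
            return;

        foreach(var filter in root.GetComponentsInChildren<MeshFilter>()){
            if(filter.sharedMesh != null)
                SmoothedNormals.Bake(filter.sharedMesh);
        }
    }
}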

Or you can buy Toony Colors Pro 2 which has a tool to do just that.

Oh, one thing of note: adjacency data wouldn’t have fixed the problem you’re trying to fix anyway. Adjacency data for a triangle gives you the surrounding triangles that share that triangle’s vertices. The whole problem with hard edged normals is that there’s a seam; the nearby triangles do not share the same vertices across the hard edge. So even if adjacency data were available, it wouldn’t span the edges you actually care about.

The way tools like Toony Colors Pro 2 work is they compare all of the vertices against each other to find ones that are very close but not connected. Some even go as far as to check vertices against triangle edges. This is an expensive, time consuming process, and one you don’t want to do more than once.
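
If you do roll your own, a common shortcut over the brute-force pairwise comparison is to quantize positions into cells and hash them, so nearly-coincident vertices share a key. A sketch, with a placeholder tolerance:

using UnityEngine;

public static class VertexWeld{
    //Quantize a position so vertices within roughly `tolerance` of each
    //other map to the same dictionary key. The default is a placeholder;
    //tune it to your mesh scale.
    public static Vector3Int QuantizeKey(Vector3 p, float tolerance = 0.0001f){
        return new Vector3Int(
            Mathf.RoundToInt(p.x / tolerance),
            Mathf.RoundToInt(p.y / tolerance),
            Mathf.RoundToInt(p.z / tolerance));
    }
}

Swap this in for the exact-position key in the earlier Bake sketch and vertices within the tolerance get averaged together. (Points straddling a cell boundary can still split; checking the neighboring cells as well fixes that.)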