URP Shader missing in WebGL build, works fine for PC build

Hello,
This is what the project I built for PC looks like, with the shader working:

But for WebGL the shader does not load:

And the GameObjects that have the shader applied to them also tend to disappear. You can see this when running the game here: Monkey Do by Kodiak

I am using Unity version 2021.3.23f1.
I have added the shader to the “Always Included Shaders” list in the Graphics settings.

Any help is appreciated, thank you.

Hi, is this reproducible when running the editor with -force-glcore?
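For reference, something like this from the command line (the editor and project paths here are placeholders; substitute your own):

"C:\Program Files\Unity\Hub\Editor\2021.3.23f1\Editor\Unity.exe" -projectPath "C:\Path\To\YourProject" -force-glcore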

The editor appears to have the shader but the game window does not:

Maybe open the Frame Debugger to check what stopped the outline from being rendered?

Here’s what the frame debugger tells me:
Working shader (without OpenGL):


Not working shader:

Sorry, I’m not sure what the cause could be, but you could compare it (and the shader properties page) against the DirectX results to find the possible reason.

And here is the code for the shader:

Shader "Unlit/SobelFilter"
{
    Properties
    {
        [HideInInspector]_MainTex ("Base (RGB)", 2D) = "white" {}
        _PixelDensity ("Pixel Density", float) = 10
        _Power ("Power", float) = 50
        _PosterizationCount ("Count", int) = 8
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200
       
        Pass
        {

            Blend SrcAlpha OneMinusSrcAlpha
           
            ZWrite On
            ZTest Always

            HLSLPROGRAM
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/SurfaceInput.hlsl"
            #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Color.hlsl"
           
            TEXTURE2D(_DepthTex);
            SAMPLER(sampler_DepthTex);

            TEXTURE2D(_CameraDepthTexture);
            SAMPLER(sampler_CameraDepthTexture);

            TEXTURE2D(_CameraColorTexture);
            SAMPLER(sampler_CameraColorTexture);
           
            TEXTURE2D(_MainTex);
            SAMPLER(sampler_MainTex);

            float _PixelDensity;
            int _PosterizationCount;
            float _Power;
           
            struct Attributes
            {
                float4 positionOS       : POSITION;
                float2 uv               : TEXCOORD0;
            };

            struct Varyings
            {
                float2 uv        : TEXCOORD0;
                float4 vertex : SV_POSITION;
                UNITY_VERTEX_OUTPUT_STEREO
            };
           
            float SampleDepth(float2 uv)
            {
                return SAMPLE_DEPTH_TEXTURE(_DepthTex, sampler_DepthTex, uv);
            }
           
            float2 sobel (float2 uv)
            {
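                // Compare the centre depth with its four neighbours:
                // .x = sum of positive depth differences (edge strength), .y = maximum neighbouring depth.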
                float2 delta = float2(_PixelDensity / _ScreenParams.x, _PixelDensity / _ScreenParams.y);

                float up = SampleDepth(uv + float2(0.0, 1.0) * delta);
                float down = SampleDepth(uv + float2(0.0, -1.0) * delta);
                float left = SampleDepth(uv + float2(1.0, 0.0) * delta);
                float right = SampleDepth(uv + float2(-1.0, 0.0) * delta);
                float centre = SampleDepth(uv);

                float depth = max(max(up, down), max(left, right));
                return float2(clamp(up - centre, 0, 1) + clamp(down - centre, 0, 1) + clamp(left - centre, 0, 1) + clamp(right - centre, 0, 1), depth);
            }
           
            Varyings vert(Attributes input)
            {
                Varyings output = (Varyings)0;
                UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(output);

                VertexPositionInputs vertexInput = GetVertexPositionInputs(input.positionOS.xyz);
                output.vertex = vertexInput.positionCS;
                output.uv = input.uv;
               
                return output;
            }
           
            half4 frag (Varyings input, out float depth : SV_Depth) : SV_Target
            {
                UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(input);
               
                float2 sobelData = sobel(input.uv);
                float s = pow(abs(1 - saturate(sobelData.x)), _Power);
                half4 col = SAMPLE_TEXTURE2D(_MainTex, sampler_MainTex, input.uv);
                half4 d = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, input.uv);
                float x = ceil(SampleDepth(input.uv) - d.x);

                //col.a = x;
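                // Rough linear-to-gamma conversion, quantise the HSV value channel, then convert back to linear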
                col = pow(abs(col), 0.4545);
                float3 c = RgbToHsv(col);
                c.z = round(c.z * _PosterizationCount) / _PosterizationCount;
                col = float4(HsvToRgb(c), col.a);
                col = pow(abs(col), 2.2);


               
                s = floor(s+0.2);
               
                s = lerp(1.0, s, ceil(sobelData.y - d.x));
                depth = lerp(sobelData.y, SampleDepth(input.uv), s);
                col.rgb *= s;
                col.a += 1 - s;

               
                return col;
            }
           
            #pragma vertex vert
            #pragma fragment frag
           
            ENDHLSL
        }
       
    }
    FallBack "Diffuse"
}

Thank you for your help. I will try that tomorrow if no other solutions come up while I get some rest.

I guess the _DepthTex is missing in OpenGL? The name doesn’t sound like URP’s depth texture (_CameraDepthTexture), but you can try ticking “Depth Texture” on the URP asset and see if it works.
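If you do switch over to URP’s own depth texture, here is a minimal sketch of how it is usually sampled (assuming “Depth Texture” is ticked on the URP asset; the include also declares _CameraDepthTexture, so the manual texture declarations in the shader above would be removed):

#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/DeclareDepthTexture.hlsl"

float SampleDepth(float2 uv)
{
    // SampleSceneDepth reads URP's _CameraDepthTexture with the correct sampler for the platform
    return SampleSceneDepth(uv);
}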


So I’ve found out that it is just the outline part of the shader that is not working in WebGL/OpenGL. But I’m not sure exactly where in the code the issue is…

That kind of worked! Any idea why it only outlines the silhouette of all the objects together instead of outlining them individually like in the non-OpenGL version?

I think that’s because the outline shader uses both depth and color to find the edges, and (maybe) the color is broken for some reason on OpenGL.

Correction: it looks like it only uses depth. Areas with a large depth difference (background vs. object) work fine, but areas with small differences are broken.

Is there any way to fix it, or to check the depth difference between the two objects?

I’m not sure if this is the cause, but if it is, you can remap the depth value after the depth texture is sampled.

The depth texture on DirectX-like platforms is reversed (1 at the near plane, 0 at the far plane), but on OpenGL it goes from 0 at the near plane to 1 at the far plane.

float depth = SampleATexture(...);
#if !UNITY_REVERSED_Z
    depth = 1.0 - depth;
#endif
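For example, with this remap a far-background depth of 0.97 sampled on OpenGL becomes 0.03, which matches the reversed-Z convention (1 at the near plane, 0 at the far plane) that the rest of the shader was presumably written against.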

I’m sorry, I’m not sure where to put that code.

Not sure if it works, but for example:

float SampleDepth(float2 uv)
{
    float depth = SAMPLE_DEPTH_TEXTURE(_DepthTex, sampler_DepthTex, uv);
#if !UNITY_REVERSED_Z
    depth = 1.0 - depth;
#endif
    return depth;
}
half4 d = SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, input.uv);

// Add this
#if !UNITY_REVERSED_Z
    d.x = 1.0 - d.x;
#endif

float x = ceil(SampleDepth(input.uv) - d.x);

Thank you so much!!
