"Hole" shader in URP.

I’d like to make a “hole” shader in URP.

In the built-in render pipeline it can be done this way:

Shader "MaskShader"
{
    Properties{
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" "Queue"="Geometry-1"}
        LOD 100

        ColorMask 0 // disable all color channel writes; depth is still written

        Pass{
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag

            #include "UnityCG.cginc"

            struct appdata{
                float4 vertex : POSITION;
            };

            struct v2f{
                float4 vertex : SV_POSITION;
            };

            v2f vert (appdata v){
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target{
                fixed4 col = 1.0;
                return col;
            }
            ENDCG
        }
    }
}

Setting ColorMask to 0 disables color writes, so the shader writes depth only; and because its queue is Geometry-1, it is rendered before any other geometry but after the skybox.

So, placing a box with this shader punches a hole in the scene and creates a “portal” through which you can see the skybox.

The important part is that it also works the same way in AR.

In URP, however, if I try to use Shader Graph, I don’t have access to the color mask.
And if I don’t use Shader Graph, I get a “hall of mirrors” effect: the “mask” object does punch a hole in the scene, but the screen area behind it is never initialized and displays the previous frame.

Can I do something about that? How can this be converted to URP, and can it be converted to URP in the first place?

@bgolus Would something like this be of interest to you?

I would also love to know the answer

The behavior you described for URP is what it looks like for the built-in rendering path too… in the game view. The skybox only renders first in the scene view, so this was never a viable way to achieve this effect.

Stencils could work, but those also aren’t exposed to Shader Graph.
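For reference, stencils are still available in URP if you hand-write the shader, since ShaderLab’s Stencil block works outside Shader Graph. A minimal sketch of a depth-only mask that also marks the stencil buffer (the shader name, Ref value, and queue are my own choices, not from this thread):

```ShaderLab
Shader "Custom/StencilHoleMask"
{
    SubShader
    {
        Tags { "RenderType"="Opaque" "Queue"="Geometry-1" "RenderPipeline"="UniversalPipeline" }

        Pass
        {
            ColorMask 0 // write no color
            ZWrite On   // still write depth, like the original mask

            // Mark every covered pixel with stencil value 1
            Stencil
            {
                Ref 1
                Comp Always
                Pass Replace
            }

            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

            struct Attributes { float4 positionOS : POSITION; };
            struct Varyings  { float4 positionCS : SV_POSITION; };

            Varyings vert (Attributes v)
            {
                Varyings o;
                o.positionCS = TransformObjectToHClip(v.positionOS.xyz);
                return o;
            }

            half4 frag (Varyings i) : SV_Target
            {
                return 0; // color is discarded by ColorMask 0 anyway
            }
            ENDHLSL
        }
    }
}
```

On its own this only marks pixels; you would still need a companion shader that tests the stencil (e.g. Comp NotEqual 1 on the geometry you want clipped, or a pass that draws the background where the stencil equals 1).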

The way I do this kind of thing is I make a shader that displays the skybox on its surface. Easy if the skybox is a cube map.
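A minimal sketch of that idea as a hand-written URP shader (everything here, including the _Cubemap property name, is my own assumption rather than an actual shader from this thread): it samples a cubemap along the per-pixel view direction, so any mesh using it looks like a window onto the skybox.

```ShaderLab
Shader "Custom/FakeSkyboxSurface"
{
    Properties
    {
        _Cubemap ("Skybox Cubemap", CUBE) = "" {}
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" "Queue"="Geometry" "RenderPipeline"="UniversalPipeline" }

        Pass
        {
            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

            TEXTURECUBE(_Cubemap);
            SAMPLER(sampler_Cubemap);

            struct Attributes
            {
                float4 positionOS : POSITION;
            };

            struct Varyings
            {
                float4 positionCS : SV_POSITION;
                float3 viewDirWS  : TEXCOORD0;
            };

            Varyings vert (Attributes v)
            {
                Varyings o;
                float3 positionWS = TransformObjectToWorld(v.positionOS.xyz);
                o.positionCS = TransformWorldToHClip(positionWS);
                // Direction from the camera to the surface, used as the cubemap lookup
                o.viewDirWS = positionWS - _WorldSpaceCameraPos;
                return o;
            }

            half4 frag (Varyings i) : SV_Target
            {
                return SAMPLE_TEXTURECUBE(_Cubemap, sampler_Cubemap, normalize(i.viewDirWS));
            }
            ENDHLSL
        }
    }
}
```

Assign the same cubemap the skybox material uses and the object visually disappears into the sky, even though it still writes depth like normal opaque geometry.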

Alternatively you can use camera stacking to force the skybox to render first and then your existing shader will work.


I actually used this in practice and it works, because the skybox is placed in an earlier queue. The skybox queue is “Background”; if the camera clears with the skybox, it works in built-in because the game draws the camera background first. It doesn’t work in the scene view, though.

Obviously it is not going to work in a situation where the skybox is actually a 3D object.

I need this to work in AR as well, and somehow I couldn’t find a “passthrough material” in AR Foundation, although common sense says it definitely should be a thing.

This one should work, thanks.

This was true in Unity 4.0 and before. It has not been true from Unity 5.0 on, and it isn’t true for either of the SRPs. The skybox renders after opaques in BIRP, URP, and HDRP.

BIRP scene view, works as expected. Because the skybox renders first.

BIRP Game view using camera set to clear to skybox, hall of mirrors. Because the skybox renders after opaques.

The URP and HDRP have the same behavior, because they both render the skybox after opaques at the far clip plane.

Using camera stacking works because you can force the skybox to render first. Alternatively you can render the skybox manually by having it on an object in the world and setting that object’s material’s queue to Background. Which is what I usually suggest for anyone doing AR stuff.
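To sketch the queue change for the manual-skybox approach (a hypothetical ShaderLab fragment, not code from this thread): moving the world-space skybox object’s shader to the Background queue makes it draw before all opaques, so depth-only “hole” objects in front of it reveal it.

```ShaderLab
// Hypothetical fragment: in whatever shader the world-space skybox object
// uses, move the SubShader to the Background queue so it renders before
// all opaque geometry.
SubShader
{
    Tags { "RenderType"="Background" "Queue"="Background" "RenderPipeline"="UniversalPipeline" }
    // ...passes unchanged...
}
```

The same effect can also be had without editing the shader by raising or lowering the Render Queue value on the material itself in the inspector.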

But again, if your skybox is just a cubemap, or in the case of AR a screen space projection, it’s much easier to just have a shader that renders that same texture, which you can put on arbitrary geometry to fake punch-through effects like this.


Well, all I can say is that this is odd, as I’d been using this sort of shader in built-in in 2020 and it just worked. Also, it worked in the Game view, but not in the Scene view. Which is the opposite of your example.

I’m very curious as to how. The skybox behavior between the scene view and game view isn’t something you can override from the “user” side; it’s hardcoded on the C++ side of the engine. And otherwise it’s generally quite difficult to get the scene view and game view to behave differently when it comes to render order. This makes me think you were using this on a project that was overriding the skybox with custom rendering code, likely by disabling the skybox entirely and rendering it manually.

No, it was a normal project… it was an AR project, but this thing reliably worked in both the AR view and the scene view.

I’ll dig it up and try to see what’s going on.

Alright.

I’ve tested it, and looks like AR Foundation got me fooled.

The AR Foundation camera uses the “Color” clear flag, and with a “Color” clear the box works normally, meaning it displays the background color. Then the camera script performs some magic to display the background.

Looking through its scripts, it appears to hijack background rendering via GL.IssuePluginEvent.
https://docs.unity3d.com/ScriptReference/GL.IssuePluginEvent.html

So, like you said. Overridden rendering process.

And without that, with just a “Skybox” clear, it’s hall of mirrors.


With all that in mind, it looks like the only way to punch a hole without stencils would be camera stacking.

Here’s how to do it in Shader Graph (use an Ellipse node for the shape if you want a circular hole):