Weird shader behavior

Hello community!

I’m stuck on a simple shader I’m working on. The shader is intended to render fragments that are no farther than N pixels from the triangle’s first vertex. I use a geometry shader to project the first vertex into screen space and pass it to the fragment shader. In the fragment shader I calculate the distance between the fragment and the vertex (both in screen space), then draw the fragment in black if it is near the vertex. All other fragments are rendered in red.

It’s pretty obvious that if everything worked correctly, I would see my model with black dots in fixed positions (near the first vertex of each triangle) on the model’s surface. However, what I actually see depends on the view angle. The black dots “rotate” around the first vertices as I move the camera:

[Screenshots: the black dots shift position around the first vertices as the camera angle changes]

Interestingly, the shader produces more believable results in the preview window than in the scene view.

Please help! Why is this happening? I think I’m misunderstanding some important point in all this… The shader code is below:

Shader "Test" 
{
	SubShader 
	{
		Pass 
		{
			Blend SrcAlpha OneMinusSrcAlpha, SrcAlpha DstAlpha
			BlendOp Add
			ZWrite On
			ZTest LEqual

			CGPROGRAM

			#pragma only_renderers d3d11
			#pragma target 4.0

			#include "UnityCG.cginc"

			#pragma vertex vert
			#pragma geometry geom
			#pragma fragment frag

			struct VsInput
			{
				float4 pos: POSITION;
			};

			struct GsInput
			{
				float4 pos: POSITION;
			};

			struct PsInput
			{
				float4 pos: SV_POSITION;
				noperspective float2 firstVertex: TEXCOORD0;
			};

			float2 projectToWindow(in float4 pos)
			{
				return float2(_ScreenParams.x * 0.5 * (1.0 + pos.x / pos.w),
							  _ScreenParams.y * 0.5 * (1.0 - pos.y / pos.w));
			}

			GsInput vert(VsInput input)
			{
				GsInput output;
				output.pos = mul(UNITY_MATRIX_MVP, input.pos);
				return output;
			}

			[maxvertexcount(3)]
			void geom(triangle GsInput input[3], inout TriangleStream<PsInput> outStream)
			{
				PsInput output;
				output.firstVertex = projectToWindow(input[0].pos);

				output.pos = input[0].pos;
				outStream.Append(output);

				output.pos = input[1].pos;
				outStream.Append(output);

				output.pos = input[2].pos;
				outStream.Append(output);
			}

			float4 frag(PsInput input) : SV_Target
			{
				// no farther than 2 pixels from the first vertex
				return float4(length(input.pos.xy - input.firstVertex) < 2 ? 0 : 1, 0, 0, 1);
			}

			ENDCG
		}
	} 
	FallBack "Diffuse"
}

My mind is blown. The shader works fine in a standalone build, but doesn’t work in the editor. Moreover, it behaves differently in the scene view and in game mode:

Standalone build (works as intended):

[Screenshot: standalone build showing black dots fixed in place on the model’s surface]

Still relevant

up

Try Player Settings → Rendering Path: Deferred.

(At least it works better on my PC that way.)

That really helped; the shader now works fine in the editor with the deferred rendering path. But why? Is it a bug in Unity? Or do I not understand what _ScreenParams really means? The problem seems to come from how projectToWindow is implemented.
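For what it’s worth, one plausible explanation (an assumption, not something I can confirm for this thread): when the editor renders through an intermediate render texture, Unity on D3D flips the projection matrix upside down, and the flip sign is exposed as _ProjectionParams.x. A hand-rolled viewport transform like projectToWindow can then disagree with the window coordinates the rasterizer writes to SV_POSITION. A defensive sketch is to run *both* positions (the first vertex and the fragment itself) through the same helper, ComputeScreenPos from UnityCG.cginc, so any flip or convention mismatch cancels out:

```hlsl
// Untested sketch, assuming the mismatch is a platform y-flip between
// the hand-rolled projectToWindow and SV_POSITION. ComputeScreenPos
// (UnityCG.cginc) bakes the flip sign (_ProjectionParams.x) into its
// result, so projecting both positions with it keeps them consistent.

struct PsInput
{
	float4 pos       : SV_POSITION;
	float4 screenPos : TEXCOORD0;                  // this fragment's own screen pos
	noperspective float2 firstVertex : TEXCOORD1;  // first vertex, in pixels
};

[maxvertexcount(3)]
void geom(triangle GsInput input[3], inout TriangleStream<PsInput> outStream)
{
	// Project the first vertex with the same helper the fragments use.
	float4 first = ComputeScreenPos(input[0].pos);
	float2 firstPixels = first.xy / first.w * _ScreenParams.xy;

	PsInput output;
	output.firstVertex = firstPixels;

	[unroll]
	for (int i = 0; i < 3; i++)
	{
		output.pos       = input[i].pos;
		output.screenPos = ComputeScreenPos(input[i].pos);
		outStream.Append(output);
	}
}

float4 frag(PsInput input) : SV_Target
{
	// Perspective divide happens here, after interpolation.
	float2 fragPixels = input.screenPos.xy / input.screenPos.w * _ScreenParams.xy;
	float red = length(fragPixels - input.firstVertex) < 2 ? 0 : 1;
	return float4(red, 0, 0, 1);
}
```

The point of the design is that the comparison no longer mixes two different coordinate conventions (SV_POSITION vs. a manual NDC-to-window transform) — if the projection gets flipped, both sides flip together.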