How to calculate screen space texcoords for surface shader?

I’m trying to apply a texture in screen space in a surface shader for a dither effect. Since the VPOS semantic doesn’t seem to be available to surface shaders, I added a vertex function to my surface shader to calculate the screen coords myself, but I always get perspective distortion on my texcoords no matter how I calculate them. See the attached screenshot: the dither pattern is in perspective when it should be aligned with the window.

I’m calculating the coordinates like so:

	struct Input {
		float2 uv_MainTex;
		float4 screenCoord;
	};

	void vert(inout appdata_full v, out Input o) {
		// Required when writing to a custom Input from a vertex function
		UNITY_INITIALIZE_OUTPUT(Input, o);
		float4 clipPos = UnityObjectToClipPos(v.vertex);
		o.screenCoord = ComputeScreenPos(clipPos);
	}

It also doesn’t matter if I multiply by UNITY_MATRIX_MVP instead of using UnityObjectToClipPos. If I directly use clipPos as the screenCoord, I get a similar result, but with the texture anchored to the center of the window rather than the bottom left. So it seems like I’m not getting the correct transformation when I transform v.vertex to clip space, but why?

In a surface shader, values such as screen coordinates are already provided by the surface shader compiler; you just have to declare them in your Input struct. You don’t even need the vertex function.

In your Input struct, add the following:

float4 screenPos;

That’s it. Keep in mind that screenPos is a homogeneous coordinate, so sampling with its xy components directly produces exactly the perspective distortion you’re seeing; you fix it with a perspective divide, i.e. by dividing by the w component.

For example, here’s a float2 you can use as screen coords:

float2 coords = IN.screenPos.xy / IN.screenPos.w;
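Putting it together, here’s a minimal surface shader sketch for the dither effect. The shader name and the _DitherTex property are illustrative assumptions; screenPos, _ScreenParams, and the auto-declared _TexelSize variable are standard Unity built-ins. Scaling the divided coordinates by the screen resolution in pixels (and then by the dither texture’s texel size) makes the pattern tile per screen pixel instead of stretching across the window:

```hlsl
Shader "Custom/ScreenSpaceDither" {
	Properties {
		_MainTex ("Albedo", 2D) = "white" {}
		_DitherTex ("Dither Pattern", 2D) = "gray" {}
	}
	SubShader {
		Tags { "RenderType"="Opaque" }
		CGPROGRAM
		#pragma surface surf Lambert

		sampler2D _MainTex;
		sampler2D _DitherTex;
		// Auto-filled by Unity: x = 1/width, y = 1/height of _DitherTex
		float4 _DitherTex_TexelSize;

		struct Input {
			float2 uv_MainTex;
			float4 screenPos; // provided automatically by the surface shader compiler
		};

		void surf(Input IN, inout SurfaceOutput o) {
			// Perspective divide: screenPos is homogeneous, so divide by w
			// to get 0..1 coordinates aligned with the window.
			float2 coords = IN.screenPos.xy / IN.screenPos.w;
			// Convert to pixel units, then to dither-texture tiles,
			// so the pattern repeats per pixel regardless of geometry.
			coords *= _ScreenParams.xy * _DitherTex_TexelSize.xy;
			float dither = tex2D(_DitherTex, coords).r;
			o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * dither;
		}
		ENDCG
	}
	FallBack "Diffuse"
}
```

Note that no custom vertex function is needed: declaring screenPos in the Input struct is enough for the compiler to generate the interpolation for you.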