Water Shoreline Effects Not Working

I’m trying to get my custom water shader working with the URP, but I’m stuck on the shoreline effects.

In the working version with the built-in render pipeline, I calculate the screenPos (a float4) in the vertex shader like this:
output.screenPos = ComputeScreenPos(UnityObjectToClipPos(v.vertex));
and then the “depth” of the water at the current pixel is calculated in the fragment shader like so:

// Screen-space UVs for sampling the depth texture
half2 uv = input.screenPos.xy / input.screenPos.w;
// Linear eye depth of the opaque geometry behind the water
half lin = LinearEyeDepth(tex2D(_CameraDepthTexture, uv).r);
// Eye depth of the water surface itself, pulled forward by the foam width
half dist = input.screenPos.w - _FoamWidth;
half depth = lin - dist;
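For context, that depth value then drives the shoreline coloring. My actual code is more involved, but the idea is roughly this (_WaterColor and _FoamColor are stand-ins for my real properties):

// Blend toward the foam color where the scene depth is close to the
// water surface; saturate clamps the blend factor to [0, 1].
half foamFactor = 1 - saturate(depth);
half4 color = lerp(_WaterColor, _FoamColor, foamFactor);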

When I put my shader into a project using the URP, it renders pink. There’s a lot of stuff in there, so I decided to take Unity’s standard lit shader and duplicate it to get a “blank slate”, and I attempted to modify that to reproduce my water effects.

However, UnityObjectToClipPos isn’t available there. I then noticed this code in the URP standard lit shader’s vertex function:
VertexPositionInputs vertexInput = GetVertexPositionInputs(input.positionOS.xyz);
Based on the code that follows it, I realized VertexPositionInputs has a positionCS field, which I assumed to be the clip-space position (i.e., what I’m looking for), so I assigned it directly to output.screenPos.
I have no idea if this is correct, so this might be the cause of the problem.
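For reference, this is how the struct is defined in the URP version I’m using (in Core.hlsl):

struct VertexPositionInputs
{
    float3 positionWS; // World space position
    float3 positionVS; // View space position
    float4 positionCS; // Homogeneous clip space position
    float4 positionNDC;// Homogeneous normalized device coordinates
};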

In the URP fragment shader, I had to change some things around since LinearEyeDepth takes different parameters now:

// Screen-space UVs for sampling the depth texture
half2 uv = input.screenPos.xy / input.screenPos.w;
#ifdef UNITY_REVERSED_Z
    // My attempt at un-reversing the raw depth before linearizing it
    half lin = LinearEyeDepth(1 - SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, uv).r, _ZBufferParams);
#else
    half lin = LinearEyeDepth(SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, uv).r, _ZBufferParams);
#endif
// Eye depth of the water surface, pulled forward by the foam width
half dist = input.screenPos.w - _FoamWidth;
half depth = lin - dist;

According to what I’ve read, LinearEyeDepth is supposed to internally handle cases where the Z is reversed, but I found this was not the case.
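For what it’s worth, the overload I’m calling is defined in Common.hlsl (in the core render pipeline package) as a one-liner, so any reversed-Z handling would have to come from the contents of _ZBufferParams rather than the function itself:

float LinearEyeDepth(float depth, float4 zBufferParam)
{
    return 1.0 / (zBufferParam.z * depth + zBufferParam.w);
}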
The depth texture and sampler are defined like so:
TEXTURE2D(_CameraDepthTexture); SAMPLER(sampler_CameraDepthTexture);

I would have expected this to calculate the depth properly so I could color each fragment based on it, but the depth appears to be 0 all the way across the plane I attached the material to, as though the water plane were writing to the depth buffer despite being transparent.

I’ve pieced this together from very little solid information, so I’m honestly not sure whether any of it is correct. I’ve tried to explain the problem as well as possible without dumping a huge amount of code here, so please don’t hesitate to ask if you need anything more from me!

It might also be worth mentioning that I’m displacing the vertices (to simulate waves), although I don’t think that’s part of the issue.
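For completeness, the displacement happens before GetVertexPositionInputs is called, so screenPos is computed from the displaced position. A simplified version of what I’m doing (the wave property names are stand-ins for my real ones):

// Simple sine wave along X; _WaveFrequency, _WaveSpeed, and _WaveHeight
// are illustrative names, not my actual properties.
float3 posOS = input.positionOS.xyz;
posOS.y += sin(posOS.x * _WaveFrequency + _Time.y * _WaveSpeed) * _WaveHeight;
VertexPositionInputs vertexInput = GetVertexPositionInputs(posOS);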

Any help would be greatly appreciated :)

Hi! Did you find a solution?

I did, actually. I figured this out quite a while ago, so I can’t remember whether anything else was involved, but this is the code I have now:

// Screen-space UVs for sampling the depth texture
half2 uv = input.screenPos.xy / input.screenPos.w;
// Note: no manual reversed-Z handling needed here
half lin = LinearEyeDepth(SAMPLE_TEXTURE2D(_CameraDepthTexture, sampler_CameraDepthTexture, uv).r, _ZBufferParams);
// Eye depth of the water surface, pulled forward by the foam width
half dist = input.screenPos.w - _FoamWidth;
half depth = lin - dist;

screenPos is calculated in the vertex shader like this:
output.screenPos = ComputeScreenPos(vertexInput.positionCS);
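As far as I can tell, GetVertexPositionInputs already computes the same value in its positionNDC field, so this should be equivalent:

output.screenPos = vertexInput.positionNDC;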
The depth texture and texture sampler are defined like this:
TEXTURE2D(_CameraDepthTexture); SAMPLER(sampler_CameraDepthTexture);

Note that I’m still piggybacking off of the URP standard lit shader, which is where values like vertexInput.positionCS are coming from.
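One thing I glossed over: screenPos also needs an interpolator in the Varyings struct, e.g.:

// In the Varyings struct (the slot number is illustrative; use any free one):
float4 screenPos : TEXCOORD7;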