Writing to the depth buffer. How do I convert from DX to OpenGL?

I’m writing to the depth buffer in a fragment shader (a ray marcher) that works great on DX9 and DX11. The shader mostly works on OpenGL; the depth writing does have an effect, but the depths aren’t correct.

Here’s the bottom of the fragment code, where the z-buffer value is calculated from a world position. Can anyone tell me how to make this work correctly with OpenGL/GLSL?

C2E2f_Output o;
o.col = float4(color, 1.0);
// Project the ray-march hit point to clip space, then take z/w for the depth output
float4 projPos = mul(UNITY_MATRIX_VP, float4(impactWorldPosition, 1.0));
o.dep = projPos.z / projPos.w;
return o;
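
For reference, the definition of the output struct isn’t shown above; it’s assumed to look roughly like the sketch below. The important part is that the depth member is bound to the DEPTH semantic, otherwise the fragment program can’t override the depth value at all.

	// Sketch of the assumed output struct (the real definition isn't in this post)
	struct C2E2f_Output {
		float4 col : COLOR;  // fragment color
		float  dep : DEPTH;  // per-fragment depth override
	};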

The behavior is the same on a Mac with an Intel GPU and on a Windows machine with a recent NVIDIA card (with the editor forced to run in OpenGL). I’ve tried doubling the depth and then subtracting 1 from it (to convert it to a -1 to 1 range), but that didn’t do the trick.
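
For the record, that attempt amounted to something like the sketch below (reconstructed, not the exact code); 2*d - 1 takes a 0..1 value to -1..1, which is the opposite of the direction the depth output needs.

	// Reconstruction of the attempted conversion (not the exact original code)
	float depth = projPos.z / projPos.w;
	o.dep = depth * 2.0 - 1.0;   // maps 0..1 to -1..1; did not fix OpenGL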

Pragmas and pass setup used in the shader:

	SubShader {
		Tags { "Queue" = "Geometry+1" }

		Pass {
			ZTest LEqual
			Blend SrcAlpha OneMinusSrcAlpha

			Fog { Mode Off }
			CGPROGRAM
			#pragma target 3.0
			#pragma glsl
			#pragma exclude_renderers gles flash
			#pragma vertex vert
			#pragma fragment frag
//			#pragma profileoption NumInstructionSlots=8192
			#pragma profileoption NumMathInstructionSlots=2048

			#include "UnityCG.cginc"

This seems to be working. I’m not sure whether it’s reducing my depth buffer precision under OpenGL, though:

#if defined(SHADER_API_OPENGL)
				// OpenGL clip-space z is in -1..1; remap to the 0..1 range the depth output expects
				o.dep = depth * 0.5 + 0.5;
#else
				o.dep = depth;
#endif
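
As a follow-up, here is a sketch that wraps the platform difference in a single helper; the function name is made up and is not part of UnityCG.cginc. The reasoning: after the perspective divide, OpenGL clip-space z lies in -1..1 while D3D’s lies in 0..1, and the depth output expects 0..1 on both, so only the OpenGL path needs the remap. Precision-wise this should be no worse than ordinary rendering, since the fixed-function viewport transform applies the same -1..1 to 0..1 mapping to regular geometry anyway.

	// Hypothetical helper (not from UnityCG.cginc) that hides the platform split
	inline float ComputeFragmentDepth(float4 projPos)
	{
		// Normalized device z after the perspective divide
		float depth = projPos.z / projPos.w;
	#if defined(SHADER_API_OPENGL)
		return depth * 0.5 + 0.5;   // OpenGL: -1..1 -> 0..1
	#else
		return depth;               // D3D: already 0..1
	#endif
	}

	// Usage in the fragment shader:
	// o.dep = ComputeFragmentDepth(mul(UNITY_MATRIX_VP, float4(impactWorldPosition, 1.0)));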