Why is Depth different in OpenGL?

I get different results with the CameraMotionBlur image effect: the camera velocity texture seems to be generated inaccurately on OpenGL hardware (Linux).
When the camera moves to a new position, the resulting velocity has a much larger magnitude than it does under DirectX.

From looking into it, it seems to have something to do with the prevClipPos and clipPos… but I’m not sure.

The clipPos gets its Z from the depth texture and is then multiplied by the _ToPrevViewProjCombined matrix to get prevClipPos. It looks like the sampled depth and the z coming out of _ToPrevViewProjCombined are not in the same range (a bias?). Could this be because OpenGL’s clip-space depth runs from -1 to 1, while DirectX’s runs from 0 to 1?
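
For context, here’s a minimal sketch of what I understand the velocity-pass reconstruction to be doing, assuming the _CameraDepthTexture / _ToPrevViewProjCombined setup described above. The function and variable names other than those two are my own, and the #if branch shows the z-range remapping I suspect is mismatched between the two APIs:

```hlsl
#include "UnityCG.cginc"

sampler2D _CameraDepthTexture;
float4x4 _ToPrevViewProjCombined; // previous VP * inverse of current VP

struct v2f {
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;
};

half4 CameraVelocityFrag(v2f i) : SV_Target
{
    // Raw depth as stored in the depth texture (0..1 under both APIs).
    float d = SAMPLE_DEPTH_TEXTURE(_CameraDepthTexture, i.uv);

    // Clip-space xy always runs -1..1, but clip-space z depends on the
    // API: Direct3D uses 0..1 while OpenGL uses -1..1, so the sampled
    // depth has to be remapped before it can stand in for clip z on GL.
#if defined(SHADER_API_D3D9) || defined(SHADER_API_D3D11)
    float zClip = d;             // D3D: depth already matches clip z
#else
    float zClip = d * 2.0 - 1.0; // GL: remap 0..1 depth into -1..1
#endif
    float4 clipPos = float4(i.uv * 2.0 - 1.0, zClip, 1.0);

    // Reproject into the previous frame's clip space and divide by w.
    // If zClip is in the wrong range, this reprojection lands in the
    // wrong place and the velocity magnitude blows up.
    float4 prevClipPos = mul(_ToPrevViewProjCombined, clipPos);
    prevClipPos.xyz /= prevClipPos.w;

    // Screen-space velocity between the two frames.
    float2 velocity = (clipPos.xy - prevClipPos.xy) * 0.5;
    return half4(velocity, 0, 0);
}
```

If the shader skips that remapping on GL (or applies it twice), the reprojected position would be offset exactly the way I’m seeing.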

Any help is appreciated!

EDIT

For reference, I’m using the CameraMotionBlur .js & .shader, which is a standard image effect.

I’ve modified it to display only the CameraVelocity pass’s output, and this is where I’m seeing differences between OpenGL and DirectX hardware (Linux vs. Windows).

Turns out it was Unity “blacklisting” the Intel GPU in this Steambox (Intel Iris Pro 5200). I had to hex-edit the executable to remove the name ‘Intel’, and this solved all my rendering issues.

http://forum.unity3d.com/threads/image-effects-and-shadows-disabled-on-ubuntu.212428/