Pixel or scene depth in iPhone shader?

Hi all,

I’m trying to get to grips with ShaderLab programming (my only experience is with node-based shader networks like in Maya or UDK) and I was wondering if it’s possible to get or use the pixel or scene depth in a shader on iPhone (using OpenGL ES 2.0)?

I’m trying to create something like the attached image - a shader that fades between two colors based on distance from the camera. I’ve searched as much as I can and have found some code snippets but I don’t yet understand them and can’t get them to work.

Is getting pixel or scene depth possible on the iPhone and, if so, could someone please point me towards the relevant ShaderLab functions that I should be looking into?

Many thanks

=>

You will need to write a Cg or GLSL shader (Cg will be translated to GLSL for iPhone, but can be tested on PC) that passes the calculated depth from the vertex shader to the fragment shader, where you can convert it into a colour. I’m not actually sure whether OpenGL ES 2.0 lets you read the fragment position via the POS semantic, or whether you have to pass it yourself as a separate interpolator.

I’m looking for just this shader - and a height map/shader too. Any progress on this shader so far, please?

I did vertex distance fog on an iPhone 4S; it’s very fast and reliable.
You can calculate the depth value in the Cg vertex shader:

float fogz = mul(UNITY_MATRIX_MV, v.vertex).z; // view-space z (negative in front of the camera in Unity)

Store this value in a TEXCOORD interpolator, then use it in the fragment shader to blend your colours.
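
To tie the snippets in this thread together, here is a minimal sketch of a complete ShaderLab shader that does the colour fade the original poster described. The property names (`_NearColor`, `_FadeStart`, etc.) are my own invention, and the fade range values are just placeholders - tune them to your scene. It negates the view-space z so depth increases away from the camera:

```
Shader "Custom/DepthFade" {
    Properties {
        _NearColor ("Near Colour", Color) = (1, 0, 0, 1)
        _FarColor  ("Far Colour",  Color) = (0, 0, 1, 1)
        _FadeStart ("Fade Start",  Float) = 0.0
        _FadeEnd   ("Fade End",    Float) = 20.0
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _NearColor;
            fixed4 _FarColor;
            float _FadeStart;
            float _FadeEnd;

            struct v2f {
                float4 pos  : SV_POSITION;
                float  fogz : TEXCOORD0; // per-vertex depth, interpolated to the fragment shader
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                // View-space z is negative in front of the camera, so negate it
                o.fogz = -mul(UNITY_MATRIX_MV, v.vertex).z;
                return o;
            }

            fixed4 frag (v2f i) : COLOR {
                // t goes from 0 at _FadeStart to 1 at _FadeEnd
                float t = saturate((i.fogz - _FadeStart) / (_FadeEnd - _FadeStart));
                return lerp(_NearColor, _FarColor, t);
            }
            ENDCG
        }
    }
}
```

This runs entirely per-vertex/per-fragment with no depth texture read, so it works fine on OpenGL ES 2.0 hardware.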

Let me know if you want some more help.