
Pixel or scene depth in iPhone shader?

Discussion in 'Shaders' started by Amael, Jan 12, 2011.

  1. Amael

    Amael

    Joined:
    Jan 12, 2011
    Posts:
    1
    Hi all,

    I'm trying to get to grips with ShaderLab programming (my only experience is with node-based shader networks like those in Maya or UDK), and I was wondering if it's possible to get or use the pixel or scene depth in a shader on iPhone (using OpenGL ES 2.0)?

    I'm trying to create something like the attached image - a shader that fades between two colors based on distance from the camera. I've searched as much as I can and have found some code snippets but I don't yet understand them and can't get them to work.

    Is getting pixel or scene depth possible on the iPhone and, if so, could someone please point me towards the relevant ShaderLab functions that I should be looking into?

    Many thanks

    [Attached images: depth2.jpg, depth1.jpg]
     
  2. Daniel_Brauer

    Daniel_Brauer

    Unity Technologies

    Joined:
    Aug 11, 2006
    Posts:
    3,355
    You will need to write a Cg or GLSL shader (Cg will be translated to GLSL for iPhone, but can be tested on PC) that passes the calculated depth from the vertex shader to the fragment shader, where you can convert it into a colour. I'm not actually sure if OpenGL ES 2.0 lets you read the fragment position from the POSITION semantic, or if you have to pass it as a separate interpolator.
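
    A minimal sketch of the approach described above, assuming made-up property names (_NearColor, _FarColor, _FadeDistance): eye-space depth is computed in the vertex shader, passed through a TEXCOORD interpolator, and used in the fragment shader to lerp between two colours.

    Code (CSharp):
    Shader "Custom/DepthFade" {
        Properties {
            _NearColor ("Near Color", Color) = (1,1,1,1)
            _FarColor ("Far Color", Color) = (0,0,0,1)
            _FadeDistance ("Fade Distance", Float) = 10.0
        }
        SubShader {
            Pass {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                fixed4 _NearColor;
                fixed4 _FarColor;
                float _FadeDistance;

                struct v2f {
                    float4 pos : POSITION;
                    float depth : TEXCOORD0;
                };

                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    // Eye-space z is negative in front of the camera, so negate it
                    o.depth = -mul(UNITY_MATRIX_MV, v.vertex).z;
                    return o;
                }

                fixed4 frag (v2f i) : COLOR {
                    // 0 at the camera, 1 at _FadeDistance and beyond
                    float t = saturate(i.depth / _FadeDistance);
                    return lerp(_NearColor, _FarColor, t);
                }
                ENDCG
            }
        }
    }

    Since the depth is computed per vertex and only interpolated per fragment, this stays cheap on ES 2.0 hardware and needs no depth-texture support.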
     
  3. URAwesome

    URAwesome

    Joined:
    Sep 28, 2013
    Posts:
    23
    I'm looking for exactly this shader, and a height map/shader too. Any progress on this shader so far, please?
     
  4. colin299

    colin299

    Joined:
    Sep 2, 2013
    Posts:
    181
    I did vertex distance fog on an iPhone 4S; it is very fast and reliable.
    You can calculate the depth value in the Cg vertex shader:
    Code (CSharp):
    float fogz = mul(UNITY_MATRIX_MV, v.vertex).z;
    Storing this value in a TEXCOORD interpolator and reading it in the fragment shader will work.

    Let me know if you want some more help.
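
    A sketch of how that snippet plugs into a full vertex/fragment pair, assuming hypothetical uniforms (_MainTex, _FogColor, _FogStart, _FogEnd) declared alongside. The eye-space z is negated (view space looks down -z), carried in a TEXCOORD, and used for a linear fog blend:

    Code (CSharp):
    struct v2f {
        float4 pos : POSITION;
        float2 uv : TEXCOORD0;
        float fogz : TEXCOORD1;
    };

    v2f vert (appdata_base v) {
        v2f o;
        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.uv = v.texcoord.xy;
        // Eye-space depth, made positive in front of the camera
        o.fogz = -mul(UNITY_MATRIX_MV, v.vertex).z;
        return o;
    }

    fixed4 frag (v2f i) : COLOR {
        fixed4 col = tex2D(_MainTex, i.uv);
        // Linear fog factor: 1 at _FogStart, 0 at _FogEnd
        float fog = saturate((_FogEnd - i.fogz) / (_FogEnd - _FogStart));
        return lerp(_FogColor, col, fog);
    }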