why OSX cannot write depth per fragment?

Discussion in 'Shaders' started by mradfo21, Jul 29, 2015.

  1. mradfo21

    Joined:
    May 16, 2013
    Posts:
    194
    In the depth buffer documentation I see this:

    • On OpenGL (Mac OS X), depth texture is the native OpenGL depth buffer (see ARB_depth_texture).
      • Graphics card must support OpenGL 1.4 or ARB_depth_texture extension.
      • Depth texture corresponds to Z buffer contents that are rendered, it does not use the result from the fragment program.
    Why is this a limitation? It's insanely frustrating!

    gl_FragDepth is supported in every desktop version of GLSL!

    https://www.opengl.org/sdk/docs/man/html/gl_FragDepth.xhtml
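
    For reference, here's roughly what I mean - a minimal GLSL fragment shader sketch that overrides the interpolated depth (just an illustration, not a Unity shader):

    #version 150
    out vec4 fragColor;

    void main()
    {
        fragColor = vec4(1.0, 0.0, 0.0, 1.0);
        // Window-space depth in [0, 1]; replaces the interpolated Z
        // that the rasterizer would otherwise write for this fragment.
        gl_FragDepth = 0.5;
    }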

    Can anyone explain why they would take this incredibly useful feature away?
     
  2. FuzzyQuills

    Joined:
    Jun 8, 2013
    Posts:
    2,871
    I suggest trying it anyway, just to see if it works. I've heard several backwards things about Macs, and they've taught me one lesson: try anyway, because the docs often say one thing when it's actually the other way round! :D