Simple shader to write to z-buffer works on Windows but not Mac. Anyone know why?

Discussion in 'Shaders' started by Spuddicus, Jun 21, 2017.

  1. Spuddicus

    Joined: Aug 11, 2014
    Posts: 15
    I need a shader that writes a constant depth value to the z-buffer for each pixel belonging to a simple quad. The idea is that I first render a 2D background image to fill the screen, then cut out various square holes in it using the shader I'm attempting to write below, and finally render some 3D objects into those holes (the draw ordering I'm relying on is sketched after the code). I don't know much about shaders, but I've cobbled together something that works fine in the Unity editor on my Windows machine yet doesn't seem to write to the z-buffer on my Mac. Does anybody know why, or have some suggestions to try? Thanks!

    Code (csharp):
    Shader "Custom/Erase Z-Buffer"
    {
        SubShader
        {
            Tags{ "Queue" = "Geometry-1" }

            ZWrite On    // write to the depth buffer...
            ColorMask 0  // ...but leave the color buffer alone
            ZTest Always

            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f
                {
                    float4 position : POSITION;
                };

                struct fragOut
                {
                    float depth : DEPTH;
                };

                v2f vert(appdata_base v)
                {
                    v2f o;
                    o.position = UnityObjectToClipPos(v.vertex);
                    return o;
                }

                fragOut frag(in v2f i)
                {
                    fragOut o;
                    // attempt to write a "very far away" depth
                    o.depth = -100000;
                    return o;
                }
                ENDCG
            }
        }
    }
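
    For context, the draw ordering I'm relying on comes from the render queues. The "Background" queue for the backdrop shader is my assumption here (I haven't shown that shader), but the idea is:

    // backdrop quad:  Tags { "Queue" = "Background" }    (fills the screen first)
    // eraser quads:   Tags { "Queue" = "Geometry-1" }    (punch far-depth holes; the shader above)
    // 3D objects:     default "Geometry" queue           (drawn last, into the holes)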
  2. bgolus

    Joined: Dec 7, 2012
    Posts: 12,352
    Generally, when you clear the depth you would use a value of 1. However, as you may have noticed, you needed a negative number for this to work on Windows. As for why it's not working on Mac, the difference is really between DirectX and OpenGL. Unity uses a trick to increase depth accuracy, but because of differences between DirectX and OpenGL, the trick doesn't work with OpenGL.

    The trick is to reverse the depth so that near is 1 and far is 0, whereas normally near would be 0 and far 1. As for why you would do that, you can search for reversed depth buffer precision on your favorite internet search engine; the short version is that floating point values have far more precision near 0, so mapping the far end of the scene to 0 instead of 1 wastes much less of it. They don't do it for OpenGL because there the clip-space depth goes from -1 to 1, and reversing it to go from 1 to -1 wouldn't do anything useful.

    What that means is that for both DirectX and OpenGL the far depth is normally 1, but with DirectX, Unity sometimes uses a far depth of 0.
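
    To summarize (as I understand the conventions; worth double checking against the APIs you target):

    // Value that means "far plane" when written from a fragment shader:
    //   OpenGL (your Mac):                             1
    //   DirectX, without the reversed-Z trick:         1
    //   DirectX, with reversed-Z (UNITY_REVERSED_Z):   0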

    Luckily, there's already a built-in define, UNITY_REVERSED_Z, that tells you whether the depth is reversed. So the solution is quite simple:

    #if defined(UNITY_REVERSED_Z)
    o.depth = 0;
    #else
    o.depth = 1;
    #endif
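
    Untested on my end, but dropped into your shader, the whole fragment function would look something like this:

    Code (csharp):
    fragOut frag(in v2f i)
    {
        fragOut o;
        // write whatever value means "far plane" on this platform, so the
        // 3D objects drawn afterwards pass the depth test inside the hole
    #if defined(UNITY_REVERSED_Z)
        o.depth = 0.0f; // reversed depth: far plane is 0
    #else
        o.depth = 1.0f; // conventional depth: far plane is 1
    #endif
        return o;
    }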


    Now you might still be wondering why -100000 worked on Windows but did nothing at all on Mac, if the range is 1 to 0, or -1 to 1. I believe this is because on Windows -100000 is still in front of the camera and gets clamped to 0, which with reversed depth happens to be the far plane, whereas on Mac -100000 is behind the camera and might thus be getting thrown away before it would have been clamped to -1. That's just a guess on my part, though.
     
  3. Spuddicus

    Joined: Aug 11, 2014
    Posts: 15
    That worked! Awesome, thanks a million!