
Shader not rendering to camera depth and I can't work out why.

Discussion in 'Shaders' started by Noisecrime, Jul 22, 2011.

  1. Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,054
    Edit: Fixed the problem, see my reply below


    Original Post:
    So I've written an image effect that uses the camera depth buffer to determine what to render: if the linear depth is > 0.9 it renders a texture, if it's less it renders the result from the camera. It all works as expected, except when I add a plane with the following shader attached (see below). For some reason this shader refuses to render to the z-buffer and I can't work out why.

    The image effect is applied to the camera and uses OnRenderImage. In the image effect shader, if I output only the source RenderTexture I see the scene from the camera as you normally would, and the plane is present. If I output only the camera depth, then other objects are visible in the depth buffer, but the plane is not! This confirms to me that the plane is being rendered before the image effect (and before any use of the camera depth), so it should be in the depth buffer.

    So everything is working and I'm setting the camera's depthTextureMode etc. It's just that applying the shader below stops the plane rendering to the z-buffer. Change the shader on the plane to, say, the standard diffuse material and the plane is visible in the camera depth buffer again.

    Anyone got any ideas?


    Code (csharp):
    Shader "ncp/openNI/kinectLabelShader"
    {
        Properties
        {
            _MainTex ("Base (RGBA)", 2D) = "white" {}
        }

        SubShader
        {
            Tags {"Queue" = "Geometry" }
            Pass
            {
                ZTest Always Cull Off ZWrite On
                Fog { Mode off }

                CGPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                    #pragma fragmentoption ARB_precision_hint_fastest
                    #include "UnityCG.cginc"

                    struct v2f {
                        float4  pos : SV_POSITION;
                        float2  uv : TEXCOORD0;
                    };

                    float4 _MainTex_ST;

                    v2f vert (appdata_base v)
                    {
                        v2f o;
                        o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                        o.uv = TRANSFORM_TEX (v.texcoord, _MainTex);
                        return o;
                    }

                    uniform sampler2D _MainTex;
                    float4 output;

                    float4 frag (v2f_img i) : COLOR
                    {
                        float4 pixcol = tex2D(_MainTex, i.uv);
                        int label = (pixcol.r*15.0*4096.0 + pixcol.g*15.0*256.0 + pixcol.b*15.0*16.0 + pixcol.a*15.0);
                        output.r = 1.0 * (label % 4);

                        output.a = 0.0;
                        return (output);
                    }
                ENDCG
            }
        }

        Fallback off
    }
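
    For context, the camera-side setup described in the post would look roughly like this (a minimal sketch; the class and field names are illustrative, not the actual project code):

    Code (csharp):
    // Minimal image-effect driver (illustrative). The effect material's shader
    // would sample _CameraDepthTexture and compare Linear01Depth against 0.9.
    using UnityEngine;

    [RequireComponent(typeof(Camera))]
    public class DepthCompareEffect : MonoBehaviour
    {
        public Material effectMaterial;

        void Start()
        {
            // Ask Unity to render a depth texture for this camera.
            GetComponent<Camera>().depthTextureMode = DepthTextureMode.Depth;
        }

        void OnRenderImage(RenderTexture source, RenderTexture destination)
        {
            // Run the camera image through the effect shader.
            Graphics.Blit(source, destination, effectMaterial);
        }
    }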
     
    Last edited: Jul 22, 2011
  2. aubergine

    Joined:
    Sep 12, 2009
    Posts:
    2,880
    This is definitely strange; it also happens with builtin transparent shaders. Very strange indeed.

    EDIT: You said the above shader is an object shader and you have another post-process effect for rendering depth and such, right?

    If so, first of all, your frag input is wrong; it should be v2f.

    But anyway, I tried with other shaders and transparent objects still don't come up in the depth view; there's probably a good explanation for it.

    EDIT2: Documents say this:
     
    Last edited: Jul 22, 2011
  3. Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,054
    Thanks for looking.

    Yes, the above shader is just a simple unlit shader to display a 16-bit texture. I have a different shader for the image effect that uses the camera scene and depth. I'm pretty sure this has nothing to do with the image effect; currently it just outputs either the camera source image (via OnRenderImage) or the camera depth (as Linear01Depth). In the scene I have a sphere that rotates/orbits, and that is visible in both the camera source image and the depth buffer; however, the plane with the above shader only shows in the camera source image.

    As for transparent objects not coming up in the depth view, that is to be expected, as they shouldn't render to the depth buffer. However, what makes you think the shader is 'transparent'? It's not meant to be. If there is something in the shader that forces it to be transparent, that would explain the cause of the issue, but I can't see anything.

    Good catch on the v2f, though as far as I can tell it should make no difference, as v2f_img happens to have the same struct layout.
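
    For reference, UnityCG.cginc defines v2f_img roughly as follows, which is why swapping it in happens to line up here:

    Code (csharp):
    struct v2f_img {
        float4 pos : SV_POSITION;
        half2  uv : TEXCOORD0;
    };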

    My first instinct was to assume I'd somehow made it a 'transparent' shader, as that would not render to the depth buffer, but I can't see anything in the shader code that would do this.

    Edit:
    Interesting: if I set the rendering path to deferred, the plane with the above shader suddenly appears, working as expected. However, it still fails with forward rendering.


    Edit2: Fixed the Problem.
    By chance I stumbled upon the solution in another shader I was working on.
    Code (csharp):
    Tags { "RenderType"="Opaque" }
    I guess I had assumed that placing objects in the geometry queue automatically made them opaque, or that not specifying 'RenderType' defaulted to opaque.

    However, reading about 'RenderType' in the ShaderLab docs, it suggests that this is only used for shader replacement. Am I to infer that use of the camera depth buffer causes the scene (unless in deferred mode, I guess) to be rendered twice, once to produce the depth buffer in a render target? And that not having the RenderType tag in the shader caused it not to be rendered into that depth render target?
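
    In other words, the fix amounts to adding the tag alongside the queue; a sketch of the corrected SubShader header:

    Code (csharp):
    SubShader
    {
        // RenderType lets the camera's depth replacement pass classify this
        // shader as opaque, so the plane is written into the depth texture
        // under forward rendering.
        Tags { "Queue" = "Geometry" "RenderType" = "Opaque" }
        Pass
        {
            // ... unchanged pass from the original shader ...
        }
    }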
     
    Last edited: Jul 22, 2011
  4. Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    I've had odd issues with RenderType Opaque and depth textures before. Changing it to AlphaTest fixed my issue.

    It does seem that Unity does more with it behind the scenes than is documented.