Matching prerendered depth from blender with the unity depth texture

Discussion in 'Shaders' started by Ante, Dec 12, 2013.

  1. Ante

    Ante

    Joined:
    Oct 21, 2010
    Posts:
    48
    Hey! I'm trying to make a full-screen shader that draws a pre-rendered background into the scene, using a depth texture stored in the background's alpha channel to decide whether to draw the background or the geometry. I understand I could also do this with depth masks, but I have my reasons for wanting to do it based on a depth texture. The shader works. My only issue right now is getting the Unity depth texture and the Blender depth texture to line up without any tweaking, to make it easier for the artist to produce a massive number of these pre-rendered backgrounds.

    Looking around, I found out that Unity's depth texture is generated based on the camera's far clip, which in my case is 50. In my Blender scene I set up mist to start at 0 and end at 50 to create my depth map. I also tried generating a depth map with Z divided by 50 and clamped, and got a very similar depth map without anti-aliasing. I've taken these maps into Photoshop and they seem to be linear: 50 units from the camera has a value of 255, and 25 units from the camera has a value of 128. So, as far as I can tell, this is what I want: linear, beginning at 0 units and ending at 50 units.

    Unity's depth texture, on the other hand, I can't really figure out. I think maybe I'm doing something wrong in the shader (I'm not a shader expert at all). It seems to start at 0 but end past 50, and it also doesn't seem to be linear.

    This shader with _Intensity set to 1 (Unity is on the left):
    Code (csharp):
    half4 fragPrerenderedDepth (v2f i) : COLOR {
        half4 toAdd = tex2D(_Overlay, i.uv[0]);
        half4 orig = tex2D(_MainTex, i.uv[1]);
        float d = UNITY_SAMPLE_DEPTH(tex2D(_CameraDepthTexture, i.uv[1]));
        d = Linear01Depth(d) * _Intensity;

        if (i.uv[1].x < _Blind)
            return d;
        else
            return toAdd.a;
    }
    Draws this image:
    $depth1.png

    Changing _Intensity to 1.55 to try to match the images up draws this:
    $depth2.png

    The Unity depth also seems to start at 0, but it doesn't appear to be linear. Though I got the background wall to line up between the two images, the boxes and foreground don't line up at all, and the ground's gradient is way different from the Blender map.

    So anyway, there is my problem: I need to get Unity's depth texture to match Blender's, or vice versa. Either I get Unity to use the same linear depth as Blender, or I add some extra math nodes in Blender to get it matching Unity. Preferably I'd like Unity to match Blender, since Blender's range seems to be correct. I'm probably just using the depth texture wrong in my shader. Thanks in advance for your help!
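    A quick numeric sketch of why no single _Intensity multiplier can line the two maps up: the raw value in a hardware depth buffer is hyperbolic in eye-space distance, while the Blender mist map is linear, so the ratio between the two changes with distance. (This assumes a conventional, non-reversed-Z projection; the near value of 0.3 is a hypothetical stand-in for the camera settings.)

    ```python
    def device_depth(eye_z, near, far):
        """Raw hardware depth-buffer value for a point eye_z units in front of the camera
        under a standard perspective projection (non-reversed Z)."""
        return (far / (far - near)) * (1.0 - near / eye_z)

    near, far = 0.3, 50.0
    distances = (12.5, 25.0, 37.5)

    # Blender-style mist map: linear, 0 at the camera, 1 at 50 units.
    blender = [eye_z / far for eye_z in distances]

    # Raw Unity depth: hyperbolic in eye_z, so almost everything is crammed near 1.
    unity_raw = [device_depth(eye_z, near, far) for eye_z in distances]

    for b, u in zip(blender, unity_raw):
        # The ratio u/b is not constant, so no uniform scale maps one onto the other.
        print(f"blender {b:.3f}   unity raw {u:.3f}   ratio {u / b:.3f}")
    ```

    The ratio shrinks as distance grows, which matches the symptom above: scaling by 1.55 makes the back wall line up while the foreground boxes drift apart.
    
    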
     
  2. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    369
    Depth buffers are non-linear in Unity. You get a higher precision up front vs far back. If you need it to match Blender you might need to make your own depth shader. You can use -mul(UNITY_MATRIX_MV, v.vertex).z * _ProjectionParams.w to get a linear depth from 0 to 1.
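    The formula above can be checked numerically: _ProjectionParams.w is 1/farClipPlane, so negating view-space Z and multiplying by it gives eye depth divided by far, i.e. the same linear 0-to-1 ramp a Blender mist map with start 0 and depth 50 produces. A minimal sketch (the far value of 50 is taken from the first post):

    ```python
    far = 50.0
    projection_params_w = 1.0 / far  # Unity stores 1/farClipPlane in _ProjectionParams.w

    def linear_depth(view_z):
        """Depth as computed in the suggested vertex shader.
        view_z is negative in view space for points in front of the camera."""
        return -view_z * projection_params_w

    # Points 25 and 50 units in front of the camera:
    print(linear_depth(-25.0))  # ~0.5, i.e. value 128 in an 8-bit mist map
    print(linear_depth(-50.0))  # ~1.0, i.e. value 255
    ```
    
    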
     
  3. Ante

    Ante

    Joined:
    Oct 21, 2010
    Posts:
    48
    Thanks, that helps. How would I get this info into a full-screen shader? The code you posted is a vertex shader, right? Would I need a full-screen shader that renders the scene out with a specific vertex shader that draws the geometry's depth, or could I do this all in the full-screen shader? Again, I'm pretty new to shaders, especially full-screen shaders.
     
  4. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    369
    You would have to use the RenderWithShader function.

    http://docs.unity3d.com/Documentation/Components/SL-ShaderReplacement.html

    Something like this:

    Code (csharp):
    depthRenderCamera.CopyFrom(mainCam); // copy settings
    depthRenderCamera.targetTexture = mySpecialDepthRenderTexture;
    depthRenderCamera.RenderWithShader(depthShader, "");

    Can't Blender output a non-linear depth texture? Or maybe you are losing precision, since depth textures often use floating-point formats.
     
  5. brianasu

    brianasu

    Joined:
    Mar 9, 2010
    Posts:
    369
    Code (csharp):
    v2f vertTexture (appdata_base v) {
        v2f o;
        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.depth = -mul(UNITY_MATRIX_MV, v.vertex).z * _ProjectionParams.w; // linear eye depth, 0..1
        return o;
    }

    half4 frag (v2f i) : COLOR {
        return half4(i.depth, 0, 0, 1);
    }

    If you are handling things like transparent cutout materials, grass and so on, you have to incorporate that into your depth shader.
    If you download the built-in Unity shaders, there is a shader called something like Camera-DepthTexture.shader. It shows how Unity renders the depth map internally and how it handles non-opaque objects.
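    Going the other direction mentioned in the first post (extra math nodes in Blender to make the mist map match Unity's raw depth texture), the mapping to apply is the inverse of the standard projective depth encoding. A sketch of both directions, assuming a conventional non-reversed-Z projection and the _ZBufferParams layout (x = 1 - far/near, y = far/near) documented for Unity's built-in shader variables; the near value is hypothetical:

    ```python
    def linear01_to_device(d01, near, far):
        """Map a linear 0..1 depth (eye_z / far, as in a Blender mist map with
        start 0 and depth = far) to the raw value Unity's depth texture stores."""
        eye_z = d01 * far
        return (far / (far - near)) * (1.0 - near / eye_z)

    def device_to_linear01(d, near, far):
        """Inverse mapping, equivalent to Unity's Linear01Depth for a
        conventional (non-reversed-Z) projection."""
        zx = 1.0 - far / near   # _ZBufferParams.x
        zy = far / near         # _ZBufferParams.y
        return 1.0 / (zx * d + zy)

    near, far = 0.3, 50.0
    for d01 in (0.25, 0.5, 1.0):
        raw = linear01_to_device(d01, near, far)
        # The two mappings invert each other, so round-tripping recovers d01.
        assert abs(device_to_linear01(raw, near, far) - d01) < 1e-6
    ```

    The same formula could in principle be built out of Blender math nodes, though it would bake the camera's near/far planes into the pre-rendered maps.
    
    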


    I didn't notice but you are the nyan cat guy?! I tried the Desert Bus marathon once. Couldn't make the cut:)