Water4 - Writing to Depth Buffer

Discussion in 'Shaders' started by sgoodrow, Aug 18, 2014.

  1. sgoodrow

    I am using the Water4 shader that comes with Unity Pro, and I would like to render the surface of the water into the depth buffer, but only after the water has finished using the current state of the depth buffer to render itself.

    I've tried to do this a few different ways but haven't had much success, so hopefully someone can enlighten me as to why that is, or how to do it. One caveat is that I need to render the water before transparent geometry, since I have depth-dependent image effects that use the [ImageEffectOpaque] attribute so that they run beforehand.

    First approach: Assign the Water4 shader a RenderType other than the original "Transparent", such as "Water", and add a SubShader for the "Water" RenderType to an overridden version of the "Hidden/Camera-DepthTexture" shader, with ZWrite On in that replacement pass. Unfortunately this did not work: the water renders nearly invisible, as if the "depth" of the water were almost zero. So is the replacement pass taking place before the water is rendered? I suppose that could make sense; an object needs its own depth in the depth buffer before it can be drawn? Is that true?
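
    For reference, the SubShader I added to my copy of the depth shader looks roughly like this (a simplified sketch modeled on the built-in "RenderType"="Opaque" entry; the exact structure and macros may differ between Unity versions):

    Code (ShaderLab):

        // Extra SubShader added to a copy of Hidden/Camera-DepthTexture (sketch only)
        SubShader {
            Tags { "RenderType"="Water" }
            Pass {
                ZWrite On
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                struct v2f {
                    float4 pos : SV_POSITION;
                    float2 depth : TEXCOORD0;
                };

                v2f vert (appdata_base v) {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    UNITY_TRANSFER_DEPTH(o.depth);   // packs depth on platforms without a native depth texture
                    return o;
                }

                half4 frag (v2f i) : COLOR {
                    UNITY_OUTPUT_DEPTH(i.depth);     // outputs the packed depth (or relies on the hardware Z write)
                }
                ENDCG
            }
        }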

    Second approach: Create a second shader that renders right after the Water4 shader in the render queue, using the same mesh, this time with ZWrite On. Again this did not work, with the same result as before, though this time I don't have any guesses as to why, except maybe ZTesting? If the second material is at exactly the same depth from the camera, could its fragments be rejected by the ZTest? I don't know how to get around that, if it is the problem.

    So that's where I'm at. I'd really appreciate any help or corrections to my thinking. Thank you.
     
  2. Reanimate_L

    Someone already found a fix for that, I think. Try this LINK
     
  3. sgoodrow

    Hi rea, thanks for the link. Am I reading it correctly that he seems to be doing what I intended to do in my first approach? Does the depth buffer not get written until after the render pass, or am I misunderstanding?

    When I turn z-buffer writing on in the Water4 shader, the refractive effects stop working... I'm trying to figure out why that would be.

    Edit: OK, I've tracked it down a bit more. The problem comes from the calculation of the "edgeBlendFactors", particularly the x and y components, which control the foam falloff and, most importantly, the edge blend distance. This part of the shader samples the depth texture and takes the difference between that depth and the depth of the current pixel on the water surface. Since the water is now writing its own depth into the depth buffer, that difference is 0, so the water is rendered as effectively zero width, hence it doesn't show up.
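
    For anyone following along, the relevant part of the Water4 fragment code is roughly this (paraphrased from memory, not verbatim; names like _InvFadeParemeter come from the built-in water package):

    Code (ShaderLab):

        // Paraphrased edge-blend calculation from Water4 (sketch, not exact source)
        half4 edgeBlendFactors = half4(1.0, 0.0, 0.0, 0.0);
        #ifdef WATER_EDGEBLEND_ON
            half depth = SAMPLE_DEPTH_TEXTURE_PROJ(_CameraDepthTexture, UNITY_PROJ_COORD(i.screenPos));
            depth = LinearEyeDepth(depth);
            // Difference between the depth already in _CameraDepthTexture and the depth of the
            // water surface itself. If the water wrote its own depth, this is ~0 everywhere,
            // so the blend/foam factors collapse and the water fades out completely.
            edgeBlendFactors = saturate(_InvFadeParemeter * (depth - i.screenPos.w));
            edgeBlendFactors.y = 1.0 - edgeBlendFactors.y;
        #endif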

    I'm open to suggestions on how to fix this...
     
    Last edited: Aug 19, 2014
  4. Reanimate_L

    Yeah, I think he did your first approach, since opaque objects and refraction (via GrabPass) aren't working.
     
  5. sgoodrow

    Right, so I think that makes sense: you can't both write to the depth buffer and read from it in the same pass and expect different results, since the writing takes place before the reading. I guess that means the replacement-shader depth pass takes place before the main rendering pass. I wonder if that's necessary...

    Either way, I don't think I can change anything about that, which brings me back to my second idea, or something like it: two passes, where the first pass doesn't write to the depth buffer but does write to the color buffer, and a second pass that doesn't write to the color buffer but does write to the depth buffer. Is it possible to do that in a single shader? I've been trying, but to my chagrin I can't seem to get it to work.
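
    In ShaderLab terms, what I have in mind is a skeleton like this (just a sketch of the structure; the first pass stands in for the existing Water4 pass):

    Code (ShaderLab):

        // Two-pass idea (structure only, not working code)
        SubShader {
            Pass {
                // 1) Normal water rendering: reads the depth texture, writes color only
                ZWrite Off
                // ... existing Water4 CGPROGRAM would go here ...
            }
            Pass {
                // 2) Depth only: writes nothing to the color buffer
                ZWrite On
                ColorMask 0
            }
        }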
     
  6. sgoodrow

    Though I haven't figured this out, I feel like I was on the right track with my second approach. Here's what I think should work, and what the results appear to be.

    I have a water material on a mesh; let's call this the Water material. It renders at some point in the render queue, call it "Queue=Geometry+300". It does not write to the depth buffer, but it does consult it for its depth-based rendering.

    After the water has rendered, I have another material on the same, or a duplicate, mesh; let's call this the WaterSurface material. The WaterSurface material needs to render after the Water material, so we put it at "Queue=Geometry+301" in the render queue. Since the Water material does not write to the depth buffer, WaterSurface won't be culled despite being at the same position, so we don't need any special offset.

    Now it's just a matter of writing the WaterSurface shader itself, which is very simple since it doesn't write anything to the color buffer: it's just an empty Pass {} with ZWrite On and ColorMask 0. We also need to include this shader in the camera depth replacement shader, so we override that shader and add an entry for "RenderType=WaterSurface", also at "Geometry+301". In short, it is implemented the same way as the "RenderType=Opaque" entry in the depth pass, except that it runs after all other opaque geometry has finished rendering.
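
    For concreteness, the WaterSurface shader itself is basically just this (sketch; the shader name is arbitrary), and the extra "WaterSurface" entry in the overridden depth shader mirrors the "Opaque" one:

    Code (ShaderLab):

        // Depth-only companion shader for the water mesh (sketch)
        Shader "Custom/WaterSurface" {
            SubShader {
                Tags { "Queue"="Geometry+301" "RenderType"="WaterSurface" }
                Pass {
                    ZWrite On
                    ColorMask 0   // lay down depth, write nothing to the color buffer
                }
            }
        }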

    This is what I've done, but something in my thinking or my implementation is wrong. When I look at the depth texture during the Water material's rendering, I can see the WaterSurface depth overwriting the underlying geometry. This isn't supposed to happen until after the Water material has finished rendering ("Queue=Geometry+301"), but it does...

    My only explanation for this is that the depth rendering takes place entirely before the regular rendering of the shaders, not interwoven with it. Consequently, I don't know what else I can try...

    Please help!
     
    Last edited: Aug 19, 2014
  7. sgoodrow

    Still haven't cracked this. Would really appreciate some further insight.
     
  8. spraycanmansam

    Hi sgoodrow, I was reading through this and was going to post a link to my post, but realised that you had already posted in it! My workaround was to give the ocean a special RenderType called "Ocean". By default it then won't be rendered into the _CameraDepthTexture depth buffer. I then render a second depth texture with a replacement shader that has a SubShader for the "Ocean" RenderType, so the ocean does get rendered into that depth buffer. Then I use _CameraDepthTexture for any effects that rely on the underwater geometry, and the additional depth texture for any effects that rely on what's above, like fog, etc.
    TBH, it's a less-than-ideal workaround and I plan on giving it another look, but it works for now. In the meantime, if you manage to get something else working effectively, I'd love to hear it :)
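
    Roughly, the shader side of it boils down to a small replacement shader like this (a sketch; the shader name and the output encoding are just placeholders, and the camera side renders with it via RenderWithShader / SetReplacementShader using "RenderType" as the replacement tag):

    Code (ShaderLab):

        // Replacement shader for the second depth texture: only the "Ocean" RenderType gets drawn (sketch)
        Shader "Hidden/OceanDepth" {
            SubShader {
                Tags { "RenderType"="Ocean" }
                Pass {
                    CGPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                    #include "UnityCG.cginc"

                    struct v2f {
                        float4 pos : SV_POSITION;
                        float depth : TEXCOORD0;
                    };

                    v2f vert (appdata_base v) {
                        v2f o;
                        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                        o.depth = COMPUTE_DEPTH_01;          // linear 0..1 depth of the ocean surface
                        return o;
                    }

                    half4 frag (v2f i) : COLOR {
                        return EncodeFloatRGBA(i.depth);     // or write raw depth to a float render texture
                    }
                    ENDCG
                }
            }
        }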
     
  9. sgoodrow

    Ah, I see, so you construct two depth buffers separately and produce the intended final result yourself. That is a pretty good workaround, I suppose, since it's only one material that has this problem. Thank you, I appreciate the response greatly. I will certainly return to this thread if another idea strikes me.
     
  10. sgoodrow

    I tried to implement something like this and am running into a strange bit of overhead that I want to get rid of, but can't.

    The problem is that, even though my replacement shader only renders the water, the replacement render still puts every other renderer through CPU culling. In particular, a terrain object I have is being culled needlessly. I don't know how to get around this.

    I've tried telling the replacement camera to render only the water layer, but doing this messes up the depth texture on the main camera (I don't know why; the result is a middle-gray value everywhere?). I've also tried toggling the terrain itself in various ways (changing its layer, disabling it), but those don't seem to work either (it then doesn't get rendered in the main depth pass). Any ideas?

    Also, if the replacement camera renders depths via a replacement shader, is it necessary for the replacement render texture to be a depth format, and for the replacement camera's depthTextureMode to be set to Depth? I would think not, but it seems that if I change either of those the pipeline stops working again. Were you able to do this somehow?
     
  11. sgoodrow

    AHA! OK, I think I figured out the problem. My depth buffer was being cleared during the replacement rendering because the WaterTile.cs script was detecting the replacement camera rendering it and turning on that camera's depth texture, which cleared the current depth texture (or something along those lines). Disabling/removing the WaterTile.cs script solved the problem.

    This script is normally used for culling/performance benefits; it's a system that allows water meshes to be split into smaller chunks and enabled/disabled based on visibility. For now I guess I'm OK with it being off, but I may need some other technique for performance later if it becomes a problem.