_CameraDepthTexture broken in 5.3 **SOLVED**

Discussion in 'General Graphics' started by ruj, Dec 9, 2015.

  1. ruj

    Joined: Feb 28, 2013
    Posts: 113
    Our game depended on _CameraDepthTexture in a shader holding the depth of the last rendered camera, but in 5.3 the data I get when I sample it makes no sense to me. This has completely broken our post-process pipeline: we save off the depth textures of several cameras via Blits during the OnPreRender() phase, which worked fine until this release.
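
    For anyone wanting to reproduce it, here is roughly what we do; a minimal sketch, assuming a depthCopyMaterial whose shader reads _CameraDepthTexture (the class and field names are just for illustration, not our actual pipeline):

    ```csharp
    using UnityEngine;

    // Sketch: copy the global _CameraDepthTexture into our own
    // RenderTexture during OnPreRender(), using a material whose
    // shader samples _CameraDepthTexture.
    [RequireComponent(typeof(Camera))]
    public class DepthSnapshot : MonoBehaviour
    {
        public Material depthCopyMaterial; // assumed to sample _CameraDepthTexture
        public RenderTexture savedDepth;   // the saved-off copy, refreshed each frame

        void OnEnable()
        {
            // Unity only generates the depth texture when a camera requests it.
            GetComponent<Camera>().depthTextureMode |= DepthTextureMode.Depth;
        }

        void OnPreRender()
        {
            if (savedDepth == null)
            {
                savedDepth = new RenderTexture(Screen.width, Screen.height, 0,
                                               RenderTextureFormat.RFloat);
            }

            // Before 5.3, _CameraDepthTexture held the depth of the last
            // rendered camera, so this blit captured it per camera.
            Graphics.Blit(null, savedDepth, depthCopyMaterial);
        }
    }
    ```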

    I would really like to know if anyone else has seen this, and if they know of any workarounds.
     
  2. AcidArrow

    Joined: May 20, 2010
    Posts: 11,791
    I'd suggest filing a bug report. Sounds like a bug to me.
     
  3. ruj

    Joined: Feb 28, 2013
    Posts: 113
  4. BrianND

    Joined: May 14, 2015
    Posts: 82
    Does it have anything to do with this update?
    • Shaders: _CameraDepthTexture is now preserved across calls to RenderWithShader()
     
  5. ruj

    Joined: Feb 28, 2013
    Posts: 113
    It does! Unity got back to me. If anyone else runs into this, the fix is to use _LastCameraDepthTexture when you want the depth texture of the most recently rendered camera. They say they will be updating the upgrade guide and the docs.
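
    For completeness, the shader-side change is just the sampler name. A minimal depth-copy shader under that assumption (the shader name and output encoding here are illustrative, not our actual pipeline):

    ```
    Shader "Hidden/CopyLastCameraDepth"
    {
        SubShader
        {
            ZTest Always Cull Off ZWrite Off
            Pass
            {
                CGPROGRAM
                #pragma vertex vert_img
                #pragma fragment frag
                #include "UnityCG.cginc"

                // 5.3+: _LastCameraDepthTexture is the depth texture of the
                // most recently rendered camera (what _CameraDepthTexture
                // used to give you here).
                sampler2D_float _LastCameraDepthTexture;

                float4 frag (v2f_img i) : SV_Target
                {
                    float rawDepth = SAMPLE_DEPTH_TEXTURE(_LastCameraDepthTexture, i.uv);
                    // Write linear 0..1 depth; an RFloat target keeps full precision.
                    return float4(Linear01Depth(rawDepth), 0, 0, 1);
                }
                ENDCG
            }
        }
    }
    ```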

    Thanks to everyone who responded.