How do I render depth in Unity 3? I just want to have the current depth buffer displayed on a plane, like any other texture. In Unity 2.6 I used to do this:

Code (csharp):

    Shader "Custom/Render depth buffer" {
        SubShader {
            Tags { "RenderType"="Opaque" }
            Pass {
                Fog { Mode Off }
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                uniform samplerRECT _CameraDepthTexture;

                struct v2f {
                    float4 vertex : POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                v2f vert( v2f v ) {
                    v2f o;
                    o.vertex = mul(glstate.matrix.mvp, v.vertex);
                    o.texcoord = v.texcoord;
                    return o;
                }

                half4 frag(v2f i) : COLOR {
                    return texRECT(_CameraDepthTexture, i.texcoord);
                }
                ENDCG
            }
        }
        FallBack Off
    }

And in a script attached to the camera I set "camera.depthTextureMode = DepthTextureMode.Depth;". However, in Unity 3 this returns a gray texture. The shader definitely works, but it seems there are no proper depth values in _CameraDepthTexture, or I'm interpreting them incorrectly.
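For reference, the camera-side script amounts to a one-liner in a MonoBehaviour. A minimal sketch (the class name EnableDepthTexture is just an illustration, not from the original post):

```csharp
using UnityEngine;

// Attach this to the camera so Unity renders a depth texture pass.
public class EnableDepthTexture : MonoBehaviour
{
    void OnEnable()
    {
        // Ask this camera to also render a depth texture,
        // which shaders can then sample as _CameraDepthTexture.
        camera.depthTextureMode = DepthTextureMode.Depth;
    }
}
```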
If I switch the camera to deferred rendering, it renders fine. If I switch it to forward rendering, nothing seems to get written to the depth texture. If I activate deferred rendering, hit Play, hit Stop, switch to forward rendering, and hit Play again, I notice my depth texture still works, but it no longer gets updated. If I then hit Stop and Play once more, the depth texture seems to be cleared and doesn't update.
It should work in both forward and deferred rendering (and it certainly does in our test suites). Please give more details on how to reproduce the problem (best to file a bug with a complete project that reproduces the issue). When I try your shader and set the camera's depth texture mode from a script, it works for me as well. (Unity 3.0.0f1, Mac OS X 10.6.4, GeForce 8600M.)
Your shader is working fine for me in Unity 3.0f1 on Mac OS X (forward and deferred rendering), but I had to modify it for Unity 2.6.1:

Code (csharp):

    Shader "Custom/Render depth buffer" {
        SubShader {
            Tags { "RenderType"="Opaque" }
            Pass {
                Fog { Mode Off }
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                uniform float4 _CameraDepthTexture_ST;
                uniform samplerRECT _CameraDepthTexture;

                struct v2f {
                    float4 vertex : POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                v2f vert( v2f v ) {
                    v2f o;
                    o.vertex = mul(glstate.matrix.mvp, v.vertex);
                    o.texcoord = TRANSFORM_TEX(v.texcoord, _CameraDepthTexture);
                    return o;
                }

                half4 frag(v2f i) : COLOR {
                    return texRECT(_CameraDepthTexture, i.texcoord);
                }
                ENDCG
            }
        }
        FallBack Off
    }

I added the TRANSFORM_TEX macro and a new variable _CameraDepthTexture_ST to your shader, because in OpenGL, RECT texture coordinates are in pixels: http://unity3d.com/support/documentation/Components/SL-PlatformDifferences.html
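For context, TRANSFORM_TEX in UnityCG.cginc is (to my knowledge) just a scale-and-offset macro, so the fix boils down to something like this in the vertex shader:

```cg
// Roughly what UnityCG.cginc defines (simplified):
//   #define TRANSFORM_TEX(tex, name) (tex.xy * name##_ST.xy + name##_ST.zw)
//
// For the RECT depth texture, the per-pixel coordinates come from
// scaling the 0..1 UVs by _CameraDepthTexture_ST (filled in by Unity):
o.texcoord = v.texcoord.xy * _CameraDepthTexture_ST.xy + _CameraDepthTexture_ST.zw;
```

That's why declaring the matching `_CameraDepthTexture_ST` uniform is required for the macro to compile.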
Hi, sorry for the slow response. I added an example project and some screenshots to my bug report. It's case #374352.

Here's deferred when not playing. Here's deferred when playing (the only time it's rendered right, and ONLY in the scene view, not the game view). Here's forward; playing or not, it's the same problem.

If you have deferred on, switch to forward rendering, and hit Play, the depth texture will render but won't get updated (I assume it remained from the deferred rendering and wasn't cleared). But if you stop and play once more, it will not show anymore until you go back to deferred. The exact shader code is (different from my first post):

Code (csharp):

    Shader "Custom/Render depth buffer" {
        SubShader {
            Tags { "RenderType"="Opaque" }
            Pass {
                Fog { Mode Off }
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _CameraDepthTexture;
                float4 _CameraDepthTexture_ST;

                struct v2f {
                    float4 vertex : POSITION;
                    float2 texcoord : TEXCOORD0;
                };

                v2f vert( v2f v ) {
                    v2f o;
                    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.texcoord = TRANSFORM_TEX(v.texcoord, _CameraDepthTexture);
                    return o;
                }

                half4 frag(v2f i) : COLOR {
                    float depth = tex2D (_CameraDepthTexture, i.texcoord);
                    depth = LinearEyeDepth (depth);
                    return depth;
                }
                ENDCG
            }
        }
        FallBack Off
    }
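One side note on visualizing the result: LinearEyeDepth returns depth in eye-space units, which will usually be greater than 1 and therefore clamp to white on most of the screen. If the goal is a readable grayscale, Linear01Depth (also in UnityCG.cginc) may be the better fit. A sketch of an alternative fragment shader, under that assumption:

```cg
half4 frag(v2f i) : COLOR {
    // Raw, non-linear value from the depth buffer.
    float rawDepth = tex2D(_CameraDepthTexture, i.texcoord).r;
    // Linear01Depth remaps it to a linear 0..1 range
    // (near plane ~ 0, far plane = 1), easy to view as grayscale.
    float depth01 = Linear01Depth(rawDepth);
    return depth01;
}
```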