Accessing builtin textures from a compute shader

Discussion in 'Shaders' started by Arrqh, Mar 23, 2016.

  1. Arrqh

    Arrqh

    Joined:
    Mar 31, 2015
    Posts:
    12
    I'm dipping my toes into compute shaders for the first time, and while I'm able to do some simple processing on textures, I can't seem to find a way to access any of the built-in Unity textures, such as the GBuffer textures or the depth texture. From a normal shader these would be accessed through shader parameters that Unity sets during the render process (_CameraGBufferTexture0, for example), but in a compute shader this doesn't seem to bind to the texture. Is there a way of accessing these inside a compute shader, or a way to explicitly set them from a script? Some equivalent to BuiltinRenderTextureType, as with Command Buffers?
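    (Background for readers: Unity's global textures are bound automatically only for regular graphics shaders; compute shader parameters always have to be bound explicitly per kernel. A minimal sketch of the failing setup, with illustrative names — the kernel index, computeShader, and someTexture are hypothetical:)

```csharp
// In the compute shader (illustrative), declaring the global compiles fine:
//   Texture2D<float4> _CameraGBufferTexture0;
// ...but unlike a regular shader, nothing binds it automatically, so reads
// come back as zero unless C# assigns a texture to that name explicitly:
computeShader.SetTexture(kernelIndex, "_CameraGBufferTexture0", someTexture);
```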
     
    _mm_ likes this.
  2. Arrqh

    Arrqh

    Joined:
    Mar 31, 2015
    Posts:
    12
    So far the only way I've figured out how to access these in a compute shader is to use a Command Buffer to copy them into a RenderTexture that I am able to access in script. Like so:

    Code (CSharp):
    RenderTexture gbuffer0 = RenderTexture.GetTemporary(src.width, src.height, 0, RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear);
    CommandBuffer commandBuffer = new CommandBuffer();
    commandBuffer.Blit(BuiltinRenderTextureType.GBuffer0, gbuffer0);
    Graphics.ExecuteCommandBuffer(commandBuffer);

    computeshader.SetTexture(0, "Input", gbuffer0);
    computeshader.SetTexture(0, "Result", rtout0);
    computeshader.Dispatch(0, width, height, 1);
    RenderTexture.ReleaseTemporary(gbuffer0);
    And this certainly works and can be used to get at any of the built-in textures, but it seems like a waste of GPU time, since all you're doing is shuffling memory around to get the textures into one you can get a RenderTexture identifier for! Either I'm missing something obvious (which is likely!) or Unity is missing a way of accessing these global textures from a compute shader.
     
  3. Arrqh

    Arrqh

    Joined:
    Mar 31, 2015
    Posts:
    12
    So to update my own thread, there does not seem to be any way of doing this at all without copying the Render Textures around in memory. This is extremely limiting when it comes to using Compute Shaders as part of the render pipeline, especially not having access to the depth texture! I've added a few related suggestions if anyone else would like to see this functionality added.

    https://feedback.unity3d.com/sugges...-builtinrendertexturetypes-to-compute-shaders
    https://feedback.unity3d.com/sugges...der-dispatch-functionality-to-command-buffers
     
  4. _mm_

    _mm_

    Joined:
    Jul 25, 2015
    Posts:
    2
    I have found a way to solve this without copying textures.

    You can get texture objects representing built-in textures simply by calling the Shader.GetGlobalTexture method:

    Code (CSharp):
    // Get (update only if necessary, check for null)
    Texture depth = Shader.GetGlobalTexture( "_CameraDepthTexture" );
    Texture gBuffer2 = Shader.GetGlobalTexture( "_CameraGBufferTexture2" );

    // Set
    computeShader.SetTexture( kernel, "_CameraDepthTexture", depth );
    computeShader.SetTexture( kernel, "_CameraGBufferTexture2", gBuffer2 );
     
  5. Dreamback

    Dreamback

    Joined:
    Jul 29, 2016
    Posts:
    220
    Could you give an example of what you did on the shader side to read the data? I've been trying and failing all day to read the _CameraDepthTexture from my Compute Shader, using Shader.GetGlobalTexture - all I'm reading are zeroes.

    Also, it looks like a possible alternative is the ComputeShader method SetTextureFromGlobal, although I can't find any reference to anyone actually using it for these special textures (and I get the same result with it).
     
    Last edited: Aug 9, 2017
  6. Dreamback

    Dreamback

    Joined:
    Jul 29, 2016
    Posts:
    220
    Finally found the trick: I had to use a Texture2D<float4> in the shader. So in the simplest code, I'm doing this in C#:

    Code (CSharp):
    if (outputTex == null)
    {
        outputTex = new RenderTexture(camera.pixelWidth, camera.pixelHeight, 0, RenderTextureFormat.ARGB32, RenderTextureReadWrite.Linear);
        outputTex.enableRandomWrite = true;
        outputTex.Create();
    }

    int kernelHandle = shader.FindKernel("CSMain");
    shader.SetTextureFromGlobal(kernelHandle, "_DepthTexture", "_CameraDepthTexture");
    shader.SetTexture(kernelHandle, "_OutputTexture", outputTex);
    shader.Dispatch(kernelHandle, camera.pixelWidth/32, camera.pixelHeight/32, 1);
    and this in the shader:
    Code (CSharp):
    Texture2D<float4> _DepthTexture;
    RWTexture2D<float4> _OutputTexture;

    [numthreads(32,32,1)]
    void CSMain (uint3 uv : SV_DispatchThreadID)
    {
        _OutputTexture[uv.xy] = _DepthTexture[uv.xy];
    }
    That gets me a texture that's all reds, since the depth is stored in the red channel.
     
  7. Gerard_Slee

    Gerard_Slee

    Joined:
    Apr 3, 2017
    Posts:
    11
    Also useful for others: include UnityCG.cginc to get a linear version of the depth.

    #include "UnityCG.cginc"


    Result[uv.xy] = Linear01Depth(_DepthTexture[uv.xyl].r);
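    A fuller kernel along these lines might look like the following sketch (texture names follow the earlier posts; note that Linear01Depth reads the _ZBufferParams global, which a compute shader may not receive automatically, so it might need to be passed in from C# with SetVector):

```csharp
#include "UnityCG.cginc"

Texture2D<float> _DepthTexture;     // raw depth, bound from C# (e.g. via SetTextureFromGlobal)
RWTexture2D<float4> Result;

[numthreads(32,32,1)]
void CSMain (uint3 id : SV_DispatchThreadID)
{
    // Linear01Depth remaps raw device depth into a linear 0..1 range
    float d = Linear01Depth(_DepthTexture[id.xy]);
    Result[id.xy] = float4(d, d, d, 1);
}
```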
     
    jister and hungrybelome like this.
  8. orangetech

    orangetech

    Joined:
    Sep 30, 2017
    Posts:
    50
    SetTextureFromGlobal seems to fail for me.
    I get this error:
    Code (CSharp):
    Compute shader (DepthShader): Property (_DepthTexture) at kernel index (0) is not set
    UnityEngine.ComputeShader:Dispatch(Int32, Int32, Int32, Int32)
    TestScreenPos:Start() (at Assets/ComputeScreenPos/TestScreenPos.cs:34)
     
    JonRurka and coidevoid like this.
  9. JonRurka

    JonRurka

    Joined:
    Nov 19, 2012
    Posts:
    35
    I ran into this problem today and would like to share the solution for anyone else experiencing it. First off, the solution by Dreamback does work, but it requires a slight change, or else you will get the error experienced by orangetech. Essentially, _CameraDepthTexture has a stride of only a float, not a float4, so having "Texture2D<float4> _DepthTexture;" in the shader is invalid. To get this to work, you simply need to replace

    Code (CSharp):
    Texture2D<float4> _DepthTexture;
    with

    Code (CSharp):
    Texture2D<float> _DepthTexture;
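    Putting Dreamback's example together with this fix, the shader side becomes (a sketch using the same names and thread-group size as the posts above):

```csharp
Texture2D<float> _DepthTexture;      // _CameraDepthTexture is single-channel
RWTexture2D<float4> _OutputTexture;

[numthreads(32,32,1)]
void CSMain (uint3 uv : SV_DispatchThreadID)
{
    float d = _DepthTexture[uv.xy];             // depth lives in the single channel
    _OutputTexture[uv.xy] = float4(d, d, d, 1); // spread it across RGB for visualization
}
```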
     
    CainUnity likes this.
  10. jister

    jister

    Joined:
    Oct 9, 2009
    Posts:
    1,749
    Hey @Gerard_Slee, can I ask what the l stands for in uv.xyl?
     
  11. atomicjoe

    atomicjoe

    Joined:
    Apr 10, 2013
    Posts:
    1,869
    For what it's worth, I'm having no issues using global textures in general from compute shaders in Unity 2020.3.25.
    This works as expected, without having to use Shader.GetGlobalTexture:

    Code (CSharp):
    uniform Texture2D<half> _CameraDepthTexture;
    uniform Texture2D _CameraGBufferTexture2;
    It works both for Unity and custom global textures.
     
    Noisecrime likes this.
  12. Slaktus

    Slaktus

    Joined:
    Dec 5, 2012
    Posts:
    58
    To add to this ...

    Make sure you've set your camera to render a depth texture! I thought it did that by default, but it doesn't:
    Code (CSharp):
    camera.depthTextureMode = DepthTextureMode.Depth;
    While the depth texture will still be accessible through _CameraDepthTexture without this, Unity will complain that _CameraDepthTexture is not initialized if you skip it and then call:
    Code (CSharp):
    computeShader.SetTexture(kernel, "_CameraDepthTexture", Shader.GetGlobalTexture("_CameraDepthTexture"));