Blend procedural textures using shaders

Discussion in 'Shaders' started by kennung1, Feb 6, 2016.

  1. kennung1

    kennung1

    Joined:
    Dec 28, 2015
    Posts:
    12
    Hi everyone,

    I have a shader that takes two textures as input. The shader itself works fine (it's based on this).
My problem is that I don't know how to feed my two procedurally generated textures to the shader.
I change the texture of the material using mat.mainTexture = myFirstTexture; that texture then becomes accessible within my shader.

But how do I assign mySecondTexture to the material so that I can access it from the shader?
I thought about render passes, but I guess it's not possible to switch the texture between passes. (Or is it?)

Here's a pic to clarify:


    This is the reason why I want to do that:
I'm trying to drive an autostereoscopic display (lenticular lenses) with a fixed number of sweet spots, so I need multiple views of my scene. I therefore render it with the appropriate number of cameras, one per sweet spot, and output each camera's view into a RenderTexture. The next step is to blend the individual camera views / RenderTextures into one "blend texture" using a special algorithm (that's my goal: run this algorithm in a shader to boost performance). When this image is presented on the monitor, it appears stereoscopic.
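The multi-camera part of this setup can be sketched roughly like this (the class and field names here are illustrative, not from my actual project):

```csharp
// Hedged sketch: one RenderTexture per sweet-spot camera, all staying on the GPU.
using UnityEngine;

public class SweetSpotViews : MonoBehaviour
{
    public Camera[] viewCameras;   // one camera per lenticular sweet spot
    public int resX = 1024, resY = 768;

    public RenderTexture[] views;  // one render target per camera

    void Start()
    {
        views = new RenderTexture[viewCameras.Length];
        for (int i = 0; i < viewCameras.Length; i++)
        {
            views[i] = new RenderTexture(resX, resY, 16);
            // The camera now renders into its texture every frame instead of the screen.
            viewCameras[i].targetTexture = views[i];
        }
    }
}
```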

    (Maybe you have a better idea of how I should do it?)
     
    Last edited: Feb 7, 2016
  2. Teravisor

    Teravisor

    Joined:
    Dec 29, 2014
    Posts:
    654
Where are your procedural textures stored? Usually, if your texture is GPU-created it's in a RenderTexture, and if it's CPU-created it's in a Texture2D; either way, just pass it to the material.
     
  3. kennung1

    kennung1

    Joined:
    Dec 28, 2015
    Posts:
    12
The way I had it in the past was: RenderTexture -> download to the CPU with ReadPixels (for each RT image) -> pixel-mixer algorithm on the CPU (to generate the new final texture) -> put the final texture on the object as a new material.

My idea was to use the shader mentioned in my initial post. But thinking about your comment, I guess with my current shader approach I'm doing the worst possible thing: GPU (RT) -> download to CPU -> upload to GPU/shader -> download to CPU.

    Here's a snippet of code, just for two textures ("texCenter" and "texLeft")
    Code (CSharp):
    void Update () {
        // ....
        // Initialize and render
        logoCamCenter.targetTexture = rt;
        logoCamCenter.Render();
        RenderTexture.active = rt;
        // Read pixels
        texCenter.ReadPixels(new Rect(0, 0, resX, resY), 0, 0); // this is very slow -> avoid it somehow?
        // Clean up
        logoCamCenter.targetTexture = null;

        // Initialize and render
        logoCamLeft.targetTexture = rt;
        logoCamLeft.Render();
        RenderTexture.active = rt;
        // Read pixels
        texLeft.ReadPixels(new Rect(0, 0, resX, resY), 0, 0); // this is very slow -> avoid it somehow?
        // Clean up
        logoCamLeft.targetTexture = null;

        // this is way too slow --> run in a shader?
        Texture2D mixTex = new Texture2D(resX, resY, TextureFormat.RGB24, false);
        for (int i = 0; i < resX; i++)
        {
            for (int j = 0; j < resY; j++)
            {
                if (i % 2 == 1)
                    mixTex.SetPixel(i, j, texCenter.GetPixel(i, j));
                else
                    mixTex.SetPixel(i, j, texLeft.GetPixel(i, j));
            }
        }
        mixTex.Apply(); // upload the SetPixel changes to the GPU; without this the texture stays blank

        if (outputMaterial != null)
        {
            outputMaterial.mainTexture = mixTex;
        }
        // ....
    }
    What I would like to have is: Rendertexture -> *have each of the 5 RT's stay in GPU* -> pixel mixer algorithm in GPU -> output shader result as new material.

    What would be a good approach to achieve this?
     
  4. Teravisor

    Teravisor

    Joined:
    Dec 29, 2014
    Posts:
    654
The easiest way is to set up a camera that renders quad(s) with your shader onto a RenderTexture. Or, if it's simple processing of a single material (up to 4 textures) without world data (object positions, etc.), there's Graphics.Blit for it. You just need to write a "filter" shader that changes your texture(s) the way you need. The pixel mixing can be done in the fragment shader.
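As a rough sketch of the Graphics.Blit route (the shader name and _SecondTex property here are placeholders, not something from your project):

```csharp
// Hedged sketch of the Graphics.Blit approach described above.
// Everything stays on the GPU; no ReadPixels involved.
using UnityEngine;

public class MixBlit : MonoBehaviour
{
    public RenderTexture rtCenter;   // output of the center camera
    public RenderTexture rtLeft;     // output of the left camera
    public Material mixMaterial;     // material using the "filter" shader
    public Material outputMaterial;  // material on the object showing the result

    RenderTexture mixed;

    void Update()
    {
        if (mixed == null)
            mixed = new RenderTexture(rtCenter.width, rtCenter.height, 0);

        // Feed the second view to the shader; the first goes in as _MainTex via Blit.
        mixMaterial.SetTexture("_SecondTex", rtLeft);

        // Runs the fragment shader once over the whole target.
        Graphics.Blit(rtCenter, mixed, mixMaterial);

        outputMaterial.mainTexture = mixed;
    }
}
```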
     
  5. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,352
Use mat.SetTexture("_MainTex", firstTexture); instead of .mainTexture. You can assign any texture slot you want this way; mat.mainTexture is actually just shorthand for the same command. Similarly, mat.SetTexture("_SecondTex", secondTexture); or any other texture your shader is set up to use.

Almost anything you can do in C# with textures you can do in shaders, though it requires a slightly different setup and syntax. The bit of code you have now for mixing the textures can be done with something like:

    fixed4 a = tex2D(_MainTex, uv);
    fixed4 b = tex2D(_SecondTex, uv);
    fixed4 mixed = lerp(a, b, floor(fmod(uv.x * _MainTex_TexelSize.z, 2)));
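For context, those three lines would sit inside the fragment function of a "filter" shader; a minimal shell around them might look like this (the shader name and _SecondTex property are illustrative, your shader's properties may differ):

```shaderlab
// Minimal sketch of a shader wrapping the mixing snippet above.
Shader "Hidden/ColumnMix"
{
    Properties
    {
        _MainTex ("First View", 2D) = "white" {}
        _SecondTex ("Second View", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            sampler2D _SecondTex;
            float4 _MainTex_TexelSize; // .z = texture width in pixels

            fixed4 frag (v2f_img i) : SV_Target
            {
                fixed4 a = tex2D(_MainTex, i.uv);
                fixed4 b = tex2D(_SecondTex, i.uv);
                // 0 on even pixel columns, 1 on odd ones -> alternate the two views
                return lerp(a, b, floor(fmod(i.uv.x * _MainTex_TexelSize.z, 2)));
            }
            ENDCG
        }
    }
}
```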
     
  6. kennung1

    kennung1

    Joined:
    Dec 28, 2015
    Posts:
    12
    Awesome, guys, thanks a ton! I can now feed the images created by my numerous RenderTextures into the shader and access their contents there. Mixing the subpixels within the shader works well and fast; it costs next to no performance. Great!!