
How can I use a fragment shader as a texture inside a surface shader

Discussion in 'Shaders' started by falldeaf_unity, Mar 15, 2017.

  1. falldeaf_unity

    falldeaf_unity

    Joined:
    Jul 27, 2016
    Posts:
    15
I'm trying to merge three textures into one in a shader.

    The first texture is an image of an opaque shape with a transparent background.
    The second texture is an animated swirl or plasma effect.
    The third texture is a filter: anywhere it's white shows the second texture, and anywhere it's black shows the first.

With only a little modification, I was able to get this 'dualmap' shader to work for those purposes: https://gist.github.com/Opotable/ade3a97fb9112c87da6c

I'm currently animating the second texture by using a sprite sheet and a script that animates the tiling and offset parameters.
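For context, that sprite-sheet animation can be done with a small script like the following. This is only a sketch: the sheet layout (columns/rows) and frame rate are assumed values, and it animates the material's main texture.

```csharp
using UnityEngine;

// Sketch of animating a sprite sheet by stepping the material's
// main texture tiling and offset. Columns, rows, and frame rate
// are assumptions -- set them to match your sheet.
public class SpriteSheetAnimator : MonoBehaviour
{
    public int columns = 4, rows = 4;
    public float framesPerSecond = 12f;
    Material mat;

    void Start()
    {
        mat = GetComponent<Renderer>().material;
        // Each frame occupies 1/columns x 1/rows of the sheet.
        mat.mainTextureScale = new Vector2(1f / columns, 1f / rows);
    }

    void Update()
    {
        int frame = (int)(Time.time * framesPerSecond) % (columns * rows);
        int u = frame % columns;
        int v = frame / columns;
        // Step the offset to the current frame, reading the sheet
        // left-to-right from the top row down.
        mat.mainTextureOffset = new Vector2(
            (float)u / columns,
            1f - (float)(v + 1) / rows);
    }
}
```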

However, I'd like the effect to be dynamically modifiable. After working with a plasma-effect fragment shader, I've been trying to put together a combined shader that uses the output of the plasma fragment shader as a texture in the 'dualmap' shader.

    http://pastebin.com/r6EWC5rA

    I'm running into two issues:

1.) I can't figure out how to save the output of the fragment shader pass as a texture usable in the surface shader.
    2.) The plasma pass is working, but the effect is slightly off: instead of wrapping the texture around the object, it's oriented in camera space, so as the object moves around on the screen it appears to be a window through which you're looking at a flat texture.

    Some other things I've tried -

    I've read through the entire gentle introduction to shaders series: http://www.alanzucconi.com/2015/06/17/surface-shaders-in-unity3d/

I looked at GrabPass as a way to capture the plasma colors for the second texture, but it grabs directly from the rendered screen: the effect was that it took what was already on the screen and made it a texture. As far as I can tell, you can't grab from some sort of intermediate buffer that the fragment shader pass writes to?
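For reference, this is roughly what a GrabPass looks like in ShaderLab (a minimal sketch with the surrounding shader omitted; `_GrabTexture` is the conventional name):

```shaderlab
// Inside a SubShader: GrabPass copies whatever has been rendered to the
// screen so far into _GrabTexture. It captures the framebuffer, not an
// offscreen buffer written by an earlier pass of the same shader.
GrabPass { "_GrabTexture" }
```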

    I'd really appreciate any help or guidance!
     
  2. falldeaf_unity

    falldeaf_unity

    Joined:
    Jul 27, 2016
    Posts:
    15
After more testing and playing around, I found a partial solution. It ultimately isn't more helpful than the tiling-animation method, but in case it helps someone else: using a render texture does work. I set up a quad whose material uses the 'plasma' fragment shader, pointed a camera at it, created a RenderTexture, set that camera's render target to the new RenderTexture, and finally used that texture as the secondary texture in the 'dualmap' shader.
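That setup can be sketched in a few lines of C#. This is an illustration only: `plasmaCamera`, `dualmapMaterial`, and the `_SubTex` property name are hypothetical and would be assigned to match your scene and shader.

```csharp
using UnityEngine;

// Sketch of the render-texture workaround described above.
// plasmaCamera and dualmapMaterial are hypothetical references,
// assigned in the Inspector; _SubTex stands in for whatever the
// 'dualmap' shader calls its second texture.
public class PlasmaToTexture : MonoBehaviour
{
    public Camera plasmaCamera;      // camera pointed at the plasma quad
    public Material dualmapMaterial; // material using the 'dualmap' shader

    void Start()
    {
        var rt = new RenderTexture(256, 256, 0);
        plasmaCamera.targetTexture = rt;           // camera now renders into rt
        dualmapMaterial.SetTexture("_SubTex", rt); // rt becomes the second texture
    }
}
```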

Unfortunately, while this does achieve the intended effect and allows me to change the plasma shader's settings at runtime, it limits me to only one effect, and I'll need multiple. It also doesn't seem like a very good way of getting the job done. Hopefully I'm missing something in ShaderLab that will work as I had originally hoped.

    If I'm on the wrong path here or if I simply need to do more doc reading I'd appreciate someone pointing me in the right direction, thanks!
     
  3. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    2,438
There's no reason your shaders can't be combined into one; there's no need for render textures or multi-pass shaders.

Presumably you'd want to do the "plasma" in UV space rather than screen space, so just use IN.uv_MainTex for the x and y instead of fragPos. I don't know how you want to mix them, but you could lerp between the outColor and tex2D(_MainTex, IN.uv_MainTex), or just add them together and output that.

Code (CSharp):

    void surf(Input IN, inout SurfaceOutput o) {
        float x = IN.uv_MainTex.x;
        float y = IN.uv_MainTex.y;
        float mov0 = x+y+cos(sin(_Time[1]/10)*2.)*100.+sin(x/100.)*1000.;
        float mov1 = y / _ResY / 0.1 + _Time[1];
        float mov2 = x / _ResX / 0.07;
        float c1 = abs(sin(mov1+_Time[1])/2.+mov2/2.-mov1-mov2+_Time[1]);
        float c2 = abs(sin(c1+sin(mov0/1000.+_Time[1])+sin(y/40.+_Time[1])+sin((x+y)/100.)*3.));
        float c3 = abs(sin(c2+cos(mov1+mov2+c2)+cos(mov2)+sin(x/1000.)));
        fixed4 plasmaColor = fixed4(c1,c2,c3,1);

        fixed4 main = tex2D(_MainTex, IN.uv_MainTex);
        fixed4 filter = tex2D(_FilTex, IN.uv_FilTex);

        o.Albedo = lerp(main.rgb, plasmaColor, filter.a);
        o.Alpha = main.a;
    }
     
  4. falldeaf_unity

    falldeaf_unity

    Joined:
    Jul 27, 2016
    Posts:
    15
That did it! Thank you so much for responding; that cleared up a huge misunderstanding on my part. There really was no need to save the output from one pass and use it in the next. I did as you said: ripped out the first pass, added the plasma code directly to the surf function, and it worked exactly as intended!

    Thank you again for taking the time to help me.