Shader Forge - A visual, node-based shader editor

Discussion in 'Assets and Asset Store' started by Acegikmo, Jan 11, 2014.

  1. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I've considered opening up forums and a shader sharing center on the website when/if people consider it something that's needed! It's definitely possible, but it has to come at a slightly later stage - right now I'm in the middle of the SF plugin itself :)
     
  2. Seith

    Seith

    Joined:
    Nov 3, 2012
    Posts:
    755
    A well-furnished library of example shaders would be a tremendous help indeed.


    I have another question: could you tell me how to get a "swirl" effect on a 2D texture? Basically I would like to distort the UVs slightly. Is there maybe an example in the default SF shaders that I could follow?
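
    (For reference, one common way to do a swirl in plain Cg, which can be rebuilt with SF's UV, distance and rotation nodes; the _Swirl* names below are placeholder properties, not built-in SF inputs.)

    Code (cg):
        // Minimal UV swirl sketch: rotate the UVs around a center, twisting more near the center.
        float2 SwirlUV(float2 uv, float2 center, float strength)
        {
            float2 d = uv - center;
            float dist = length(d);
            float angle = strength * (1.0 - saturate(dist)); // strongest twist at the center
            float s, c;
            sincos(angle, s, c);
            float2 rotated = float2(d.x * c - d.y * s,       // standard 2D rotation
                                    d.x * s + d.y * c);
            return rotated + center;
        }
        // Usage: fixed4 col = tex2D(_MainTex, SwirlUV(i.uv, _SwirlCenter.xy, _SwirlStrength));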
     
  3. Chaoss

    Chaoss

    Joined:
    Jul 8, 2011
    Posts:
    327
    Would you be able to include offset in 0.18?
     
  4. joelfivat

    joelfivat

    Joined:
    Aug 14, 2013
    Posts:
    45
  5. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    No, because 0.18 has already been released. I can prioritize it for 0.19 though :)
     
  6. Chaoss

    Chaoss

    Joined:
    Jul 8, 2011
    Posts:
    327
    Uuh, sorry, I meant to say 0.19. Silly me.
     
  7. Monkee

    Monkee

    Joined:
    Sep 3, 2012
    Posts:
    8
    Hi dude,

    bought SF and love it to bits! Coming from UDK's material editor this is awesome, and I'm so happy to see node-based plugins are still making progress now that the Strumpy Shader Editor isn't being worked on anymore. :-(

    I think you answered this before, but just a simple question: I'm working with OpenGL ES for development. I noticed I have to have Direct3D 9 flagged as well as OpenGL ES to see the shader in the editor. I haven't even taken SF into work yet. With OpenGL ES render emulation it comes out black unless I flag the DirectX output as well. Does this sound correct running on a PC?

    Thanks again mate
     
  8. Seith

    Seith

    Joined:
    Nov 3, 2012
    Posts:
    755
  9. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294

    Yep!
    If you only compile GLES, you won't be able to see it on a desktop as they won't use that render platform.
    You can check all the boxes if you want, and the game will use the appropriate render platform for each device.

    So, just enable DX, it won't affect the build :)
     
  10. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
  11. IzzySoft

    IzzySoft

    Joined:
    Feb 11, 2013
    Posts:
    376
    Is it possible to create a force field shader (on a sphere) that shows a ripple emanating from the point of impact?

    ex: http://youtu.be/w4izjZSUpkE?t=16s

    It doesn't have to have distortion (I have Unity Free). :)
     
  12. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Yeah, but you'll need scripts to specify where and when it was hit. I would recommend just using particle effects though, as your shader would only be able to support a fixed number of hits at the same time.
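
    (A minimal sketch of the shader side of that idea, in Cg; _HitPos and _HitTime are hypothetical properties that a script would update on collision, e.g. via Material.SetVector / Material.SetFloat.)

    Code (cg):
        // Single-hit ripple sketch, driven from a script.
        uniform float4 _HitPos;      // world-space impact point (xyz), set from script on collision
        uniform float  _HitTime;     // Time.timeSinceLevelLoad at the moment of impact, set from script

        float Ripple(float3 worldPos)
        {
            float age  = _Time.y - _HitTime;                          // seconds since the impact
            float dist = distance(worldPos, _HitPos.xyz);
            float ring = 1.0 - saturate(abs(dist - age * 2.0) * 4.0); // expanding band, 2 units/sec
            return ring * saturate(1.0 - age);                        // fade the ripple out over ~1 second
        }
        // In the fragment shader: emission += _RippleColor.rgb * Ripple(i.posWorld.xyz);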


    Yes! Some day :)
     
    Last edited: Jan 16, 2014
  13. Ga2Z

    Ga2Z

    Joined:
    Feb 16, 2012
    Posts:
    68
    Any other Spanish speakers here? :D

    I would also like to see the shader in a scene and, if possible, the node tree, or maybe the tutorial you mention.

    Greetings from Colombia.
     
  14. KRGraphics

    KRGraphics

    Joined:
    Jan 5, 2010
    Posts:
    4,458
    This is kinda disappointing, especially since I am doing a lot of materials such as fabrics and human skin...
     
  15. bigzer

    bigzer

    Joined:
    May 31, 2011
    Posts:
    160
    Correct me if I'm wrong, but shaders made for forward will still render properly even if you use them with a deferred camera; they will just be rendered in forward.

    Right?
     
  16. Don-Gray

    Don-Gray

    Joined:
    Mar 18, 2009
    Posts:
    2,278
    In the Deferred limitations you mention "Alpha, maybe".
    Is that just for now, or will it never work if it doesn't work right away?
    That would really be a limitation.

    Thanks
     
  17. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Nothing is stopping you from using forward rendering for things that deferred doesn't support, such as transmission/light wrapping :)
    This doesn't mean you can no longer use transmission/light wrapping/per-shader custom lighting/etc. in your project if it's set to use the deferred render path, so don't panic ;)

    What it means is that *shaders* using deferred lighting won't be able to use it, but you can mix shader lighting types. You can enable deferred rendering and have both deferred shaders and forward shaders in your scene.

    As for alpha, you'll be able to use alpha clipping / cutout alpha in deferred, but semi-transparency will almost certainly not work, you'll need to use forward rendering on those.

    This is going to get a bit technical, so, if you don't want to read it all: Deferred is not a magic bullet, it's a sacrifice in flexibility for gained performance.

    Here are the technical bits:

    The whole idea of deferred rendering is to have a *very* fast lighting pipeline.
    The benefit is that you can have loads of per-pixel lights without much of a performance impact.
    The drawback is that you lose a *tremendous* amount of flexibility in lighting. All deferred shaders have to be affected by lights in the same way, you can't have separate light calcs for separate deferred shaders.



    For instance - if you take a look at how Unity's specular shaders look when using deferred - you'll notice that the specularity isn't colored by the light color.
    Why is the specular not colored by the lights?
    To answer that, we need to look at how the two render paths actually work under the hood.




    Technically, here's how the shader paths look in comparison, roughly:

    FORWARD LIGHTING:
    -----------------------------
    Base pass - The base pass calculates pretty much every aspect of your shader, for the directional/primary light source, plus one-time data. This pass includes transmission, custom lighting, emission, ambient passes, vertex offsetting, and so forth.
    Add pass - The add pass is run for every light source affecting this mesh. This pass looks very similar, but it doesn't have emission, ambient lighting, and other non-per-light data. In here, we have access to everything we need to know about the light source! Its position, color, intensity, and so forth.

    + Outline and shadow passes, if used

    DEFERRED LIGHTING:
    ------------------------------
    Base pass - The deferred base pass is very different, and here's where the bottleneck is:
    We need to send everything that you need to use when lighting, into a single, screen-space RGBA texture.
    Here we need to pick the most important things we're going to use that affects lighting, as the space is very limited.
    Unity goes for sending the screen-space normals through the RGB channels, and the glossiness through the A channel.

    This is the point where we lose diffuse power, transmission and light wrapping, because we can't fit that into RGBA.
    It *might* be possible to use RG for the normals and recalculate the Z component of the normals in real-time instead.
    That gives us one more value to work with, which could, in theory, be used for one of the following: monochrome transmission, diffuse power, or monochrome light wrapping. (I would probably go for monochrome transmission in order to get backlight on vegetation)
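
    (Purely to illustrate that idea, not something SF currently does: reconstructing the Z of a unit-length view-space normal from its XY is a one-liner.)

    Code (cg):
        // Rebuild a unit normal stored only as XY, assuming Z faces the camera (z >= 0).
        float3 UnpackNormalXY(float2 nxy)
        {
            float z = sqrt(saturate(1.0 - dot(nxy, nxy)));
            return float3(nxy, z);
        }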

    When all objects have rendered their normals and glossiness into the texture, a depth texture is also rendered for the next pass, which is...

    Global lighting pass - The lighting pass in deferred rendering is run once per frame, instead of once per object per frame! This is where we get our performance gain - the lighting is done in screen space, and it's for all deferred objects at the same time, so there's no overdraw or anything!
    This pass is global though, which means that the lighting is done in the same way for all deferred shaders.
    This pass will then spit out a single, RGBA screen space texture, which is the accumulation of the lighting. Diffuse lighting in RGB, monochrome specular highlights in A.

    This is the point where we lose custom lighting and specularity colored by the light color. This is also where we lose the light nodes! In the next pass, all we will have access to in terms of lighting will be that one RGBA screen-space texture containing the accumulated lighting.

    Final pass - This is the bulk of the deferred render path, and this is per-shader, unlike the previous pass! This is where we apply the diffuse, specularity, emission, ambient, and everything else non-lightsource related.

    + Outline pass, if used


    This is just the way it is, due to the limited dataset. I've asked Unity about it, and they've said they're working on a more flexible approach, but this is what we have to work with for now. It's still better than not having deferred support at all, which I hope we can all agree on!

    Let me know if there's anything else you'd want to know :)
    I hope I cleared up some of the questions you had.
     
  18. Don-Gray

    Don-Gray

    Joined:
    Mar 18, 2009
    Posts:
    2,278
    Thanks for the explanation. :)
     
  19. bigzer

    bigzer

    Joined:
    May 31, 2011
    Posts:
    160
    Thank you for this simple and compact explanation.

    Being able to explain this so simply shows why Shader Forge is so good :)

    Thanks and keep the great work coming
     
  20. gsokol

    gsokol

    Joined:
    Oct 5, 2010
    Posts:
    76
    Glad to hear it's easy to support deferred for the most part. I think it's pretty reasonable that alpha blending, transmission, etc. have to be in forward.
     
  21. YourLover19

    YourLover19

    Joined:
    Jan 16, 2014
    Posts:
    1
    Is there any way to implement the UDK node Depth Biased Alpha in SF?

    Depth Biased Alpha makes the shader look "softer" or transparent where the edges of the geometry intersect with other geometry.
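
    (For anyone curious about the underlying trick, Unity's "soft particles" do the same thing: compare the scene depth behind the fragment with the fragment's own eye depth and fade the alpha by the difference. A rough Cg sketch follows; _InvFade is a hypothetical fade-distance property, and the camera needs to render a depth texture.)

    Code (cg):
        // Soft-intersection sketch. screenPos comes from ComputeScreenPos() in the vertex shader,
        // with the fragment's eye depth stored in screenPos.z via COMPUTE_EYEDEPTH.
        // Requires UnityCG.cginc for LinearEyeDepth / UNITY_PROJ_COORD.
        uniform sampler2D _CameraDepthTexture;
        uniform float _InvFade;   // higher = sharper fade

        float SoftFade(float4 screenPos)
        {
            float sceneZ = LinearEyeDepth(tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(screenPos)).r);
            float fragZ  = screenPos.z;
            return saturate(_InvFade * (sceneZ - fragZ)); // 0 at the intersection, 1 further away
        }
        // alpha *= SoftFade(i.screenPos);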
     
  22. RUBILYN

    RUBILYN

    Joined:
    Jun 22, 2013
    Posts:
    54
    Hi everyone! How is the new year starting off?

    Dudes! I started a SHADERFORGE USERS SHADERS library thread here:

    http://forum.unity3d.com/threads/222853-THE-SHADERFORGE-USERS-SHADERS-amp-MATERIALS-LiBRARY


    This is intended to help Shader Forge users have all the free Shader Forge shaders posted on the Unity forums in one place. It's also intended to help the developer gather all user shaders in one place so he can add them to his site's shader library!

    Join up and submit your own *scrappy* shader creations with us now!

    Regards

    3DLABS
     
    Last edited: Jan 16, 2014
  23. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    It's a planned feature yes :)
     
  24. XilenceX

    XilenceX

    Joined:
    Jun 16, 2013
    Posts:
    122
    I finally bought this yesterday and can't wait to use it in my next project! :)
    I'm a bit worried about the lack of deferred lighting support, because I would like to use Sunshine! in the project as well. Did anyone test whether Shader Forge shaders and Sunshine! or Shadow Softener are compatible? I guess they should be once support for the deferred lighting mode arrives.
     
  25. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    For deferred, what if you store a height map instead of normals, then recompute the normals from it in a later pass?
     
  26. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    I'm not sure how that plugin works, so I can't say whether it will work or not.

    That's a pretty good idea, although that would essentially mean that the normals are half as precise as before, which would most likely remove a lot of detail. Sounds like something worth trying though!
     
  27. -JohnMore-

    -JohnMore-

    Joined:
    Jun 16, 2013
    Posts:
    64
    I'm a total noob at shaders, but thanks to Acegikmo and everyone who is sharing helpful tips I made a shader I needed, and I wanted to share it :) so another noob like me can use it.

    It's a simple unlit shader with an outline that increases or decreases its width depending on the distance from the camera: 0% of BaseBorderWidth at the Near position, 100% of BaseBorderWidth at the MaxDistance position, and more than 100% beyond MaxDistance until capped by MaxBorderWidth.
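
    (Expressed as plain Cg, the behaviour described above boils down to roughly the following; the parameter names follow the post, and the exact node layout in the attachment may differ.)

    Code (cg):
        // worldPos = the vertex position in world space
        float dist  = distance(_WorldSpaceCameraPos, worldPos);
        float scale = max((dist - _Near) / (_MaxDistance - _Near), 0.0); // 0 at Near, 1 at MaxDistance, >1 beyond
        float width = min(_BaseBorderWidth * scale, _MaxBorderWidth);    // capped by MaxBorderWidth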

    View attachment $EnemyNormal.shader
    $VariableOutlineShader.jpg
     
  28. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
    @imaginaryhuman and Acegikmo:
    Thanks to your initial node tree (on UserEcho) and a little research of my own, I created a graph that does a pretty decent job of generating a normal from a heightmap. Your initial approach had some issues with my heightmap, which is why I tried something else:
    Code (csharp):
    1.  
    2. +N+
    3. WCE
    4. +S+
    5.  
    Based on this pixel setup I sampled 4 heights with an offset from the center (C) and created 4 normals via cross products, which in the end got averaged and normalized. The resulting normal map looked pretty convincing to me (based on a simple cloud-filter heightmap from PS).
    I can post this tree in the evening when I'm home again.
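
    (Written out as plain Cg rather than nodes, that 4-tap setup reduces to roughly the finite-difference form below; texelSize and bumpScale are placeholder parameters, and the sign convention may need flipping depending on the heightmap.)

    Code (cg):
        // Tangent-space normal from a heightmap using the N/S/E/W samples around the center.
        float3 HeightToNormal(sampler2D heightMap, float2 uv, float2 texelSize, float bumpScale)
        {
            float hN = tex2D(heightMap, uv + float2(0, texelSize.y)).r;
            float hS = tex2D(heightMap, uv - float2(0, texelSize.y)).r;
            float hE = tex2D(heightMap, uv + float2(texelSize.x, 0)).r;
            float hW = tex2D(heightMap, uv - float2(texelSize.x, 0)).r;
            // Averaging the four cross products boils down to these two slopes plus a constant Z.
            float3 n = float3((hW - hE) * bumpScale, (hS - hN) * bumpScale, 1.0);
            return normalize(n);
        }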

    But I couldn't plug the heightmap into the Vertex Offset. With its default settings, SF creates a shader that throws an error saying tex2Dlod isn't supported in this profile.
    If I remove everything but OpenGL ES 2.0 from the target renderers setting, I at least get a black sphere instead of a pink one. But SF throws a warning:
    [SF] Unhandled platform settings. Make sure your build target (StandaloneWindows) is sensible, and that you've got platforms enabled to compile for
    UnityEngine.Debug:LogWarning(Object)
    ShaderForge.SF_StatusBox:GetPrimaryPlatform()
    ShaderForge.SF_StatusBox:UpdateInstructionCount(Shader)
    ShaderForge.SF_Editor:OnShaderEvaluated()
    ShaderForge.SF_Evaluator:SaveShaderAsset()
    ShaderForge.SF_Evaluator:Evaluate()
    ShaderForge.SF_Editor:DrawPreviewPanel(Rect)
    ShaderForge.SF_Editor:OnGUI()
    UnityEditor.DockArea:OnGUI()

    Do you know of any issues with the Vertex Offset in the main node?


    Edit:
    The user was the issue. I simply forgot to multiply by the geometry normal before plugging the height info into the vertex offset. Sorry about that.
     
    Last edited: Jan 17, 2014
  29. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    This is a bit of a messy behaviour, because the vertex offset is done on a per-vertex level, and OpenGL/DX handle MIP sampling differently.

    I see you're running Windows, so if you plug a value into the texture's MIP input (such as 0), it should work again, on DX at least.
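
    (What that does in the generated Cg, roughly: vertex shaders have no derivatives, so the texture has to be sampled with an explicit MIP level via tex2Dlod; _HeightMap and _Displacement are placeholder names.)

    Code (cg):
        // Vertex-stage sample with an explicit MIP level - this is what the MIP input feeds.
        float height = tex2Dlod(_HeightMap, float4(v.texcoord.xy, 0, 0)).r; // w component = MIP level 0
        v.vertex.xyz += v.normal * height * _Displacement;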
     
  30. Molt

    Molt

    Joined:
    Sep 6, 2011
    Posts:
    103
    I've been working on a PBL shader on my Windows machine and have just moved it over to my Mac, but it's not letting me build an OpenGL version. Is there any way for me to find out more about why: am I using a non-OpenGL feature, is the shader too complex, or is it something else?
     
  31. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    1. Are you using Tessellation?
    2. Are you using a texture in vertex offset or outline width?
    3. Is OpenGL checked in the Shader Settings?
     
  32. Molt

    Molt

    Joined:
    Sep 6, 2011
    Posts:
    103
    I'm not using tessellation or an outline, and I'm trying to select the 'OpenGL' option in Shader Settings, but the option is greyed out. I've attached the shader as it stands at the moment; I really should have done that the first time!
     

    Attached Files:

  33. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Ah, it's because you're using the MIP input of a texture node.

    Currently you can't use texture MIPs on OpenGL, because it can't handle them. There is a way to solve it though, by adding #pragma glsl, which should make it work, but that removes the ability to view the instruction count.
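
    (For reference, a minimal hand-written example, not actual SF output, showing where #pragma glsl goes and the kind of vertex-stage tex2Dlod it is needed for on OpenGL.)

    Code (cg):
        Shader "Examples/PragmaGlslSketch" {
            Properties {
                _MainTex ("Texture", 2D) = "white" {}
            }
            SubShader {
                Pass {
                    CGPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                    #pragma glsl   // compile via GLSL translation so tex2Dlod works on OpenGL
                    #include "UnityCG.cginc"

                    sampler2D _MainTex;

                    struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

                    v2f vert (appdata_base v) {
                        v2f o;
                        float h = tex2Dlod(_MainTex, float4(v.texcoord.xy, 0, 0)).r; // explicit MIP 0
                        v.vertex.xyz += v.normal * h * 0.1;
                        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                        o.uv = v.texcoord.xy;
                        return o;
                    }

                    fixed4 frag (v2f i) : SV_Target {
                        return tex2D(_MainTex, i.uv);
                    }
                    ENDCG
                }
            }
        }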

    That said, I'll prioritize solving this for Beta 0.20 :)
    Beta 0.19 is uploading in about 30 minutes!
     
  34. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Shader Forge Beta 0.19 now released:
    • Added node: Remap Range
    • You can now specify offset factor and offset units
    • Lightmapping and lightprobes are now mutually exclusive
    • Added credits button in the main menu
    • Fixed a bug where you got a compile error when EditorLabel.cs was present when making a build
    • Fixed a bug where the dot product node didn't generate its node preview properly
    • Fixed a bug where using various characters in property names would break the shader
    • The instruction counter now shows the current render platform (OpenGL / Direct3D 9 / etc.)
    • A bug where shaders didn't save when using the built-in Perforce version control may have been fixed now

    Click to see all changelogs

    (Don't forget to delete the old Shader Forge before updating to this one!)
     
    Last edited: Jan 17, 2014
  35. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Your link to the remap range node is brokedy ;)

    Hey, what's your policy on us being able to sell shaders made with SF, particularly if they keep the metadata so people can reopen them, tie into them, and build upon SF's usefulness?
     
  36. Becoming

    Becoming

    Joined:
    May 19, 2013
    Posts:
    781
    Awesome, just what I need, I hope that will be in soon :D

    I need some help with doing a billboard shader; how is it possible to do that? Also, would that still be possible on a combined mesh? Maybe with some vertex-paint trickery or so? I mean having individual quads face the camera even when the mesh is combined...
     
  37. Becoming

    Becoming

    Joined:
    May 19, 2013
    Posts:
    781
    Yeah, I would prefer to be able to sell shaders made with SF too. Of course it doesn't make much sense to sell simple things, but for complex stuff it would be good... I also think it would be better for SF: if we just give away our shaders for free, people could just grab them, and many won't bother to buy Shader Forge and will use the community shaders instead... just a quick thought though.

    Also, I'm thinking of releasing an environment pack in which I would like to include some shaders I made with Shader Forge; the main value would be in the models and textures, but the quality and possibilities would be greater with special shaders.
     
  38. sandboxgod

    sandboxgod

    Joined:
    Sep 27, 2013
    Posts:
    366
    Hm, so looking at the Shader Forge ad on the Unity Asset Store, I can easily make outlines for my character? You know what's funny, I just recalled I already have a rim shader I bought off the store... But it would be cool if SF had tutorials that walked me through making one.
     
  39. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Plug in a fresnel node into emission and you're done :)
    Multiply it by a color if you want to tweak it
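
    (In raw Cg that node setup is essentially the classic rim term below; _RimColor and _RimPower are placeholder properties.)

    Code (cg):
        // Fresnel-style rim plugged into emission.
        float3 viewDir  = normalize(_WorldSpaceCameraPos - i.posWorld.xyz);
        float  rim      = pow(1.0 - saturate(dot(normalize(i.normalDir), viewDir)), _RimPower);
        float3 emission = rim * _RimColor.rgb; // multiply by a color to tint/tweak the outline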
     
  40. sandboxgod

    sandboxgod

    Joined:
    Sep 27, 2013
    Posts:
    366
  41. Eyeofgod

    Eyeofgod

    Joined:
    Jun 25, 2010
    Posts:
    126
    Hi Acegikmo,

    First of all let me tell you that you are doing a SUPERB job! Congratz.

    Just a question: how can we access UNITY_MATRIX_IT_MV or any of the other transformations defined in the built-in variables?

    Thanks
     
  42. sandboxgod

    sandboxgod

    Joined:
    Sep 27, 2013
    Posts:
    366
    I personally plan to just look at UDK tutorials, since Shader Forge looks like it can easily handle the average shader they make. The UDK community writes lots of shaders that they share freely.
     
  43. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    Thanks!
    You should be able to access what you need with the Transform node :)
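
    (If you ever need the raw matrices in hand-written Cg, the built-ins can be used directly; inside SF the Transform node covers the common cases.)

    Code (cg):
        // Object-space direction to view space via the inverse-transpose ModelView matrix.
        float3 viewNormal = normalize(mul((float3x3)UNITY_MATRIX_IT_MV, v.normal));
        // Object-space position to view space via the plain ModelView matrix.
        float3 viewPos    = mul(UNITY_MATRIX_MV, v.vertex).xyz;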
     
  44. Doddler

    Doddler

    Joined:
    Jul 12, 2011
    Posts:
    269
    Incredible asset, it works very well. I made a set of shaders for my game with relative ease. Nothing fancy, but very easy to do!



    Here's a shader I did to transition a texture from opaque to completely transparent using a grayscale mask. It's a little messy, since I needed to remap a range from 0->1 to (input value)->1. The Remap Range node is brilliant, but sadly, since it only accepts constants for the range, it was a poor fit here. I'm certain this isn't the optimal way to write this shader, but I'm happy with the result. :)
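
    (For anyone wanting the same thing: remapping a 0-1 mask into the range (input value)-1 is a single lerp, so it can be built from a Multiply and an Add; _MinValue stands in for the input value.)

    Code (cg):
        // Remap mask from [0, 1] to [_MinValue, 1]; equivalent to lerp(_MinValue, 1, mask).
        float remapped = mask * (1.0 - _MinValue) + _MinValue;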
     
    Last edited: Jan 17, 2014
  45. Cryunreal

    Cryunreal

    Joined:
    Sep 1, 2013
    Posts:
    9
  46. Acegikmo

    Acegikmo

    Joined:
    Jun 23, 2011
    Posts:
    1,294
    You can actually do that in SF, at least when using Windows :)

    Plug in a slider into the MIP input of a cubemap, and then make it go from 0 to 7, and drag!
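
    (The equivalent raw Cg is a texCUBElod call with the slider driving the LOD; _Cube and _Blur are placeholder names, and the factor of 7 assumes a cubemap with 8 MIP levels.)

    Code (cg):
        // Blurry reflection sketch: slide _Blur from 0 (sharp) to 1 (fully blurred).
        float3 reflDir = reflect(-viewDir, normalDir);
        float4 refl    = texCUBElod(_Cube, float4(reflDir, _Blur * 7.0));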
     
  47. Cryunreal

    Cryunreal

    Joined:
    Sep 1, 2013
    Posts:
    9
  48. DCrosby

    DCrosby

    Joined:
    Jan 13, 2013
    Posts:
    86
    Having used other shader editors in UE3/UE4 and Crytek's tools, I'm impressed; this finally makes Unity look like it can compete for more than just mobile development.
    One of the key issues I've seen in UE and Crytek's tools is that it's not easy to save out presets to, say, XML or some other format to post shader graphs in a forum like this, while shader code is easily copied, pasted, and then adjusted. If this is to take off, it would be nice to be able to "post" basic shader setups or tricks that could then be pasted into Shader Forge. Sort of like Nuke: you can copy pieces of a composite from Nuke into an e-mail, send it to a co-worker or friend, and they can paste it back into Nuke, connect it to the various inputs (diffuse, spec, etc.) and get the "look" of the posting.

    Being able to share things easily, without having to post a screenshot that may hide various parameters or having to explain what's going on to replicate part of a tree, will make this product not just artist-friendly but essential for taking advantage of the cool shader examples available online (at some point).
     
  49. Marco-Sperling

    Marco-Sperling

    Joined:
    Mar 5, 2012
    Posts:
    620
  50. Seith

    Seith

    Joined:
    Nov 3, 2012
    Posts:
    755
    I agree that the new Remap Range is really useful. But as you said, the problem is when you need to make the node's settings modifiable from the inspector. I don't think there's currently a way to do that...