Blended Transparency Shader

Discussion in 'Shaders' started by RyFridge, May 29, 2015.

  1. RyFridge

    RyFridge

    Joined:
    Apr 7, 2014
    Posts:
    52
    Hey there!

For the life of me, I cannot figure out how to solve this presumably easy riddle:

[Image: transparency_blending.jpg]

    For a game I need to merge a lot of transparent objects (quads with transparent color as well as quads with alpha textures) into a single image to then overlay them on top of the rest of the game (e.g. red quad).

These white quads are transparent, but their alpha adds up where they overlap. I need them to render as one piece.

So far I tried a RenderTexture (didn't manage to get it working perfectly) and Camera.RenderWithShader (which only swaps shaders on objects, so I guess it won't help me?).

    I hope I posted in the right section. Every bit of help is very much appreciated!

    Ry
     
  2. AlexBM

    AlexBM

    Joined:
    Mar 26, 2015
    Posts:
    16
You can try enabling depth testing and writing for these squares, but make sure they are still rendered after everything else in the scene and on top of the objects you need to blend over.
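A minimal sketch of such a shader in legacy ShaderLab (the shader name and queue offset are my own placeholders, not from this thread). With ZWrite on, the first overlapping quad fills the depth buffer, so later quads at the same depth or behind it fail the depth test and don't blend a second time:

```
// Hedged sketch: transparency with depth writing enabled, so
// overlapping quads only blend once against whatever is behind them.
Shader "Custom/TransparentOnce" {
    Properties {
        _Color ("Color", Color) = (1, 1, 1, 0.5)
    }
    SubShader {
        // Draw after the regular transparent queue so the quads
        // land on top of the rest of the scene.
        Tags { "Queue" = "Transparent+100" "RenderType" = "Transparent" }
        ZWrite On                       // write depth: overlapping quads reject each other
        ZTest LEqual
        Blend SrcAlpha OneMinusSrcAlpha
        Pass {
            Color [_Color]              // fixed-function tint, enough for plain quads
        }
    }
}
```

Note this only gives a clean result when all the quads use the same color and alpha; with different colors, which quad "wins" in an overlap depends on draw order.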

Also, if the effect is in a 2D game and needs to be on top of everything, you can render those objects with a separate camera into a RenderTexture using a solid-color shader, and then blend that texture with the rest of the scene in a post-processing pass.
     
    Last edited: May 29, 2015
  3. RyFridge

    RyFridge

    Joined:
    Apr 7, 2014
    Posts:
    52
    Hi Alex, thanks for the suggestions.
    What do you mean by enabling depth testing?

    I tried that and it worked, but I had 2 problems:
    1. My RenderTexture was always square (not its resolution but the capture region). Could I change this?
    2. I guess the performance would not be good if I had like 5 layers of live rendered RenderTextures on a mobile phone, right?

    Thanks so far!
     
  4. AlexBM

    AlexBM

    Joined:
    Mar 26, 2015
    Posts:
    16
Regarding depth testing, I suggest reading some articles to get a better understanding (google "depth test opengl" or something like that). Depth behavior for objects in Unity is set via flags in shaders.
Check this link: http://docs.unity3d.com/Manual/SL-CullAndDepth.html
By default, Unity's transparent shaders have depth writing disabled (ZWrite Off), so you might need to write your own.

Regarding the post-processing approach: there shouldn't be more than one layer. You just render all your objects (the transparent white squares in your example) into one render texture using a solid-color shader, then pass that texture to a post-processing shader and blend it with the rest of the scene to achieve the transparency effect. It takes some computation power, but there shouldn't be a problem even on mid-range devices. And since you no longer render big transparent quads on top of everything (which is quite heavy on mobile platforms due to overdraw), you might even benefit from this approach. Please note, you don't need the shader replacement feature; just make a separate camera that renders only the transparent objects via its layer settings.
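A rough sketch of that setup in C# (Unity 5 era; the layer name, field names, and the `_OverlayTex` property are illustrative placeholders, and the blend material is assumed to hold your own blending shader):

```
// Hedged sketch: a second camera renders only the "Overlay" layer into a
// RenderTexture, which a post-processing pass then blends over the main image.
using UnityEngine;

[RequireComponent(typeof(Camera))]
public class OverlayBlend : MonoBehaviour {
    public Camera overlayCamera;     // renders only the overlay layer
    public Material blendMaterial;   // your blend shader, assigned in the Inspector
    RenderTexture overlayRT;

    void Start() {
        overlayRT = new RenderTexture(Screen.width, Screen.height, 16);
        overlayCamera.cullingMask = LayerMask.GetMask("Overlay");
        overlayCamera.clearFlags = CameraClearFlags.SolidColor;
        overlayCamera.backgroundColor = Color.clear;   // alpha 0 where nothing drew
        overlayCamera.targetTexture = overlayRT;
    }

    // Called on the main camera after it has rendered the scene.
    void OnRenderImage(RenderTexture src, RenderTexture dest) {
        blendMaterial.SetTexture("_OverlayTex", overlayRT);
        Graphics.Blit(src, dest, blendMaterial);   // blend shader combines the two
    }
}
```

Because the overlay texture holds a single flat color wherever any quad drew, the overlap problem disappears: the blend happens exactly once per pixel in the post-processing pass.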

Regarding square RenderTextures: I suggest assigning the RenderTexture to the camera at runtime via script. It's easier because the camera is already set up internally with the proper viewport size based on the actual screen dimensions, so you'll get the proper aspect ratio no matter which dimensions you specify for the RenderTexture. You can also create the RenderTexture at runtime, passing Screen.width and Screen.height as its dimensions.
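The runtime-sizing part can be sketched in a few lines (the component and field names are my own, not from this thread):

```
// Hedged sketch: create the RenderTexture at runtime so it matches the
// actual screen dimensions, and therefore the screen's aspect ratio.
using UnityEngine;

public class RuntimeRT : MonoBehaviour {
    public Camera captureCamera;   // placeholder reference, assign in the Inspector

    void Start() {
        // 16-bit depth buffer; width/height follow the real screen size
        var rt = new RenderTexture(Screen.width, Screen.height, 16);
        captureCamera.targetTexture = rt;
    }
}
```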

However, to implement all of this you'll probably need to practice with shaders a bit more. Just check the Unity documentation on the matter and experiment :)
     
    Last edited: May 30, 2015