Search Unity

  1. Unity 6 Preview is now available. To find out what's new, have a look at our Unity 6 Preview blog post.
    Dismiss Notice

NEW - REFRACT 2D - Refractions, Reflections, Image-based Lighting, Image Distortions

Discussion in 'Assets and Asset Store' started by imaginaryhuman, Jul 9, 2013.

  1. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834


    Version 1.2 Now Available and ON SALE,
    50% OFF!

    Refract 2D

    Realtime rendering of refractive, reflective and lit surfaces
    Refractions, Relections, Image-Based 2D Lighting, Image Distortions, Lenses and more!


    -> Buy Now - PRICE REDUCED TO $25 at the Unity Asset Store! <-

    Read the Free Manual

    Features

    * Dynamically adjustable rendering and animation of distorted/bumpy surfaces
    * Not the same as Normal Mapping - takes advantage of a fixed camera viewpoint
    * Realtime combined-surface animation and per-pixel lens control, better than geometry-based distortions
    * Up to 3 textures and 3 distortions combine and animate in realtime with 1 draw call/2 triangles
    * Add to your game a wide variety of awesome animated effects, including (but not limited to) Glass, Bumps, Lenses, Fire, Image distortions, Water, Heat, Ripples, Warps, Shockwaves, Refractive and Reflective text/logos, Plasmas, Ice, Transitions, Environment mapping, Image-based lighting/shadows, and more!
    * Excellent for 2D games and works in 3D too (with some limitations)
    * Use multiple distortion effects in your scene simultaneously - fire, water, heat, etc
    * Animatable transitions between refraction and refraction using the same Distortion Maps
    * Control refraction/reflection on a per-texture and per-Distortion-Map basis
    * Distortion Maps can be used for a variety of purposes and can be hugely over-powered
    * Animate and modify multi-layer image-warping effects in realtime
    * Over 40 highly-optimized, mobile and desktop friendly shaders - Shader Model 2 where possible, also model 3 but only when necessary (e.g. lots of layers)
    * Works in all versions of Unity - Free, Pro, mobile, etc... Unity 3.5 or later
    * Height-Map conversion tool included for creating Distortion Maps in the Unity Editor
    * Over 30-page detailed manual with color images, free to READ RIGHT HERE!
    * For example, refract a background, reflect a foreground, and apply image-based light and shadow in one draw call

    Enjoy no less than 12 hand-crafted, fully animated example scenes (see below to view live) demonstrating various uses and techniques

    Easy Workflow


    Refract 2D features a simple and flexible workflow. The typical workflow is as follows:

    1) Create one or more Height Map textures/background
    2) Process the Height Map textures with the Height Map Tool to produce Distortion Maps
    3) Create a material with one of Refract 2D’s shaders chosen
    4) Assign the Distortion Map texture(s) to the material
    5) Create background, foreground, mid-ground and/or light-map textures
    6) Assign those textures to the material
    7) Adjust the material’s settings to get the effect you want with realtime WYSIWYG feedback



    -> Buy Now - PRICE REDUCED TO $25 at the Unity Asset Store! <-

    Version 1.2 Now Available and ON SALE, 50% OFF!
     

    Attached Files:

    Last edited: Aug 23, 2013
  2. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Challenge your graphics card: 12 LIVE WEB PLAYERS!

    Actually, Refract2D is fast and efficient so these demos should run well on most systems. Simply click an image of interest to go to the web player. Remember to go `full screen` for best performance and effect. Try them at your highest resolution! Although these demos are all full-screen, imagine several effects running in various areas of your game environment at the same time, a little fire here, a waterfall there, etc.

    Fire

    $Screenshot1.jpg

    -> WEB PLAYER 1 - Realtime Animated Fire <-


    The Fire scene shows how it may be possible to animate realistic-looking flames using refractive lenses. The scene uses a single background - an existing image of a real-life flame - and four Distortion Maps. The Distortion Maps do all the work as they scroll up the screen at various speeds. As they move, their distorting qualities combine with each other to produce an overall distortion of the background. This dynamically distorts the image of the flame and results in somewhat realistic fire.


    Ice Flame


    $Screenshot2.jpg

    -> WEB PLAYER 2 - Realtime Animated Ice Flame <-



    The Ice Flame is a nice visual effect
    combining the idea of real-time fire animation with, well, some ice cubes. One of the Distortion Maps remains stationary, comprising the contours of some blocks of ice. With the blue coloring, this gives the flame something of an icy feel. A single background image of a hue-shifted real-life flame is used, which is then distorted in realtime from three combined Distortion Maps. The final result is cool, refreshing effect.


    Stained Glass


    $Screenshot3.jpg

    -> WEB PLAYER 3 - Realtime Animated Stained Glass <-


    The Stained Glass demo portrays a beautiful stained-glass window. The same original image of a stained-glass window was used to generate the actual Distortion Map as to color it. This was then blurred. The Distortion Map was then coupled with the original colored image as a `mid ground` texture. The color image is not distorted by the Distortion Map, but a background Light Map and a background Environment Map are distorted, to give the impression of movement behind the scene. As the window scrolls and the background moves, color from the mid-ground texture colorizes the light coming through the window, showing off the smoothly contoured stained glass.


    Liquid Metal

    The Liquid Metal scene demonstrates realtime bumpy liquid animation in a 3D environment. Although Refract 2D is technically a `flat` system of distortion that doesn’t take perspective or rotation into account, it can still be used quite effectively in various 3D scenarios. Here, dual Distortion Maps move and combine to create a dynamic liquid surface, colored by a background texture and lit by a `light map` texture. Can you tell it’s not perfectly 3D?


     
    Last edited: Aug 22, 2013
  3. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Plasma


    $Screenshot5.jpg

    -> WEB PLAYER 5 - Realtime Refracted Plasma Effect <-


    The Plasma scene puts Distortion Maps to artistic use, combining 3 background textures and 3 Distortion Maps with an abnormally high degree of Refraction. This produces a beautifully artistic plasma effect. To an observer, you probably can’t tell that this is based on lens refraction. The high degree of Refraction pulls pixel colors from a wide range across and outside of the texture bounds, compacting the colored bands into narrow undulating rings of color. Since this is a very input-heavy shader, Shader Model 3 is used, plus the Refraction value of the second background has to be shared with the third, hence there are only 2 Refraction controls.


    Heat Wave


    $Screenshot6.jpg

    -> WEB PLAYER 6 - Realtime Heat Wave <-



    (This effect is subtle, please view the web player) Experience the heat of the desert in this fun heat-wave simulator. Watch those camel-riders wiggle as the shimmering, scorching heat rises up. Is it starting to feel hot in here? The effect is fairly simple - sometimes subtle distortions are useful. A single colored background is distorted by two simple Distortion Maps. With a
    very low degree of Refraction, the result is a slight wiggle, just enough to give the impression of heat distortion on a hot day.


    Frosted Glass


    $Screenshot7.jpg

    -> WEB PLAYER 7 - Realtime Frosted Glass <-



    Sometimes you want highly detailed refractions or reflections, perhaps representing an organic earthy environment, a metal or wood surface, or perhaps a frosted window. The Frosted Glass demo shows the kind of Distortion Map that can be produced when the Smoothing functionality is switched off in the Height Map Tool. By providing a value of 0 for X Smoothing and Y Smoothing, and starting out with a crisp in-focus texture, tiny refractive or reflective details emerge to create that high-detail or frosted look. This demo transitions between colorless glass,which frosts the background, to colored glass, which combines frosting with a glass tint.


    ​Distortion
    $Screenshot8.jpg

    -> WEB PLAYER 8 - Realtime Animated Image Distortions <-



    Okay, so all of the scenes show distortion, but the Distortion scene especially shows how you can transition between different Distortion Maps, transitioning between refractive and reflective effects in realtime. Four Distortion Maps are used, yet at times any of those Maps may be active to varying degrees. The Power value of each is animated to transition from refractive strength to reflective strength, and at times returns to a Power of 0 to disable effects from a given Distortion Map. Sometimes more than one, or up to four Distortion Maps are in effect. You can see how `holding` distortion from one Distortion Map while adding-in distortion from another, dynamically modifies the distortion effect. The distortion itself is distorted!

     
    Last edited: Jul 9, 2013
  4. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Aquarium


    Watch the fish swim in the spherical aquarium. How they got inside, nobody knows. This demo shows an exaggerated Light Map. Refraction of the Light Map is increased three-fold t
    o increase the intensity of the distortion. This creates an increase in the contrast produced by the light shape. Notice in the lightmap that some areas are light and some are dark - the light areas obviously represent light while the dark areas represent shadow. The over-refracted Light Map produces deep bumps on the aquarium surface,which animate between refraction and reflection in realtime. A subtle environment map is also used.


    Text / Logo



    Sometimes when you’ve made a nice logo or show a text message, you want it to reflect an environment, perhaps react to a light map, or distort a background so as to appear like glass. Here, the Text demo shows a large rectangular Distortion Map used in conjunction with a square background texture. Offset and Scale are used to correctly size and animate the text across the colorful background. A high degree of Refraction is used to make the text appear to be highly refractive like a high-powered lens. The text was generated first in a graphics application, then a copy of the layer was made. The copy was blurred heavily and then multiplied by the original text to keep a hard outline edge. This resulted in a smooth interior but defined edges, and coupled with the high degree of Refraction results in beautifully smooth glass text. This is achieved with a single Distortion Map generated from the text Height Map, and a single colored background. It is possible with the right Height Map to give text any kind of contour, bevel, or surface shape.


    Water



    In the Water demo, special techniques are used to animate a 2D water surface, complete with underwater distortions, lighting, and an un-distorted sky texture, all within a single shader. Two Distortion Maps animate the water ripples and surface. Care has to be taken to avoid over-refractive effects since this would result in blobs of water appearing out of thin air. The water surface ripples and animates with an interesting effect, yet does not distort the sky. While this isn’t a complete simulation of interactive water, movement and positioning of distortions near the surface can produce pleasantly rippling water effects.


    The effect is achieved through the use of an alpha channel. The first background includes an alpha channel, with values of 1.0 (or 255) where the water/diver scene is to be shown, and a 0 where sky is to be shown. Provided this texture is used as the first background, and the texture is RGBA32 or ARGB32, the alpha channel can be used by the shader. The light map is distorted by the same two Distortion Maps and, provided the Refraction value is the same for both textures, the water/diver and the lighting will both refract in unison, creating a seamlessly lit water scene. The final touch entails the use of the light map as the third texture, including a strip of white pixels where the sky needs to appear. This is because the values from the light map will be multiplied with the scene - we want the sky to show its own colors so we allow it to be `multiplied by white` in order to retain its full color. The finishing touch is to move the clouds by individually scrolling the sky texture. With some care, realtime caustics effects could be produced on the underwater objects as well.


    3D


    In this 3D interactive scene, you can move around in first-person perspective and examine various Refract 2D effects in realtime. Position yourself at various angles to see how the effect works (or doesn't) in 3D. While Refract 2D is primarily a 2D system, based on flat surfaces, it does have some potential usefulness in 3D scenes, provided the effects meet your requirements in a given situation. Sometimes the effect can create the illusion of 3D, but in certain locations/proximity/angles the effect may be broken. Move around and see for yourself.

    In this demo, the entire ground is covered with a rippling water effect. This uses two Distortion Maps, one main texture and a light map texture. The light map moves across the surface highlighting or shading the bumps and pits of the water ripples. Since this is effectively using the Distortion Map as a Height Map and is texture-based, the quality of the ripples is extremely smooth, producing a high quality watery surface. You may notice however, that the Liquid Metal spheres to the right seem to intersect a completely flat, bump-less ground, because the water really doesn't have any height.

    The demo also features various spheres. Some show frosted glass, some show general animated refraction, and also present are the Aquarium and Liquid Metal spheres. You'll notice that sometimes from a reasonable distance, and given the spheres don't overlap the ground or sky, they seem believable as refracting the background behind them. Up-close the illusion may be broken - this could possibly be remedied using realtime RenderTextures which capture the actual background in screen-space and then warp and map this texture into the distortion shader. You'll also notice that some spheres (Aquarium, Liquid Metal) use light maps... this appears to show the placement or movement of a light source which may or may not seem realistic given the environment.

    Also present on the right-hand wall is a heat-wave effect... given the effect is flush to the wall it looks correct in 3D - ie the distortions of the surface below are sufficiently close to the surface that you can't easily tell it's not three-dimensional, except at acute angles. On the other hand, the same effect seems to form what looks like a solid separate wall off to the left... this should not really be a solid wall, it should be a column of heat that refracts the actual arena wall some distance away, and indeed should refract some of the fire effect. Indeed if you go behind the wall you should be able to view the rest of the scene, distorted through it, but this doesn't happen. Because the distortion is based on a `background texture`, which is just the image of a flat wall segment, it appears to bring the wall close to the surface instead of maintaining its 3D position. This shows how refraction does not always work in 3D (it maybe would with a RenderTexture method). Yet as you approach some of the spheres that are close to the walls, the refraction does seem valid - so it really depends on where and how you would use the effects.

    Finally on the near wall (behind the initial viewpoint) the Text demo scrolls refractive text across the wall. You can get up-close to zoom in or view from angles. You'll notice from more acute angles the text is actually flat and not as rounded as it looks straight-on.... this again is the side-effect of Refract 2D being a flat system. Again in some situations this kind of effect might be exactly what works for you, but in others where the player gets really close to the effect at an angle it may break the illusion of 3D. Note also that the system doesn't really understand drawing order other than normal opaque Z-buffering.

    Hopefully this 3D demo showcases some of the potentials and pitfalls of using Refract 2D in a 3D game. You could definitely use it for certain special effects, in futuristic environments perhaps, for water effects, to produce semi-realistic lighting, or effects that are close to surfaces, but the system isn't designed to `do everything in 3D` as you'd expect. With the use of extra cameras and RenderTextures it's possible to expand the functionality somewhat, for example a sphere may take a snapshot view toward the player and then use this as its background texture, seeming to reflect the player as they approach the sphere.

    Have fun experimenting to find the exact-right effect for your game or app.


    Version 1.2 is NOW AVAILABLE and ON SALE - 50% OFF!

    -> Buy Now - PRICE REDUCED TO $25 at the Unity Asset Store! <-

    Read the Free Manual

     
    Last edited: Aug 23, 2013
  5. RandAlThor

    RandAlThor

    Joined:
    Dec 2, 2007
    Posts:
    1,294
    Sorry to ask but i do not complete understan, is this only for 2D or can i use this in 3D too?

    It is looking great and i hope i can use it in 3D scenes too i.e. when a magician in a rpg do his magic attack :)
     
  6. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Yes, you can use it in 3D in the same way that you can use Normal Maps in 3D. A normal map is basically an image of a flat surface, recording the bumpiness. When you use a normal map on a 3D object, perhaps with perspective also, and a camera that moves in 3D, the surface still renders and looks bumpy. Yet it's not really truly 3D - as the camera begins to view a surface that isn't `straight-on`, like towards the curved sides of a 3D model, the representation of the bumps doesn't really change. You should technically see `the sides of` the bumps as you move the camera around - to a degree you do, and the reflection of light off the surface helps with the 3D illusion. But the 3D elite among us know it is not physically perfect. This is why clever folks came up with methods like Parallax Mapping or Displacement Mapping, which attempts to make the surface look how it `should` look when you are seeing it from a different angle - the bumps then look even more 3D.

    I'm telling you this because the Distortion Maps which Refract2D uses are similar to Normal Maps in this way - you can use them in 3D and get useful effects (see the Liquid Metal demo above for example), and to the untrained eye you may not notice the shape of the bumps doesn't have proper perspective, but for many people it will pass for a bumpy surface. My Distortion Maps also differ from normal maps in that they not only pre-calculate the reflection angle of light bouncing off/refracting through the surface, but they convert that data to 2D `offsets` which are not only quick to render with but you can also combine multiple distortion layers in realtime without much effort. Also I've built-in the ability to `over refract` or `over reflect` by changing the distorting power of the lenses - you can produce some pretty chunky-looking lenses, far removed from a relatively flat surface.

    One caveat of this technique is the way refractions and reflections should behave in 3D, versus how they will behave. A refracted object which is `behind` another object, from the point of view of the viewer, should be the source of refractions, but because Distortion Maps create their effect perpendicular to the surface (ie the surface is flat), they will distort or reflect textures that are perpendicular to the surface, which won't look right in 3D. So you won't see the right background getting refracted, or the right foreground getting reflected. You could work around this by rendering the actual background behind the refractive object into a RenderTexture, somewhat skewed, and then using that texture in the Refract2D shader. But I can't guarantee it's going to look correct. If you don't mind that the effect you get is not totally correct in 3D, you can still make some good use of it in 3D - see the Liquid Metal demo for example - 3D spheres which still look 3D with a reflective looking surface, it's just that the reflections aren't `real` or accurate - but it's not easy to tell, which is good.

    Also one other thing to bear in mind is this system uses its own lighting. It uses a texture which stores an `image of` light, ie a light map or image-based lighting. This light map could be a panorama or a scene or a sky or a cross-section of some light volumes or some glowing laser lines or point lights or whatever you want. So long as you can represent it in a texture it will work. Then the areas of the light map texture which are `light` will illuminate the surface, while darker areas will look more like `shadows`. See the Aquarium demo above for example - it shows deeply pitted surfaces produced by an over-refracted light map. Refract 2D is mainly a 2D system in this way, everything works on a flat plane, and you could render dynamic lights to the light map with a Render Texture, but it therefore will ignore any of Unity's `3D` lights or shadows.

    The only other main point to consider is you must give a `background` to the Refract 2D shader if you want it to do refraction, so that has to be either pre-generated or rendered to a RenderTexture beforehand. In Refract 2D, however, it just regards a Distortion Map as a colorless surface and you can use it for refraction or reflection as you wish, ie think of the color textures as backgrounds or foregrounds depending on whether you're refracting or reflecting that layer.

    For more information take a look at the `Distortion Maps in 3D Scenes` section of the manual.
     
    Last edited: Jul 9, 2013
  7. imtrobin

    imtrobin

    Joined:
    Nov 30, 2009
    Posts:
    1,548
    can u just make a 3d demo?
     
  8. Tapgames

    Tapgames

    Joined:
    Dec 1, 2009
    Posts:
    242
    This looks really awesome! Will pick this up later.

    Btw, is this mobile friendly?
     
  9. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Liquid Metal is a 3D demo... which wraps a distortion around a 3D sphere. It combines two layers of distortion maps in realtime and colors it with a colormap texture and a lightmap texture. It creates a liquid-like surface in 3D. It basically is doing reflections. The scene doesn't have any regular lights but as far as your eye is concerned it looks fairly believable that there is light `somewhere` in the scene. The spheres don't reflect each other, the only reflections are off an environment-map texture which is fixed (though could be a RenderTexture) and which wraps around the surface. It still looks reflective but the reflections are not physically accurate.

    Doing refraction in 3D is harder. Refract 2D assumes that the camera does not need to rotate around the X or Y axis. It can render a surface looking straight at it (like in 2D games). If you render your 3D environment to a RenderTexture for example and then you want to put something `flat` on top of it as an overlay, like a HUD or GUI elements, they will properly refract the pre-rendered background. In this case the background can be moving and have any camera angle. But the only time there are correct results is when the camera rendering the bumpy surface is looking straight at it. Once you wrap a refractive surface around a 3D model you're effectively also wrapping the camera around the model, so it won't be correct from the viewpoint of the main camera. Like I said you could possibly fake it by rendering the background objects to a RenderTexture and then passing that into R2D... but it's not really designed to work in 3D without some hacking. Doing real 3d refractions and reflections is a much much harder problem beyond the scope of what Refract 2D is best at. On the asset store for example there is the hard surface shader pack which has some refraction and reflection support in 3D, but I believe it's geometry based and doesn't have the animated surface capabilities of Refract 2D.
     
    Last edited: Jul 9, 2013
  10. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Thanks for the appreciation! Mobile friendly... yes. All of the shaders are hand-written to be as optimized as possible for mobile. It uses the absolute smallest/fastest data types and uses efficient algorithms. Also the Height Map Tool pre-calculates the angle of reflection/refraction off the surface and turns it into simple X/Y offset data in the Distortion Map - this saves a LOT of realtime calculations. This is made possible by assuming the camera is looking directly at the surface (which is why this is ideal in a 2D situation but not entirely correct in 3D). This also allows multiple Distortion Maps to be combined in realtime very efficiently without multiple layers of complex calculations. Let's say it's as fast as it can possibly be for mobile but depending on complexity the more layers you add the slower it will run on any platform.

    In my tests, with one Distortion Map and one texture, it ran about the same speed as the same scene using Normal Mapping, maybe a little faster. This is mostly because both techniques are reading from 2 textures, and it's the texture reads that govern most of the performance. If you need to add more distortion maps (up to 4 in some cases), or more texture layers (up to 4 in some cases) then obviously it's going to slow down no matter what platform you're on. I'd imagine you can use it fairly extensively on more recent tablets/phones provided you don't get too heavy with it. You should be able to use it at least as much as you would use normal mapping, depending on complexity. Also it depends on how much of the screen you're covering with distortions, if you just have like a waterfall or a fireplace or something covering only 20% of the screen then it's going to run much better than having the entire screen distorted.

    All of the shaders, where possible, are Shader Model 2. The only reason some of the more complex shaders are Shader Model 3 is because Model 2 doesn't support enough inputs or variables. This mainly applies to when you're using 3 Distortion Maps or 3 textures. I'd say overall it's very good for 2D on mobile where you want local effects that don't cover the entire screen. .. but you might be surprised just how much you can do.
     
  11. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    As above... but I forgot to address your question about magic effects. If you can get away with making your magic effect be drawn on a `billboard` facing the camera, and you can draw the background behind it first to a RenderTexture, then for sure you can do whatever you want in 3D. Most particle systems for example use camera-facing billboards so that whichever way you're facing you will see the full surface of the particle instead of a flat edge. If that works for you, then you could draw refractive billboards that warp the background to provide `magic` special effects. One thing to bear in mind is layering - since refraction requires an existing background, you'd have to draw the effects in sorted order from back to front, each time rendering to a RenderTexture, in order to layer them. Alternatively you can try some blend modes .. there some tips at the end of the user manual about how to use blending.
     
  12. imtrobin

    imtrobin

    Joined:
    Nov 30, 2009
    Posts:
    1,548
    You know we don't really do this because unity should have this done internally.
     
  13. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    I don't entirely follow what you're saying... are you referring to RenderTextures? .. within the OpenGL/DX API there is a distinction between a RenderTexture (target) and the backbuffer... the system only ever displays on-screen the contents of the backbuffer, which is somewhat a legacy thing going back many years. When render-to-texture came along it was added as an extra feature, but you still have to copy from the RenderTexture to the backbuffer in order to see anything. It would be pretty cool if Unity were changed to always render everything to a RenderTexture the size of the screen, if it has the same performance, and if that can then be actually viewed/flipped by the graphics card... but I think there are API issues in the way. In some situations though this would give a tremendous speed boost.

    I'm not sure if that's what you were referring to.
     
  14. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Has anyone experienced `all pink` in the example scenes instead of proper materials? So far I had one report of this but am having trouble replicating it.
     
  15. nixter

    nixter

    Joined:
    Mar 17, 2012
    Posts:
    320
    You changed your avatar, imaginaryhuman. Didn't recognize you. :)

    They look fine to me in the web player (Win 7, Firefox, Unity Plugin 4.0.1f2).
     
  16. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    lol yes, needs a refresh once in a while. Thanks for testing the webplayers. They seem to be fine. The issue is Unity editor 4.1.5... Refract2D seems to be totally fine in 3.5.0, 3.5.7 and 4.0, but in 4.1.5 the example scenes get messed up. Still investigating a possible fix.

    [edit].. I'm homing in on the `pink` issue.. seems that on 4.1.5 editor, compilation of the shaders by Unity is running into an issue when trying to make a shader for GLSL, which stops the compilation and breaks the material. I don't know yet if it's a Unity bug but I'm looking at how to maybe recode the shaders a bit to get around it.[/edit]

    [edit2]Figured out a workaround, will update the shaders and resubmit to the store[/edit]
     
    Last edited: Jul 10, 2013
  17. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Version 1.1 has been submitted to the asset store. All shader errors are fixed and all scenes now work in the editor - this only applied to Unity 4.1.5 for some reason. Within a few days the new version should be available for download.
     
  18. Play_Edu

    Play_Edu

    Joined:
    Jun 10, 2012
    Posts:
    722
    nice work
     
  19. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Thanks, I'm pretty happy with how it turned out.
     
  20. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
  21. JohnnyA

    JohnnyA

    Joined:
    Apr 9, 2010
    Posts:
    5,050
    The demos look nice but it would be handy to see some kind of interaction between the shaders and a 3D world controllable by the user. Without being able to move around its really hard to tell whats happening or how useful the shaders would be for something more than overlay over the whole scene texture.

    For example set up the frosted glass with two windows on to a 3D scene, and allow the users to move about to view the scene from different angles through the normal and frosted window. Similarly with the heat wave you could add the heat wave effect above a patch of tarmac and allow the user to move around the desert and view the tarmac from different angles.

    I understand that the shaders are 2D but its unclear how difficult it would be to use them for more than a whole of scene effect. Obviously some work has to go in to it, but if its too much work for you as the creator with your depth understanding then its likely too much for most of users. End result being people aren't sure and don't buy, or people do buy and then are disappointed.

    EDIT: Just read through some of your manual and thats definitely a great discussion of the issues. Really well done! That said I think its still going to be helpful to sales and user satisfaction if you include some more complex sample setups.
     
    Last edited: Jul 13, 2013
  22. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Hi Johnny.. yes thank you for your insights and suggestions. I agree that some interactive scenes would be useful and possibly more interesting. I'd also like to show an example of a game environment and how several effects might be used simultaneously. It would possibly be useful to do a 3D interactive scene where you can see what possible uses there might be, versus some uses that don't work well. I can tell you the frosted 3D glass will not look right because it won't refract the right background area. It's also difficult right now for me to use RenderTextures because I don't have Unity Pro (working on it ;) so I can't easily show methods that might work to make more things possible in 3D. I do have some more new features planned as well which I'd like to include in the next version, so I'll definitely take your advice and work with it.
     
  23. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    I'm posting the section from the manual about 3D here for easy reference while I work on a new example scene.

    5.2 Distortion Maps in 3D Scenes

    Even though the pre-computed camera is fixed and does not take perspective or rotations into account, Distortion Maps can still be used in 3D scenes provided you can accept that the distortion applies against a `flat environment` parallel to the surface of the 3D object. In other words, in 3D, the Distortion Map simply shrink-wraps onto the surface of the object and acts as though the `camera plane` similarly shrink-wraps to match. Applying a Distortion Map shader to a 3D surface will possibly look okay, given that the resulting effect will wrap around the object. However, the effect will be treated as though it were actually `flat` and not bumpy. Perspective and rotation will still be applied to the shader output, which to a degree will make it look 3D. But the individual bumps will not be technically placed or lit correctly given the angle to the real camera. The surface may still look bumpy, however, and it’s possible that there isn’t enough obvious visual error for a viewer to realize the inaccuracies, especially if the surface distortion is animated. A certain amount of natural `compression` occurs at the sharperangled edges of 3D objects, for example at the sides of the sphere below, creating the impression that the effect is truly 3D.

    $Screenshot4.jpg

    Unlike parallax mapping and other methods which give surface features a genuine 3Dperspective viewpoint, Distortion Maps act more like typical normal maps. The difference between them and normal maps, however, is that normal maps can deal with moving light sources from any angle, whereas Distortion Maps model light based on image data and/or light maps. Also Distortion Maps may refract/reflect a significantly wider range of texture data, resulting in bigger bumps. Similar to normal maps, however, Distortion Maps viewed in 3D will only render inside the silhouette of the geometry, so will not create a bumpy profile.

    Another thing to bear in mind is the direction the viewer is looking at the world, and how this influences what would be refracted or reflected. Remember that Refract 2D is essentially a `flat` bump-mapping system and uses a fixed camera view. Therefore for example, if you were looking at a wall segment of a corridor in perspective, a refractive surface should refract light from objects `behind it` in terms of what `behind` means from the viewer’s vantage point, but will instead show refractions perpendicular to the
    surface of the object, off to the side. Some clever use of Render Textures to dynamically capture the background that are actually `behind` the surface could possible alleviate this issue. The case is similar for reflections. Refract 2D is good for `faking` local reflections and refractions in 3D, but not so good for true 3D effects.

    Bear in mind also that a Distortion Map is a texture with a fixed resolution. This may provide perfect pixel-to-texel mapping in 2D, but in a 3D-perspective environment it is possible to move closer to a surface and not experience increased detail. That said, and especially if your textures use bilinear or trilinear filtering, Distortion Map data will be accurately interpolated to still provide high-resolution smooth results even when close to an object. You may be surprised how low-resolution your Distortion Maps can actually be and still maintain high quality in the final output - this is partly due to how a bumpy surface re-samples the texture in realtime, effectively increasing the resolution of the surface.
     
    Last edited: Jul 14, 2013
  24. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    The Distortion Maps in Refract 2D work similarly to Normal Maps, except for a few important differences. Firstly, the camera's position is assumed to be directly facing the surface, and data is then hardwired to work only with this camera view. Secondly, steps taken to bounce/reflect light off/through the surface are performed offline, or `baked`, instead of being performed in realtime in the shader. This removes some burden from the run-time processing and yet still allows the Distortion Maps to be combined and modified. Finally, since the camera is fixed and is assumed to be parallel to the surface, the screen-space location where the light ray hits the screen is hardcoded into an X,Y offset. This offset is used to modify texture coordinates at runtime to distort the textures. Refract 2D can use any Distortion Map for either purpose, to reflect an environment map, or light map, or refract and background.

    $DistortionMapping.jpg

    Distortion Maps can also transition between two different uses - for refraction or for reflection. It all depends on the Refraction adjustment of each texture, or the Power adjustment of each Distortion Map. The effect of Refraction is that the texture is sampled at a higher resolution, leading to a magnification (like a magnifying glass), whereas with reflection the texture is sampled at a lower resolution leading to it seeming to shrink.

    $RefractionVsReflection.jpg
     
    Last edited: Jul 14, 2013
  25. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    I've built a simple little interactive 3D scene for those 3D-curious among you to try out... I need to add it to the Refract2D package as a new example scene, update the documentation and build and deploy the webplayer, which I'll try to do this evening. I'll give a more detailed run-down of what's being demonstrated and how it shows the possibilities/limitations shortly.
     
    Last edited: Jul 15, 2013
  26. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Okay - here is the new 3D demo scene showing some possible uses of Refract2D in a 3D environment, plus some limitations.

    3D


    What you can see here is that, depending on your needs, sometimes Refract2D works well in 3D, and in other situations it doesn't. For example, I personally think the green water effect covering the ground looks really nice - sure you can't go under the water, it doesn't really have depth, it's not real `3D`, and the lighting may not be technically correct, but I think it looks really good. If this fits your 3D game in certain situations, then that's great. The water also shows that you can apply lighting with a light map and make it look reasonably realistic.

    The spheres demonstrate distortion of a wall texture, frosted glass, the aquarium demo and the liquid metal demo. You'll notice that the balls situated near the corners of the arena close to the walls are actually passable as refracting the wall behind them - this illusion seems to make sense so long as you don't get too close - once the balls start to overlap the sky or the water without refracting them, the illusion is broken. Ideally the background behind the spheres should be captured to a RenderTexture, in screen-space, and then have that grab be warped to fit the world-space of the object, but in this demo I haven't tried such measures.

    Also on the right wall you'll notice a heat distortion, it looks reasonable given it's flush against the wall because the flatness of its surface corresponds to the flat surface behind it. But to the left is another instance of the heat distortion, standing out as what looks like a separate wall - really this just demonstrates that the distortion is flat and warps the immediate background texture, which here is distant from the real wall of the arena, so instead you see what looks like a solid wall with heat on it instead of a refraction of the arena walls behind it.

    The text demo scrolls across the near wall, and looks nice from a distance, but up-close or at acute angles you realize it is totally flat, even though it looks 3D when standing back. Again if this works for you in your situation that's great, but if your player is going to get up-close and personal it may break the illusion.

    The liquid metal demo (the set of moving balls) features a moving light source, as a light map, but without this corresponding to some other light source in the scene it doesn't totally make sense. Indeed the balls turn dark at some points and this doesn't seem to match the scene lighting. In some situations this would be fine, and you could fudge it to make it seem as though the objects are lit in a way that they would be lit if reflecting a real 3D light source, but here the illusion is a little broken. Again in certain circumstances this would be acceptable, but not in others - it totally depends on your environment, when and where you use the effect, and how you play to the effect's strengths and hide its weaknesses.

    With some careful choices, there are many many things you could possible use Refract2D for in a 3D game. For example it might be appropriate to simply distort a 2D image for some reason. Or maybe you want a realtime fire effect but it's not something the player can get up close to. Or maybe you have Unity Pro and can play around with RenderTextures to create dynamic distortion maps or perspective/rotation-corrected refractions, etc.

    Hopefully this demo shows you how Refract2D will behave in a 3D game - it has its uses, and some of them are quite fascinating and cool, but it also has its limits.



     
    Last edited: Jul 15, 2013
  27. fgielow

    fgielow

    Joined:
    Jan 3, 2012
    Posts:
    122
    Hello!

    I am searching for a lighting/shading solution.

    I currently have a prototype of what I need here: http://jotun-dev.tumblr.com/post/55114377905/textured-illumination-system-for-the-line-of
    Basically, a shadowing system which covers what the player should not see, based on his Line of Sight.
    This shows very much what I need regarding LoS: http://www.redblobgames.com/articles/visibility/

    However, the hard part is that I want the sprites to also become shadowed as the pixels in them get closer to the area which the player cannot see.
    As in the prototype I have sent, but my prototype is heavy, uses many raycasts, and cannot handle multiple lightning sources right now.

    Do you think Refract2d could help me somehow? Are there examples of lighting/shading in 2d environments?


    Thanks very much!
     
  28. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Hi there.

    Yes, you can do this with Refract 2D.

    When you say `sprite`, are you talking about the wall segments and possibly other enemy sprites? ie what I see in your demo right now is that as the player light-source approaches a wall, some of the light spills onto the surface of the wall ie bleeds into the wall, as if suggesting the camera-facing surface of the wall is not flat but rather has some bumpiness or roundedness to it, ie it can reflect some of the light coming at it from the sides/player? So then if you had moving sprites as well, a sprite wouldn't be just all-lit or all-shadowed, it would have a bumpy surface like a height map coming out of the screen, and at the edge of the sprite it would reflect some light as lit from the side, whereas toward the center of the sprite where it doesn't face the sideways light source so much, it would not get that light, right? And as the light source gets closer to the sprite or wall, the angle between them will adjust somewhat so that a nearer light source `shines down on` more of the surface than when the light source is further way. Am I correct? And you're trying to replace this effect from a ray-cast system to some other faster method that Refract 2D might provide?

    What Refract 2D can do for you, is let you draw your wall pieces using Distortion Map tiles. Instead of just drawing a solid wall sprite texture for example, you draw a sprite using a Refract 2D shader which includes a) the wall tile image itself, and b) a small equally-sized/proportion distortion map representing the shape of the camera-facing surface of the wall - ie the front of the wall and the visible part of its `sides`. If you make those distortion maps represent a curved surface - it could be bumpy like rock or smooth or whatever, or at least that they curve off toward the edges, then this would represent a piece of wall that isn't just totally flat on the front side. So then when light comes at the wall strictly from the side only (like you have now), at least some of the pixels near the edge should reflect light into the camera because they are at an angle rather than flat. In other words yes this would give the wall surface a bump mapping effect. And as your light source moves closer to the wall or sprite object it will light up more of the sprite/wall due to the angle.

    You could do this in Refract 2D by having your light source rendered to a texture. Refract 2D requires and needs `the background` to be fed in via a texture. For you this either means pre-rendering that texture or generating it in realtime. To do that would probably mean rendering to a RenderTexture - I don't know if you have Unity Pro but you'd need that to make it work, at least for multiple moving lights - for fixed lights or a single light you can pre-render. Then as you scroll through the environment, you first render the `light` to a texture. Then you render the tiles using distortion maps + texture color. Your `light map` texture, for example, would contain a large sphere of light fading off the further you get from the center, like you have now centered on the player. In fact if your scrolling is always centered on/attached to the player then you only need to store this in a pre-rendered texture and not generate it at runtime. This light would then illuminate your wall element and enemy sprites and bumpmap them properly - ie they will reflect the light map based on their surface shape. If you prefer then you can render the `visible` area inside the walls, and not including the walls, using your own method, and just use Refract 2D for the walls and sprites.

    Fixed walls are easy because they dont have a moving background behind them, but for moving sprites that move over a fixed background, that background has to be fed into the shader. This may mean you have to use a RenderTexture to first render your visible area background, excluding or including walls, and then feed this texture into the shader. Using UV coordinates in your geometry, which you'd have to update every frame as your sprite moves, you can pull up the appropriate piece of background texture and then put your distortion map on it + your sprite image. You can use the same background texture for all on-screen sprites, they just need their own dynamic UV coords and appropriate adjustment of the background position/scale... which suggests you probably will end up with one draw call per moving object because they'd have to have their own material instance. It's good that you raise this question really because you're giving me insights into what areas Refract 2D needs to grow and improve (e.g. passing UV2 coords in to offset the background so that a single material can be used). Or you could modify the shaders yourself to do this.

    So yeah, what you're trying to do is a good fit for Refract 2D's abilities, this is the kind of thing it is designed to be good at - reflections and refractions in 2D games. Doing reflections in a flat 2D game suggests, as you're trying to do, that light comes from the sides and should illuminate the edges/slopes of the environment objects. Refract 2D can do this. You just then need to combine the Refract 2D objects (wall tiles, sprites) with the drawing of your main background and your application of shadowing. You could use Refract 2D to do your main visible non-shadowed area, as well, if you e.g. want your background to show some bump mapping. You just then have to layer your shadows on top. You might find that sprites behind obstructed walls might pick up some light pixels from a large light source btw... because a bump 5 inches away on-screen may still point to a light source on the other side of a wall. So you may need to experiment with maybe rendering the light/shadows to the light map, so that when a wall obstructs a light it totally blocks it off from further-away objects. I'm not entirely sure how that will play out for you but there should be some way to make it look right.

    You asked also about colored lights or multiple lights. Using the light map you can simply render as many light sources and colors to it as you want. You could have thousands of them, fill-rate permitting. You could have round lights, square lights, shafts, cones, laser-beams, glowing shapes, etc.. really since it's per-pixel you can design pretty much any shape of light source you want. You just need to make sure the light spreads out over space, ie blurs/has falloff based on distance, otherwise you will literally get a hard-edged light - which you might want. You can put whatever colors you like in the light map. Use perhaps an additive blend to draw to it. I don't know exactly how you're doing your shadows but you could also render multiple shadow-producing light sources on top of each other.

    Also don't forget that in addition to reflection, which is what you'd be using Refract 2D for mainly, it also does refraction, which allows you to have semi-transparent distortion effects on top of a background, so this could add some interesting new features to your game level such as a distorting waterfall you walk behind, or a pane of frosted glass you walk behind, or some heat that you walk past, or depending on what your game is about, maybe the player walks into a teleporter or scanner or something that distorts the sprite in a nice way to represent the teleportation effect. You could also have glass walls which distort a background and still reflect the light map to show a bumpy surface.

    You also asked about a 2d environment demo. Essentially almost all of my demos are 2d environment demos, but the 2d ones mainly show refraction. They may be full-screen for the most part but that's just to show off the effect. But for example the Aquarium demo has a light map which is reflected, similar to what you'd do. Also the Liquid Metal demo does reflected light. It is lit using a light map texture, so as the light passes over the surface it reflects at the appropriate angles. I don't have an actual interactive game environment to play in, yet... but it's on my todo list. I can see the need to show the light mapping better and to show how it looks with reflection in a 2d game level, so I'm working on that.

    Btw you can use spritesheets to store your tile images and the distortion maps that they need to use, also for your moving sprites and their distortion maps. You just need the right UV coords, then the draw calls will combine better.

    And btw one small thing you have to keep in mind with light mapping in the way I've suggested, is that a bumpy surface will want to pull pixels from a texture based on the bump surface angle, and its possible (and likely) that those pixels will be `off screen`. e.g. a bump near the edge of the screen pointing outside the screen needs to get light still, but the light map texture, if it's the size of the screen, will then wrap or clamp the other side of the light map. So you really need to use either a double-sized light map that can span 4 screens and be centered over the screen, or simply render a screen-sized lightmap and then let it be scaled up 2x. It'll use bilinear filtering so for light, which is usually fairly smooth, it'll probably not be noticeable. In fact you may be surprised how low-res your lighting can get and still look good so you may only need to render to a low-res light map.

    Let me know if you have more questions or need more clarity.
     
    Last edited: Jul 16, 2013
  29. fgielow

    fgielow

    Joined:
    Jan 3, 2012
    Posts:
    122
    Thanks very much for the extensive answer!

    To be honest though, although I got a major glimpse on how to proceed, I am not experienced with shaders programming and hence I did not comprehend some details. I've manipulated some shaders a little, but many times not knowing precisely well what I was doing and it seems I should have a little more knowledge regarding them to take the best benefict from Refract2D. Thus, I already got some video lessons I intended to study before (but did not manage to); I think I will give them a try now: http://cgcookie.com/unity/cgc-courses/noob-to-pro-shader-writing-for-unity-4-beginner/ before trying Refract2D - it would probably help later when I start using Refract2D, right?

    As a side note, I currently do not own unity3d PRO, but it seems that there are other ways Refract2D could help me, still. Just to specify better my scenario, I am currently working with 2dtoolkit to generate my scenario, which consists of only flat sprites. The shadows projected from the player's light would be projected only considering the scenario, which is static, bleeding into the scenario sprites suggesting volume in the 2d sprite indeed (but not allowing the light to pass through it, keeping the other side dark, from the player's LoS perspective). I managed to get native unity3d lighting system working with reasonable results, but the issue is really that the light should not pass through my sprites, which do not have volume to restrict the light emission. I probably would achieve better results with the proper use of Refract2D, it seems.

    And you were correct, I intend to replace my effect from raycasts with some faster methods. I considered keeping a structure with the vertices and line segments from my scenario sprites in an approach such as: http://www.redblobgames.com/articles/visibility/ in order to optimize the operation, but the effect of bleeding the light into my sprites without allowing the light to trespass it would still be complicated and maybe inefficient; it seems that shaders approaches ought to be more appropriate for what I need, specially from a performance perspective.
     
  30. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Hey, yeah, you might want to go back and reread what I posted because I've been sitting here making several changes and amendments so you might not have read all of what I wrote. Or maybe you did.

    I don't know that you really need to know shaders in order to use Refract2D... the only tricky bit perhaps is if you needed custom UV's for the sprites. But since you're using 2dtoolkit, it's going to generate the UV's for you, at least as far as drawing the sprite color goes. It's just that you'll need to also adjust the background position/scale to match. However, if you're just talking about walls, then it's easy - just make sure your distortion map sprite sheet is the same size as your color sprite sheet and the UV's will match fine. It's in the area of moving sprites that things get a bit more involved.

    Given that you're currently having to do many, many raycasts for your system, you should find that Refract 2D is faster overall, since it's handled entirely by the graphics hardware. To make sure light doesn't pass through any walls, you would just have to rely on your (or some other) shadow technique, i.e. make sure it creates shadows behind all walls that go all the way to the edge of the screen. Then this lighting should stop anything from showing up there. Do you have an example of how the light is seeping past the walls onto other sprites?
     
  31. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    As a side note, you will still need to do ray casting if you want proper 2D shadows that are fully obstructed by a wall. You'll need to trace up to the first wall pixel that faces away from the light. So getting rid of your ray casts entirely and using R2D for shadows will only work to an extent.
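
    Just to illustrate the kind of check I mean, here's a bare-bones sketch of a per-light occlusion test using Unity's physics ray cast. The class and the wall-layer setup are placeholders, not something Refract 2D provides:

    ```csharp
    using UnityEngine;

    // Minimal sketch: test whether a point can "see" a light by casting a ray
    // against colliders on a wall layer. Not part of Refract 2D - just the kind
    // of ray cast that proper line-of-sight shadows still need.
    public static class LineOfSight
    {
        // Returns true if nothing on the wall layer blocks the segment from the
        // light position to the target position.
        public static bool IsLit(Vector3 lightPos, Vector3 targetPos, LayerMask wallMask)
        {
            Vector3 toTarget = targetPos - lightPos;
            return !Physics.Raycast(lightPos, toTarget.normalized,
                                    toTarget.magnitude, wallMask);
        }
    }
    ```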
     
  32. fgielow

    fgielow

    Joined:
    Jan 3, 2012
    Posts:
    122
    Here is a link to a video which employs unity lights and lit shaders: https://dl.dropboxusercontent.com/u/12198256/jotunLightSystemSpec.mp4
    It shows how Unity's light seeps past the wall - which is to be expected, really, given Unity's 3D behavior - while I need a 2D-only approach.
    Note: `botters` should read `bothers`; I had already exported the video and only noticed the mistake afterwards :p

    I may try a hybrid approach, with a structure that keeps the scenery's line segments to reduce the number of raycasts - something along the lines of http://www.redblobgames.com/articles/visibility/.

    Thanks very much!
     
  33. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Ahh yes, thanks for sharing that. Yeah, with Refract 2D you can do some lighting, but not the line-of-sight type of shadows - or at least it's not obvious to me how that would work. I'll have to ponder it some more to see if there might be a workaround. It does usually need some kind of `shadow casting` system. The difficulty with shadows is that every light source can potentially cast shadows over the whole screen, so effectively every pixel on the screen has to work out whether it can see the light. That is made more efficient by extruding whole triangles/meshes from the geometry, but it's still not as efficient as it could be, as you know. However, I've come up with a new technique, which I believe to be original, that gets rid of the ray casts completely and is much more efficient... but I'm still working on that new asset and might not be done with it for a few weeks yet.
     
  34. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Light and Shadows with Refract 2D

    Here I will provide some additional clarity about how to use lighting and shadows in Refract 2D, what you can do with it and how it works. This is covered to some extent in the manual but I'd like to elaborate. :)

    Refract 2D uses a `light map` texture to implement lighting. Light is stored in the texture, in any color or shape, and is multiplied by the rest of the output. A light map is effectively a cross section of the light being emitted near to the surface, sampled on a per-pixel basis. So a slice is taken cutting through the light parallel to the surface, and is stored in the texture. It is similar to a single slice of a brain scan which shows a 2D view of a slice of the brain, except we're taking a 2D view of a slice of volumetric light. This shows a cross-section of the light, and therefore also the lack of light, which is shade. Of course white is bright light while black is no light, and lights can be any color on a per-pixel basis, which means you can have as many visible lights as there are pixels!

    $Light Map.jpg
    A simple mostly-diffuse light map with white light​

    Since the light map is just a texture, you can store absolutely any image in it and use that as `light`. This could be a panorama of a scene, providing image-based lighting. Perhaps this image might be highly blurred to soften the general surrounding light. It could be something you rendered, an image of an environment (as in environment mapping), a collection of realtime generated light sources, some other pre-drawn image, an animated `light sprite`, `light tiles`, or perhaps something being emitted by nearby surfaces.

    Although in 3D digital environments you might have become accustomed to thinking of light as something different from a texture, e.g. a realtime-generated point light based on a math function, in the real world when a surface is at all shiny (ie reflects or emits some light and doesn't absorb it all) it simply reflects what's in the surrounding environment. So real-world lighting really is the same thing as environment mapping, which is basically reflection + emission. When you see what people refer to as a `specular highlight` for example, it merely means that somewhere in the surrounding environment there is a small, limited-sized light source like a light bulb, and it shows up in the surface of the object as a small dot. Often this bright spot stands out from the rest of the reflections as a highlight, which is why they're called specular highlights. But in addition to the specular highlights are all kinds of subtler and perhaps less obvious or dimmer reflections of light coming from other surrounding objects, even if those objects are only passing on light that they received from somewhere else. So ultimately, lighting is just environmental reflections and ideally every object in your environment should reflect its environment. This is why cube-mapping and image-based-lighting for example are popular as ways to model environmental light in 3D.

    So in Refract 2D, mathematical lights that are limited to approximating diffuse and specular light from point lights are thrown out, to be replaced with per-pixel light information stored in a light map texture. This provides far more control over the color and position and intensity of light across the whole screen on a per-pixel basis. To do something similar in 3D would require massive volumetric 3D textures. 2D rocks! A single white pixel in the light map represents a tiny point of light in the environment. Or to put it another way, that single white pixel could be a reflection of a small LED light bulb in an otherwise black room, for example. When you think of your Distortion Map surface as shiny and reflective, and you think of lighting as simply environmental reflections, the Distortion Map basically reflects the light map texture, using it as a picture of the environmental light.

    So a single pixel in the light map will be reflected by the bumpy surface and show up as a tiny dot, seeming to light the surface in one small spot but nowhere else. All other pixels will be output as black because all other pixels in the light map would be black. As the light source/texture moves, or as the surface moves, that dot of light will dance around according to the surface distortion/bumps. You might think of this one-pixel light as showing up in a similar way to specular highlights. It might represent the brightest point at the center of a light source. It also suggests that the light source represented by that one pixel is `one pixel wide`.
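
    To make that concrete, here's a rough CPU-side sketch of what the shader effectively does for each pixel. The real work happens on the GPU, and details like how the offsets are encoded (I assume a 0.5 bias here) are illustrative only:

    ```csharp
    using UnityEngine;

    // CPU-side illustration of the per-pixel idea: the Distortion Map's red and
    // green channels push the lookup position sideways, and whatever the light
    // map holds at that pushed position is what lights the pixel. The 0.5
    // "no offset" bias and the strength scale are assumptions for illustration.
    public static class LightMapLookup
    {
        public static Color Shade(Texture2D distortionMap, Texture2D lightMap,
                                  Color surfaceColor, int x, int y, float strength)
        {
            Color d = distortionMap.GetPixel(x, y);
            int ox = Mathf.RoundToInt((d.r - 0.5f) * strength);   // X offset from Red
            int oy = Mathf.RoundToInt((d.g - 0.5f) * strength);   // Y offset from Green
            Color light = lightMap.GetPixel(x + ox, y + oy);
            return surfaceColor * light;   // the light map multiplies the rest of the output
        }
    }
    ```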

    $ShadowSharp.jpg
    Hard-edged light produced by a hard-edged circle in the light map (e.g. a spot light)​

    As we increase the size of our light source, we could make it a larger circle of white pixels. At the edge the circle immediately cuts to black with a hard edge. This could perhaps represent a spot light, a flashlight, or a ball of light which is equally bright all over. This will be reflected by the surface, but you will then notice that the hard edges of the circle show up as hard edges of reflection. The surface will simply reflect what it sees in the light map. You may or may not want such a hard-edged effect - it's almost like a very large specular highlight from a uniformly outputting light source.

    Sometimes you will want very crisp reflections like this, since you might have an image of an environment in your light map texture and want to see the fine details. Often, though, when you think of lighting you think of diffuse light, where the rays have spread out because of bumps in the surface of the light source/environment, or where diffuse light has been created by bouncing off other objects. For example, radiosity or ambient occlusion may have occurred in some areas. You might also want your light to be generally softer. In that case, all that is required is to blur the light map. This softens the light and disperses it. Basically, a blur takes the light from each individual pixel and spreads it out to the surrounding pixels, so it `diffuses` the light. Blurring the whole light map fairly heavily produces what looks like diffuse lighting.
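
    If you'd rather blur a light map in code than in a paint program, even a crude box blur is enough for this purpose. A minimal sketch - slow CPU-side loops, so only sensible for small maps or one-off processing:

    ```csharp
    using UnityEngine;

    // Very simple box blur to diffuse a light map at load time. Slow (GetPixel
    // in nested loops), so treat it as a one-off processing step, not per-frame.
    public static class LightMapBlur
    {
        public static Texture2D BoxBlur(Texture2D source, int radius)
        {
            var result = new Texture2D(source.width, source.height, TextureFormat.RGB24, false);
            for (int y = 0; y < source.height; y++)
            {
                for (int x = 0; x < source.width; x++)
                {
                    Color sum = Color.black;
                    int count = 0;
                    for (int oy = -radius; oy <= radius; oy++)
                    {
                        for (int ox = -radius; ox <= radius; ox++)
                        {
                            int sx = Mathf.Clamp(x + ox, 0, source.width - 1);
                            int sy = Mathf.Clamp(y + oy, 0, source.height - 1);
                            sum += source.GetPixel(sx, sy);
                            count++;
                        }
                    }
                    result.SetPixel(x, y, sum / count);   // average = spread the light around
                }
            }
            result.Apply();
            return result;
        }
    }
    ```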

    $ShadowSoft.jpg
    Soft-diffuse light produced by a soft-edged circle in the light map (e.g. a point light)​

    What you probably want to achieve in most cases is something resembling point lights. A point light basically has some kind of central source and then light radiates from it and diminishes over a distance. At some distance the light becomes so dim that it no longer lights objects, having been scattered by collisions with air particles, dust or fog for example. This can be achieved by putting a local light source into the light map - i.e. basically our one pixel, or our filled circle, or a blurred circle, or some other shape that isn't the same across the entire light map. All these shapes, indeed any shape, can be used as `local` light sources. They don't necessarily have to be `points` because we can manage the shape of the light source on a per-pixel basis. So you could have square lights, ring lights, lighted wavy lines, randomly scattered point lights, oval lights, laser beam lights, triangular lights, light shafts, etc. If you can draw it to a texture, you can use it as a light source. Again, 2D rocks!
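
    As a sketch, stamping a soft radial `point light` into a light map from code could look something like this - the linear falloff and the additive blending are just example choices:

    ```csharp
    using UnityEngine;

    // Stamps a soft radial falloff ("point light") into a light map texture.
    // The falloff curve, colors and sizes are arbitrary - the point is simply
    // that any shape you write into the texture becomes a light source.
    public static class LightStamp
    {
        public static void DrawPointLight(Texture2D lightMap, Vector2 center,
                                          float radius, Color lightColor)
        {
            for (int y = 0; y < lightMap.height; y++)
            {
                for (int x = 0; x < lightMap.width; x++)
                {
                    float dist = Vector2.Distance(new Vector2(x, y), center);
                    // Full brightness at the center, fading to nothing at the radius.
                    float intensity = Mathf.Clamp01(1f - dist / radius);
                    Color existing = lightMap.GetPixel(x, y);
                    // Add rather than overwrite so several lights can overlap in one map.
                    lightMap.SetPixel(x, y, existing + lightColor * intensity);
                }
            }
            lightMap.Apply();
        }
    }
    ```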

    Ambient light can be achieved simply by making sure that all of the pixels in the light map texture have at least some brightness to them. Indeed you can control the ambient light on a per-pixel basis, plus the ambient light can be any color on a per-pixel basis. Ambient light, point lights, cone lights, whatever-shape-lights can all be blended on top of each other into a single light map. Even an image of a household object could be a light source - your cat or dog, or your best friend's face. Whatever you want. This is why image-based lighting is so powerful and versatile - it simply goes back to the fact that lighting is all about either direct emission of light or reflection of emitted light off of other objects.

    $ShadowDiffuseSpecular.jpg
    Diffuse + Specular lighting using a combination of soft-edged and hard-edged circles in the light map​

    You can combine types of lights in a light map, given it is pixel-based, using whatever graphics software you have at hand. For example, you can create diffuse light by blurring your light sources, then overlay the original crisp un-blurred light sources on top. This gives you diffuse + specular lighting as above. Specular light is simply highly concentrated `sharp` light while diffuse light is simply `blurred` light. There's nothing to stop you from having any number of levels of specularity in the same light map - if you can draw it you can use it. Perhaps then add in some ambient light and whatever else you want to do to it and you're good to go. You could even get really advanced and use some super-sophisticated lighting model. Or perhaps take a photo of a panorama and blur it to utilize beautifully natural environmental light. And don't forget that the light map is an RGB color texture, so you can use multi-colored lights as much as you like!
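
    As a rough sketch of that layering in code - blur a copy for the diffuse layer, keep the sharp original as the specular layer, and add an ambient floor. This reuses the hypothetical BoxBlur helper from the earlier sketch, and the blend weights are arbitrary:

    ```csharp
    using UnityEngine;

    // Combines a blurred copy (diffuse), the sharp original (specular) and a
    // flat ambient floor into one light map. Uses the BoxBlur sketch from above;
    // the 0.5 specular weight is just an example.
    public static class LightMapCompose
    {
        public static Texture2D Compose(Texture2D sharpLights, int blurRadius, Color ambient)
        {
            Texture2D diffuse = LightMapBlur.BoxBlur(sharpLights, blurRadius);
            var result = new Texture2D(sharpLights.width, sharpLights.height,
                                       TextureFormat.RGB24, false);
            for (int y = 0; y < result.height; y++)
            {
                for (int x = 0; x < result.width; x++)
                {
                    Color spec = sharpLights.GetPixel(x, y);
                    Color diff = diffuse.GetPixel(x, y);
                    // Ambient floor + diffuse light + a touch of the crisp specular layer.
                    result.SetPixel(x, y, ambient + diff + spec * 0.5f);
                }
            }
            result.Apply();
            return result;
        }
    }
    ```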

    Light can also be animated. It's just a texture. Using RenderTextures for example you can render realtime light animations to it and watch those lights move in realtime. A video sequence could even be a light map, or a realtime-generated effect. It is limited only by your imagination ;)
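
    For example, with Unity Pro you could point a second orthographic camera at your light sprites/quads and render them into a RenderTexture every frame. A minimal sketch - the `Lights` layer and the `_LightMap` property name are placeholders, so substitute whatever your setup actually uses:

    ```csharp
    using UnityEngine;

    // Renders everything on a "Lights" layer into a RenderTexture each frame
    // and feeds that to the material as the light map (Unity Pro only). The
    // layer name and the "_LightMap" property name are placeholders.
    public class AnimatedLightMap : MonoBehaviour
    {
        public Camera lightCamera;        // orthographic camera that only sees light sprites
        public Material refractMaterial;  // material using one of the lit shaders
        public int resolution = 512;

        RenderTexture lightMapRT;

        void Start()
        {
            lightMapRT = new RenderTexture(resolution, resolution, 0);
            lightCamera.targetTexture = lightMapRT;          // camera now renders to the texture
            lightCamera.clearFlags = CameraClearFlags.SolidColor;
            lightCamera.backgroundColor = Color.black;       // unlit areas stay dark
            lightCamera.cullingMask = 1 << LayerMask.NameToLayer("Lights");
            refractMaterial.SetTexture("_LightMap", lightMapRT);
        }
    }
    ```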

    A useful technique in a 2D game environment is to have a light map that is about 2x the width and height of the screen. Bumps near the edge of the screen that point outward, as if reflecting light from outside of the edges of the screen, still need to refer to light sources that are `off-screen`. So by maintaining a larger light map texture centered over the scene you can accommodate this without seeing wrap-around or clipped light reflections. In fact, because Refract 2D resamples textures based on the surface, and this can act to increase the effective resolution of the texture in realtime, you can actually get away with a much smaller light map texture than you'd think you might need. Especially if most of your lighting is fairly diffuse, i.e. blurred, there is little need to store it in a high resolution texture. Smooth, highly-blurred lights, for example, could be stored in a significantly smaller texture - bilinear or trilinear filtering on the light map texture's import settings will deal with smoothly interpolating the light for you.
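
    One cheap way to get that `screen-sized texture, 2x coverage` behavior is to sample only the middle half of the light map's UV range. This assumes the shader you're using applies the standard per-texture tiling/offset to its light map, and that the property is called something like `_LightMap` - both are assumptions, so check the shader you picked:

    ```csharp
    using UnityEngine;

    // Sketch of the "screen-sized light map, scaled up 2x" idea: sample only the
    // middle of the texture so it effectively spans twice the screen in each
    // direction. Assumes the shader honors tiling/offset on a "_LightMap"
    // property - check your shader before relying on this.
    public class LightMapCoverage : MonoBehaviour
    {
        public Material refractMaterial;
        public Texture2D lightMap;

        void Start()
        {
            lightMap.wrapMode = TextureWrapMode.Clamp;   // no wrap-around at the screen edges
            refractMaterial.SetTexture("_LightMap", lightMap);
            refractMaterial.SetTextureScale("_LightMap", new Vector2(0.5f, 0.5f));
            refractMaterial.SetTextureOffset("_LightMap", new Vector2(0.25f, 0.25f));
        }
    }
    ```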

    With this in mind, it is possible to store the diffuse light for a fairly large game environment in a single large texture. By scaling the light map up and relying on smooth interpolation, the light map can remain static and cover a large scrollable area covering many screens. This is one way to compensate for the absence of RenderTextures in Unity Free, allowing you to light a good-sized 2d environment without having to re-generate the light map on the fly. However, once you do gain access to RenderTextures there are many interesting possibilities that open up including animation, higher-resolution scrolling light maps, and the ability to only render the visible lights to the light map each frame.
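
    And a companion sketch for the large scrollable environment: keep one big static light map for the whole level and slide the tiling/offset as the camera moves. The `_LightMap` name and the world-size numbers are placeholders, and again this assumes the shader respects the texture's tiling/offset:

    ```csharp
    using UnityEngine;

    // Slides a single large, static light map across the screen as the camera
    // scrolls, so one texture can light a level spanning many screens. The
    // property name and the world-rect values are placeholders.
    public class ScrollingLightMap : MonoBehaviour
    {
        public Material refractMaterial;
        public Camera gameCamera;
        public Rect worldBounds = new Rect(0f, 0f, 100f, 100f);  // area the light map covers
        public Vector2 screenWorldSize = new Vector2(10f, 10f);  // world units visible on screen

        void LateUpdate()
        {
            // How much of the level the screen covers, in normalized (0..1) terms.
            Vector2 scale = new Vector2(screenWorldSize.x / worldBounds.width,
                                        screenWorldSize.y / worldBounds.height);
            // Where the screen's bottom-left corner sits within the level.
            Vector3 camPos = gameCamera.transform.position;
            Vector2 offset = new Vector2(
                (camPos.x - screenWorldSize.x * 0.5f - worldBounds.x) / worldBounds.width,
                (camPos.y - screenWorldSize.y * 0.5f - worldBounds.y) / worldBounds.height);
            refractMaterial.SetTextureScale("_LightMap", scale);
            refractMaterial.SetTextureOffset("_LightMap", offset);
        }
    }
    ```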

    Now, I've spoken about lights but I also need to talk about shadows. Refract 2D does feature some shadowing ability. It is not a complete, or totally accurate shadowing solution, but it can prove useful. The darker areas of a light map effectively tell the Distortion Map where there is `no light`, or less light. In the case of a hard-edged circle for example, the areas outside the circle are not emitting any light so are dark. You could say that the dark areas are shadows. When you apply this light map to a bumpy surface you will see that pixels closer to the circle receive white light, while reflections that land outside the circle receive black pixels. For example on the inside of a crater or `pit` in the surface, the edge pointing away from the circle will likely pick up black pixels, while pixels pointing more towards the circle will pick up white pixels. This creates the impression that the edges of the surface that are `pointing away from light sources` receive shade, while the surface that `points toward light sources` receives light. This can create a fairly convincing illusion that the surface is rendering shadows, based on the simple fact that the light map texture has some areas that are lit and some that are shaded.

    The surface is not technically self-shadowing. It does not pay attention to whether a given slope can actually see the light source pixel it points at, unobstructed by other bumps. This is one flaw in the realism. However, you will find that, similar to the sun, the further away from the center of a light source the surface pixel gets the `longer` the `shadows` get. This is because the angle to the light is becoming flatter. Just as the late afternoon/evening sun casts long shadows, so too will it seem, to a degree, that further-away bumps in the surface shadow more. So for a `point light` in the light map texture, it will behave somewhat like Unity's point lights where surfaces further from the center of the light are darker. It will seem that the light source is a short distance above the surface rather than completely flush to the surface, which is why a) this does not cast lines of visibility that totally obscure `bumps behind bumps`, and b) `bumps behind bumps` may still be lit (albeit less) because they still point to enough of the light source to receive some light. Typically with more blurred/diffuse light maps this is more noticeable, because the larger the area covered by the light in the light map the further it will `reach`. Using soft blurred diffuse lights generally creates what looks like fairly realistic `light and shade`, but does not cast shadows from one bump onto another. So a Distortion Map surface is `shadowing`, but not `self-shadowing`. Self-shadowing would be required in order to cast obstructive shadows. I'm investigating self-shadowing functionality for a future version of Refract 2D.

    $ColoredLightShade.jpg
    Colored light with varying degrees of specularity/diffusion and shape​

    So you can emulate pretty much any kind of light in Refract 2D and being able to modify the light on a per-pixel basis is a great boost for creative options, subtle quality and control.

    One final tip - lowering the Refraction control on the Refract 2D shaders to a negative number converts the effect from refractive to reflective. `Over-reflecting` a light map amplifies the contrast; a value like `-3 Refraction`, for example, brings out even more exaggerated light and shadow.
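
    In code that's just a material property change - something like the snippet below, where `_Refraction` is a stand-in for whatever the particular shader you picked actually calls the control:

    ```csharp
    using UnityEngine;

    // Flips a material between refracting and "over-reflecting" the light map.
    // "_Refraction" is an assumed property name - substitute the real one shown
    // in the shader's inspector.
    public class ReflectionBoost : MonoBehaviour
    {
        public Material refractMaterial;

        public void UseRefraction() { refractMaterial.SetFloat("_Refraction", 1f); }
        public void OverReflect()   { refractMaterial.SetFloat("_Refraction", -3f); }
    }
    ```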

    Hopefully this has helped to present the lighting system in ... a better light ;) ... Feel free to ask any questions.
     
    Last edited: Jul 17, 2013
  35. toto2003

    toto2003

    Joined:
    Sep 22, 2010
    Posts:
    528
    hello
    I just checked out your shaders - it's really impressive. I wonder if you could add this kind of shader to your pack; I'm keen to have a watercolor shader like this:
    http://artis.imag.fr/Publications/20...watercolor.pdf

    would it be too much work?

    thanks
     
  36. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Your link is a bit broken because part of the text got replaced with `...` - the real link is http://artis.imag.fr/Publications/2006/BKTS06/watercolor.pdf

    The watercolor rendering effect is interesting, I like non-photo-realistic art stuff and I might have a go at this sometime, it looks like it would be fairly easy to do in a shader... but at the moment I have 3 or 4 projects that I need to attend to.
     
  37. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Refract 2D could use some reviews on the asset store... if you've purchased it, or are thinking of doing so, please take a minute or two to post a short review on the asset store page - I'd really appreciate it!

    Thanks
     
  38. fgielow

    fgielow

    Joined:
    Jan 3, 2012
    Posts:
    122
    Thanks for the extensive post regarding Lights and Shadows!
     
  39. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Sure, it'll be going into an improved Manual in the next version, but I thought it'd be good to share it up front. It's been interesting for me developing this tool because in some ways I'm still discovering what it can do!
     
  40. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    If anyone has any feature requests for a future version feel free to shout out.

    Also, you may not know this, but you can hand-create Distortion Maps if you know what you're doing. The Red component is the X offset and the Green component is the Y offset, measured in pixels. So you can, for example, pull in pixels from a specific coordinate offset relative to the current pixel by filling in the appropriate color values. You could also computer-generate a Distortion Map with whatever offsets you want. The shaders essentially just displace the UV coordinates by these offsets, so you can create some pretty cool effects this way.
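
    As an example, here's a rough sketch of computer-generating a sine-wave Distortion Map, writing the X offset into Red and the Y offset into Green. How the zero point is encoded (I use 0.5 as `no offset` here) and how the shaders scale the stored value are simplifications for illustration, so compare against maps produced by the Height Map Tool:

    ```csharp
    using UnityEngine;

    // Generates a simple sine-wave Distortion Map: Red = X offset, Green = Y
    // offset, in pixels. The 0.5 = "no offset" encoding and the /255 scaling are
    // assumptions for illustration - check them against the Height Map Tool's output.
    public static class DistortionMapGenerator
    {
        public static Texture2D SineWave(int width, int height, float amplitude, float frequency)
        {
            var map = new Texture2D(width, height, TextureFormat.RGB24, false);
            for (int y = 0; y < height; y++)
            {
                for (int x = 0; x < width; x++)
                {
                    // Horizontal ripple: push pixels left/right depending on their row.
                    float xOffsetPixels = Mathf.Sin(y * frequency) * amplitude;
                    float yOffsetPixels = 0f;
                    float r = 0.5f + xOffsetPixels / 255f;
                    float g = 0.5f + yOffsetPixels / 255f;
                    map.SetPixel(x, y, new Color(r, g, 0f));
                }
            }
            map.Apply();
            return map;
        }
    }
    ```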
     
    Last edited: Jul 23, 2013
  41. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    I'm currently reconsidering the pricing on this asset. Unity Asset Store staff suggested that $50 seemed like a reasonable price based on the time, effort and skill involved in making it. However, as we know, pricing is relative and lower prices may mean more people can buy. I don't know what the sweet spot is, but I'd like to hear your opinions. Is 50 too high, or too low? Is 20, 30 or 40 more palatable?
     
  42. EmeralLotus

    EmeralLotus

    Joined:
    Aug 10, 2012
    Posts:
    1,462
    Hi ImaginaryHuman,

    Very interesting work. Regarding pricing, I think that when launching a new product it's good to price at a discount. That way there's less of a barrier to entry and it encourages people to try the product. After the promotion period the price can go back to normal. In this case, the promotion could be $25, and then up to $40-$50. After that it could go on a 50% sale once in a while. Just my 2 cents.

    Ok. back to techie stuff.
    I'm not a shader expert by any means, but I do have Unity Pro and I'm interested in getting the most performance from shaders and graphics effects. You mentioned a number of times that using Pro's RenderTextures increases performance. Is this package optimized for Pro?

    P.S. Are the other packages you've made also optimized for Pro?

    Is it possible to interact with the refraction?

    Thanks.
     
    Last edited: Jul 28, 2013
  43. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Hi Rocki.

    Thanks for the feedback on the price. There will be an official 50%-off sale at some point, whenever Unity decides to put it up, and I'm also thinking about a general lowering of the price - that said, there have already been a fair number of customers paying $50.

    Techie stuff... Refract 2D generally does not need any render textures and doesn't need Unity Pro. It also doesn't run any faster by rendering to a render texture instead of to the normal background. Where speed can be gained is when you try to modify some data in realtime - for example generating a realtime Distortion Map texture, or modifying a light map texture or an environment map. Creating dynamic textures like that is really not efficient in Unity Free, because all you have is a backbuffer grab-to-texture, which is comparatively slow. When you render straight to a texture you don't have to do any grab at all, which is how render textures can be faster. So if you want to do something very dynamic, like modifying the textures you use in realtime - maybe moving multiple lights around in the light map, or modifying distortions using some rendering method you came up with (e.g. tex-coord displacement, realtime blurs, or whatever) - then render textures are practically a necessity. These are just extra ways you can use Refract 2D that are up to your creativity.
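
    For completeness, this is roughly what the Unity Free fallback looks like - grabbing the backbuffer into a texture after the frame has finished, which is exactly the part that makes it slow. The `_LightMap` property name is a placeholder:

    ```csharp
    using System.Collections;
    using UnityEngine;

    // Unity Free fallback for dynamic textures: copy the backbuffer into a
    // Texture2D once the frame has rendered. ReadPixels stalls the GPU, which
    // is why this is much slower than rendering straight to a RenderTexture.
    public class BackbufferGrab : MonoBehaviour
    {
        public Material refractMaterial;   // material that should receive the grabbed texture
        Texture2D grabbed;

        void Start()
        {
            grabbed = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
            StartCoroutine(GrabEveryFrame());
        }

        IEnumerator GrabEveryFrame()
        {
            while (true)
            {
                // Wait until rendering is finished so we read a complete frame.
                yield return new WaitForEndOfFrame();
                grabbed.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
                grabbed.Apply();
                refractMaterial.SetTexture("_LightMap", grabbed);   // placeholder property name
            }
        }
    }
    ```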

    In terms of speed, the shaders are all optimized as much as possible. They use `fixed` data types almost entirely to maximize speed on mobile, which also helps on desktop in many cases. Also, since Refract 2D precalculates displacement offline while still retaining the ability to combine displacements at runtime, there is much less work to do when layering 3 or 4 Distortion Maps on top of each other in a single shader. Without that offline component, various ray casts would be needed in realtime, which would slow it down quite a bit. So it's faster than it might have been. The main bottleneck I've found boils down to how many textures you use - the realtime computations don't seem to cost much compared to adding an extra texture lookup.

    As for other packages optimized for Pro, I haven't made any. This one isn't really optimized for Pro either; it just so happens that there are some Pro-only features (render textures) you can use, which have various benefits. My Shader Wizard utility is similarly optimized for performance, if you're interested.
     
  44. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Version 1.2 is now available for download at https://www.assetstore.unity3d.com/#/content/9854 - a minor update including the 3D interactive demo scene, documentation on lighting and shadows, and compression of some textures to improve loading time.

    Price is now reduced to only $40 for a short time :) 20% OFF!
     
    Last edited: Aug 7, 2013
  45. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Still only $40 for a limited time!
     
  46. im

    im

    Joined:
    Jan 17, 2013
    Posts:
    1,408
    Hi,

    I have Unity 3D Indie (free).

    I'm making a 3D game with 3D objects.

    Can I do underwater effects with this asset?

    For example, if the player is above the water and looks below the surface, can I make everything below the water look distorted?

    And if the player is below the water, can I make everything in, on and above the water look distorted?

    Like in real life.

    I guess I would need a plane for the water and apply the shaders to it, so that objects below the water look distorted when the player is above it.

    But when I'm in the water, can I apply the same trick to my camera so that everything looks distorted? Or how else would I do it?

    Thanks in advance.
     
  47. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
  48. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    To accomplish what you're trying to do, let's say you're above the water first... you'd need to render what is `below the surface` first, either to a RenderTexture or by rendering it and grabbing it into a texture (which is quite slow). It needs to be rendered in 3D and then grabbed as a 2D texture. Then you can do a second pass and render just the water surface, using the texture you grabbed to distort it. It won't look quite physically correct, because the bumps on the surface are perpendicular to the main camera rather than to the surface itself - unlike parallax mapping, and more like normal mapping, they aren't correct in perspective. But it would still look... somewhat reasonable.

    Then, when underwater, you'd have to change it so that you draw the walls/sky first, either to a RenderTexture or with a grab, and then draw the water surface to distort it.

    As you probably know, RenderTextures are only supported in Unity Pro, unfortunately, and ReadPixels is quite a slow operation except at lower resolutions on desktop computers. I also can't guarantee that the 3D is going to look right. If you have a look at the 3D demo I posted (see above or the front page of the thread), you can move around the water and you'll notice that it's basically flat, because it doesn't grab a proper 3D background and the bumps aren't really 3-dimensional. It would be great to be able to do what you're trying to do, but without RenderTextures I don't think it's very feasible, and even with them it's going to be a bit of a hack.
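
    If you do have Pro, the setup I'm describing is roughly this: a second camera renders only the underwater geometry into a RenderTexture, and the water-surface material then distorts that texture. The layer name and the `_Background` property are placeholders for whatever your scene and chosen shader actually use:

    ```csharp
    using UnityEngine;

    // Rough sketch of the Unity Pro route: a second camera renders only the
    // underwater layer into a RenderTexture, which the water-surface material
    // distorts. Layer and property names are placeholders.
    public class UnderwaterCapture : MonoBehaviour
    {
        public Camera underwaterCamera;       // mirrors the main camera, sees only underwater layers
        public Material waterSurfaceMaterial; // material on the water plane using a Refract 2D shader

        RenderTexture underwaterRT;

        void Start()
        {
            underwaterRT = new RenderTexture(Screen.width, Screen.height, 16);
            underwaterCamera.targetTexture = underwaterRT;
            underwaterCamera.cullingMask = 1 << LayerMask.NameToLayer("Underwater");
            waterSurfaceMaterial.SetTexture("_Background", underwaterRT);
        }
    }
    ```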
     
  49. Deleted User

    Deleted User

    Guest

    very good work! purchased ! :)
     
  50. imaginaryhuman

    imaginaryhuman

    Joined:
    Mar 21, 2010
    Posts:
    5,834
    Thank you very much! It's an exciting day when your asset goes on an official sale!