Normal maps (and importing them correctly)

Discussion in 'Community Learning & Teaching' started by Frank Oz, Mar 21, 2011.

  1. Frank Oz

    The purpose of this thread is to help out those who are just getting into it, or a little bit lost. It's not a contest to see who can post the most mathematically correct details or bang on about splitting vertices etc. That just confuses people, and very few need to know all that until much later.

    What are normals?

    A normal is the direction a polygon surface faces - a vector pointing straight out (perpendicular) from the polygon. Each polygon can only face in one direction at a given time.

    Is this related to Polygon Smoothing?

    Yes. Smoothing Angles (or Groups) are a 'cheat' method used to make a surface appear smoother. They work by altering the normal directions of certain vertices to create an average direction between two adjoining polygons, giving the impression of a smooth surface when lit. Because it's a cheat method, the silhouette remains the same.

    The smoothing angle is the threshold at which smoothing occurs between two connected polygons. The higher the smoothing angle, the more of a surface is smoothed. For example, a smoothing angle of 45 degrees means only edges where connected polygons meet at 45 degrees or less will be smoothed, while sharper edges stay faceted. A sketch of this rule follows.
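
    To make that rule concrete, here's a minimal C# sketch (using Unity's Vector3, with hypothetical helper names) of how an importer might decide whether an edge gets smoothed:

    Code (CSharp):
    using UnityEngine;

    public static class SmoothingSketch
    {
        // Hypothetical helper: returns true if the edge shared by two
        // faces falls within the smoothing angle and should be smoothed.
        public static bool ShouldSmooth(Vector3 faceNormalA, Vector3 faceNormalB,
                                        float smoothingAngleDegrees)
        {
            // Angle between the two face normals, in degrees.
            float angle = Vector3.Angle(faceNormalA, faceNormalB);

            // Shallow edges get smoothed; anything sharper stays faceted.
            return angle <= smoothingAngleDegrees;
        }

        // The smoothed vertex normal is simply the normalized average
        // of the adjoining face normals.
        public static Vector3 AveragedNormal(Vector3 faceNormalA, Vector3 faceNormalB)
        {
            return (faceNormalA + faceNormalB).normalized;
        }
    }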

    So what are normal maps?

    A normal map is a specially treated image that tricks a compatible 3D engine into believing a surface has more normal data than it actually does, giving the impression of greater detail or smoother surfaces. A normal map simply makes the software believe a single polygon is made up of thousands of polygons, but without the processing cost of dealing with so many polys.



    This is a tangent space normal map


    This is a world/object space normal map


    This is not a normal map


    This is a bump/height/displacement map

    So what's the difference?

    World Space and Object Space normal maps are basically the same thing, differing in how they're used in a game world. They aren't currently used by Unity and are only mentioned here to further confuse people.

    Because they both use the full spectrum of colors they are: a) Better quality than Tangent Space and b) More limiting than Tangent Space.

    World Space
    This type is fixed within the world, which means it cannot move, rotate, tile or bend. Doing so will cause it to appear incorrect. Because of this, world space normal maps are best reserved for static objects like buildings.

    Object Space
    These can move and rotate, but cannot tile or bend, or the same appearance issues will occur. They're best used for rigid dynamic objects.

    Tangent Space
    These can move, rotate, tile and bend without harming their appearance. These are used in Unity and most if not all other 3D engines available today.

    Bump Map
    Bump maps are not as advanced as normal maps, as they contain only height information (explained below), but they can bend, rotate, tile and move with no adverse effects.

    So how do they work?

    Each of a normal map's color channels - Red, Green and Blue - encodes the angle of the surface along one axis, with 128 shades for the positive direction and 128 shades for the negative. Added together, this is what gives a normal map its rainbow appearance (a sketch of the packing follows the list below).

    Generally the following is used (assuming Y +/- equals up/down)

    Red = X +/-
    Green = Y +/-
    Blue = Z +/-
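
    In 8-bit terms, 128 is 'flat' on an axis; values above lean the normal one way, values below lean it the other. A flat tangent space pixel is therefore RGB (128, 128, 255), which is why these maps look light blue. Here's a minimal sketch of the packing (hypothetical helper names, using Unity's Vector3 and Color):

    Code (CSharp):
    using UnityEngine;

    public static class NormalPackingSketch
    {
        // Pack a unit normal (-1..1 per axis) into a color (0..1 per
        // channel). 0.5 (i.e. 128 in 8-bit) decodes back to zero.
        public static Color Encode(Vector3 n)
        {
            return new Color(n.x * 0.5f + 0.5f,   // Red   = X
                             n.y * 0.5f + 0.5f,   // Green = Y
                             n.z * 0.5f + 0.5f);  // Blue  = Z
        }

        // Unpack a color back into a unit normal.
        public static Vector3 Decode(Color c)
        {
            return new Vector3(c.r * 2f - 1f,
                               c.g * 2f - 1f,
                               c.b * 2f - 1f).normalized;
        }
    }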

    World and Object Space maps are generated with an understanding of their position and orientation; what they consider up on the map is an absolute up within the world, hence you can't do much with them. Each pixel is capable of facing in any direction, so the quality is usually deeper and richer, but unless they are handled specially there's little to be done with them.

    Tangent Space maps are considered by the engine to always face 'up' from the surface of the polygon they're used on (which is why you don't get much variation in the Blue channel). Whatever direction that surface faces, the map still thinks it's facing up, which is why it can be rotated, moved, bent and so on. Its directions are relative, not absolute, making it perfect for realtime use.
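
    For the curious, here's a simplified C# sketch of what a shader effectively does with a tangent space sample: it rebuilds the sample relative to the surface's own normal and tangent, so the same texel works whichever way the surface faces. (Hypothetical helper; real pipelines also track the bitangent's sign/handedness, which is skipped here.)

    Code (CSharp):
    using UnityEngine;

    public static class TangentSpaceSketch
    {
        // Reorient a decoded tangent space normal using the surface's
        // normal and tangent (the TBN basis).
        public static Vector3 PerturbNormal(Vector3 surfaceNormal,
                                            Vector3 surfaceTangent,
                                            Vector3 mapNormal)
        {
            // Third basis vector, perpendicular to the other two.
            Vector3 bitangent = Vector3.Cross(surfaceNormal, surfaceTangent);

            // The map's X leans along the tangent, Y along the bitangent,
            // and Z along the surface normal ("up" from the polygon).
            return (surfaceTangent * mapNormal.x +
                    bitangent      * mapNormal.y +
                    surfaceNormal  * mapNormal.z).normalized;
        }
    }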


    Take the image below. The black line is a cross section of a model. The yellow lines show the direction of the actual polygon surface normals. This is what the engine sees when it's lighting a regular surface.



    Now if we were to add a normal map to our model, the engine would see the following. The original base normal directions are still taken into account, but are now combined with the directions stored in the normal map, giving the impression of far more detail.



    Ok, what about bump maps then?
    Bump maps are how things used to be done. They are grayscale, with each value representing a specific height. Unlike normal maps they contain no directional information, so each pixel of a bump map knows nothing about the actual light direction. For the purposes of games, and Unity specifically, they're useful for generating normal maps from height data (see the next post and the sketch below), providing the height information for parallax and relief map shaders, generating terrain heights, and creating actual displacement geometry when used on high polygon objects.
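
    Since a height map is the usual source for a generated normal map, here's a rough C# sketch of the finite difference conversion such tools perform (hypothetical helper; assumes the texture is readable, and the strength parameter plays the role of a bumpiness slider):

    Code (CSharp):
    using UnityEngine;

    public static class HeightToNormalSketch
    {
        // Build a tangent space normal map from a grayscale height map
        // by comparing each pixel's height with its neighbours.
        public static Texture2D Convert(Texture2D height, float strength)
        {
            int w = height.width, h = height.height;
            var result = new Texture2D(w, h, TextureFormat.RGB24, false);

            for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
            {
                // Slopes along X and Y, clamped at the borders.
                float left  = height.GetPixel(Mathf.Max(x - 1, 0), y).grayscale;
                float right = height.GetPixel(Mathf.Min(x + 1, w - 1), y).grayscale;
                float down  = height.GetPixel(x, Mathf.Max(y - 1, 0)).grayscale;
                float up    = height.GetPixel(x, Mathf.Min(y + 1, h - 1)).grayscale;

                var n = new Vector3((left - right) * strength,
                                    (down - up) * strength,
                                    1f).normalized;

                // Pack -1..1 into 0..1 (see the packing sketch above).
                result.SetPixel(x, y, new Color(n.x * 0.5f + 0.5f,
                                                n.y * 0.5f + 0.5f,
                                                n.z * 0.5f + 0.5f));
            }
            result.Apply();
            return result;
        }
    }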

    Further Reading
    Normal Maps at Polycount - from Farfarer
     
  2. Frank Oz

    As everyone on here uses different programs - Blender, Max, Maya, ZBrush, Mudbox etc. - I think it would be useful to have a thread here which focuses only on normal maps and any special requirements that may be needed when bringing them over from different applications.

    The reason for this, as many will know, is that different applications generate and use normal maps differently. By this I mean some will flip color channels, some will flip UV coords (which applies to textures in general, but whatever), and chances are what works in one place doesn't look quite right in another.

    Obviously I haven't used, and can't use, every program on the planet. So please post the app you use, and anything you need to set up which differs from the normal (pardon the pun) to correctly generate normal maps that Unity can use. The more programs we can include here the better. If the thread is popular, I'll update this post with more info as people provide it, to save others from trawling through the thread. With luck it can be useful to someone.


    Unity 3.x
    Why include Unity? Because you can generate normal maps within Unity from any image you've imported. This method uses the image's alpha channel: Unity reads it to estimate the height of each pixel, then discards the alpha channel and leaves you with a usable normal map in the correct tangent space format.

    As just mentioned, the alpha channel of any texture tagged as a normal map will be removed. If you previously used the alpha on such a texture, for example with a parallax shader, you'll find you have to put that alpha into another texture for it to be used. Alternatively, you can duplicate the normal map, change the duplicate's type to a regular texture and use it for its alpha channel only.

    Normal maps generated within Unity via this method are not as accurate as ones baked directly from geometry, but they will often appear cleaner, and you get the benefit of quickly fine-tuning the depth of the normals. Where possible, I'd recommend generating your normal maps this way unless you have a reason not to (as with high poly modeling). A scripted version of the same import switch is sketched below.
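
    For reference, the same switch can be driven from an editor script through the TextureImporter. This sketch uses property names from modern Unity versions (they differ from the Unity 3.x Inspector of the time), so treat it as illustrative:

    Code (CSharp):
    using UnityEditor;
    using UnityEngine;

    public static class NormalMapImportSketch
    {
        [MenuItem("Assets/Mark As Generated Normal Map")]
        static void MarkSelected()
        {
            string path = AssetDatabase.GetAssetPath(Selection.activeObject);
            var importer = AssetImporter.GetAtPath(path) as TextureImporter;
            if (importer == null) return;

            importer.textureType = TextureImporterType.NormalMap; // tag it
            importer.convertToNormalmap = true; // generate from grayscale
            importer.heightmapScale = 0.25f;    // the depth fine-tuning knob
            importer.SaveAndReimport();
        }
    }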

    It is important to be aware that prior to version 3, Unity didn't consider a normal map to be different from any other texture. This has now changed. If, after converting an old 2.x project to version 3, your normal mapped surfaces don't look correct, it's because Unity needs to know they are normal maps. Changing the texture format of each imported texture, or clicking the Fix Now button that shows in your materials, will do this for you. In some cases you may also be shown a window with a list of textures which need to be converted; let it do its thing.

    Finally, do not use compression on these textures. Normal maps are very sensitive to slight color changes, and compression can and will visibly damage the end result.

    -=\|/=-


    ZBrush 4.0
    In many workflows, ZBrush requires you to flip both the Green channel AND the texture before exporting. A lot of tutorials will mention the FlipG button, and chances are you have it defaulted to do just that (I know I did). But this isn't required for Unity. The difference in the normal map is small but noticeable (bumps become pits, pits become bumps).

    So when generating your normal maps via ZBrush, ensure none of the color channels are flipped and that you have the Tangent button selected.

    Flipping the entire texture vertically, however, is still required to ensure it correctly matches your chosen UVs (it will then look messed up in ZBrush, but work correctly in other programs, including Unity).

    -=\|/=-


    Microwave (Lightwave Plugin)
    Default normal map settings work correctly within Unity, but be sure to set the normal map to Tangent mode.

    -=\|/=-


    nVidia Normal Map Filter (Photoshop Plugin)
    Default settings will work correctly. Make sure you have your alpha channel selected when generating your normal map, or it won't take your height data into account: check Alpha Channel as the Height Source inside the plugin.

    -=\|/=-


    To be continued.........?
     
  3. afalk

    Outstanding information, thanks for sharing this!
     
  5. 2dfxman1

    A tip:
    A normal map defines the angle of a surface; it does not define depth.
    So if you take a plane and extrude it straight down, the normal will not change and you will not see that extrusion on a low-poly mesh.
    Hence why people use subdivision: that way, no matter what, the edges are never 100% sharp, so the change in angle can actually be captured.
     
  6. ivanzu

    Thanks, Frank Oz.
     
  8. XRA

    Normal maps exported from 3D Coat seem to need the Green channel inverted/flipped for proper display in Unity.
     
  9. pickledzebra

    Thank you Frank Oz. This is perhaps one of the most useful posts I've encountered.
     
  10. Farfarer

    Another great post on Polycount from EarthQuake about baking normal maps and the difference smoothing groups and low poly meshes can make:

    Understanding averaged normals and ray projection or "Who put waviness in my normal map?"


    It's also probably worth noting WHY the normal maps lose their alpha channel when tagged as normal maps in the Inspector;

    As normal maps are very sensitive to slight changes in their colour, regular compression can often cause very noticeable flaws. With DXT1, the red, green and blue channels only get 5, 6 and 5 bits respectively with which to store what is originally an 8 bit value - that's a lot of lost bits.

    With DXT5, it's the same for RGB, but the alpha channel receives a full 8 bits (i.e. it remains uncompressed).

    So what's done when you tag a texture as a normal map is that it ditches the blue channel completely and puts the red channel into the alpha channel. This means that you now have the two most important channels - the Red and Green - receiving the least possible compression.

    This is a technique commonly known as "swizzling".

    Inside the shader, the blue channel is recreated on the fly: as normal maps should be normalized, if you know the values of two components - the red and green - and that the vector's length should be exactly 1, you can recreate the third - the blue. This is what happens inside the UnpackNormal() function you'll see in the shader code; a sketch of the math follows.
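
    In C# terms the reconstruction looks roughly like this (the real UnpackNormal() lives in Unity's shader includes; this sketch just mirrors the math for a swizzled texel):

    Code (CSharp):
    using UnityEngine;

    public static class UnpackNormalSketch
    {
        // X lives in the alpha channel (where red was moved), Y stays in
        // green, and Z is rebuilt from the unit-length constraint:
        // z = sqrt(1 - x*x - y*y).
        public static Vector3 Unpack(Color packed)
        {
            float x = packed.a * 2f - 1f;
            float y = packed.g * 2f - 1f;
            float z = Mathf.Sqrt(Mathf.Max(0f, 1f - x * x - y * y));
            return new Vector3(x, y, z);
        }
    }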

    So where, with DXT1, you had R/G/B at 5/6/5 bits, with DXT5 you effectively have 8/6/8 bits (with the downside of a slightly larger file, as it now carries an uncompressed alpha channel).

    The net result is that, while sacrificing the alpha channel, you drastically improve the compression quality and thus the fidelity of your normal map, giving fewer compression artifacts and better looking normal maps all round.
     
  11. pinkhair

    For Lightwave without Microwave, the free plugin DPKit comes with a Normal Cast node which produces Unity usable maps with the default settings, though you do need to set render flags to use the surface baking camera with it.
     
  12. EddieChristian

    I'm looking for an easy method to get mirrored-UV normal maps out of ZBrush and into Unity.
     
  13. Marco-Sperling

    Hello there,
    I've drawn a fair amount of my knowledge about the technical aspects of normal mapping from two sources:
    http://www.svartberg.com/tutorials/article_normalmaps/normalmaps.html
    and
    http://www.polycount.com/forum/showthread.php?t=107196

    But Unity doesn't cease to impress me... in a negative sense, that is ;)

    If you followed the explanations given in the second link, you know that a UV split equals a hard edge. If you don't want to split UVs (thus keeping a lower vertex count), you pretty much have to smooth the corresponding edge(s).
    As long as I follow this basic rule, everything looks fine inside Maya - no seams whatsoever, even on 90 degree angles. As soon as I enter Unity, however, the shading falls apart:

    First picture shows the desired mesh smoothing/workflow. Second picture shows what is needed to display the normals correctly inside Unity.
    (attached: Unity_TangentSpaceNormalMappingIssues_01.jpg and Unity_TangentSpaceNormalMappingIssues_02.jpg)

    The weird thing is that I think the first workflow might actually work IF Unity were able to import the tangents and binormals from the given FBX file.
    But either Maya does not export them (although it's checked in the export dialogue) or Unity fails to import them, because I get a warning about missing tangents once I select "Import" in the Tangents drop-down inside Unity.
    If anyone knows what I might be missing here, please don't hesitate to write me a line. Thanks.
    If we come to the conclusion that this is a bug, I will file a report.
     
  14. Dantus

    It looks as if the normals of the vertices are wrong. If that is the case, it doesn't matter how good your normal map is. Could you show how the object looks with a diffuse shader?
     
  15. Marco-Sperling

    Sure. The first, to the left, is the shaded mesh using the approach/workflow I'd like to use.
    The second mesh, to the right, uses the workflow that produces normal maps that work in both Maya and Unity - but which results in a lot more vertices.

    (attached: Unity_TangentSpaceNormalMappingIssues_03.jpg)
     
  16. Dantus

    You have to use the approach with more vertices. The reason is that each vertex has exactly one normal; that's the way Unity works, and the way all realtime 3D applications work. Each corner of your object needs three normals to be visually correct, and as a consequence you need three vertices per corner.
     
  17. Marco-Sperling

    Thx for your thoughts. But according to this quite long elaboration, you do not necessarily need splits if you can live with shading gradients to some extent:
    http://www.polycount.com/forum/showthread.php?t=107196

    I can live with these gradients on smaller assets as long as I don't get tangency breaks on the edges resulting in ugly seams - which Unity is giving me while Maya is not.

    To sum it up: you should be able to bake normal maps with the first approach, and it should work in any engine that has correct tangent calculations.
    The Maya viewport actually shows that this process works. I can get clean shading results with both approaches as long as I follow the one rule: UV split = hard edge.
    If I break this rule, Maya gives me seams too.

    Of course, for more prominent/larger assets I would use support edges or more UV splits. But smaller stuff just isn't worth the effort. As long as there aren't obvious seams.

    Edit:
    Regarding the storage of vertices and normals: what you've said is true to some extent. The graphics card gets fed a list of vertices and a list of triangles made of indices into the vertex buffer.
    This way you do not need to store three vertices for a corner if all three triangles share the same vertex properties.
    While the graphics card may re-process a vertex for each triangle that shares a corner, it is always looking up the same vertex information for all three triangles at that corner - IF the vertex properties don't differ.
    That should be the case for vertices with smoothed/averaged normals that share the same UV coordinate and vertex color.
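
    A sketch of that trade-off in Unity Mesh terms (hypothetical example): a quad built from four shared vertices gets one averaged normal per corner, while splitting it into two unconnected triangles costs six vertices but lets each corner carry its own data.

    Code (CSharp):
    using UnityEngine;

    public static class VertexSharingSketch
    {
        // Four shared vertices: both triangles index the same corners,
        // so RecalculateNormals() averages them (smooth shading).
        public static Mesh SharedQuad()
        {
            var mesh = new Mesh();
            mesh.vertices = new[] { new Vector3(0, 0, 0), new Vector3(1, 0, 0),
                                    new Vector3(0, 1, 0), new Vector3(1, 1, 0) };
            mesh.triangles = new[] { 0, 2, 1, 1, 2, 3 };
            mesh.RecalculateNormals();
            return mesh;
        }

        // Six vertices: each triangle owns its corners, so normals (or
        // UVs, or colors) can differ across the shared edge (hard edge).
        public static Mesh SplitQuad()
        {
            var mesh = new Mesh();
            mesh.vertices = new[] { new Vector3(0, 0, 0), new Vector3(0, 1, 0), new Vector3(1, 0, 0),
                                    new Vector3(1, 0, 0), new Vector3(0, 1, 0), new Vector3(1, 1, 0) };
            mesh.triangles = new[] { 0, 1, 2, 3, 4, 5 };
            mesh.RecalculateNormals();
            return mesh;
        }
    }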
     
  18. Dantus

    Unfortunately I won't be much more help, because I don't see this as a Unity issue. Unity handles vertices and normals differently because that's the way all realtime applications work, as far as I know; Maya and many other 3D applications do it differently.
    You may think about starting a new thread to discuss the workflow from Maya to Unity. I know that many have had comparable issues with other 3D applications. Maybe one or another person will share their experience and how they solved that kind of issue.