
Calculating Unity's tangent basis for xNormal?

Discussion in 'Shaders' started by SONB, Jan 12, 2010.

  1. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    You'd be surprised: "proper normal mapping", as you called it, is very much today's technology.

    As far as I know... CryEngine3 and idTech4 are the only two engines that provide bakers synced to their tangent basis. And the CryEngine3 one is simply an xNormal plugin, like the Unity one is. 3DS Max has only very recently been able to sync its own internal viewport and its baker. And Autodesk own both Maya and 3DS Max - their tangent bases don't match up.

    Unreal Engine's tangent basis still remains a mystery to most.

    People didn't know about this stuff until very recently. They just thought that was how normal maps worked: you had to do fiddly stuff to get them looking smooth.

    I think you're under the impression that Unity have somehow implemented a bad tangent calculation method. They haven't. They've just done their own one (as every other game engine and 3D app developer has done).



    Regarding your issue:
    You don't have to make every UV split a hard edge. You need to make every hard edge a UV split. (i.e. you can have UV splits that aren't at hard edges, but you shouldn't have hard edges that aren't at UV splits).

    You'll always get better-looking results from more UV shells and hard edges, regardless of whether you use UnityTSpace or MikkTSpace.

    My method doesn't magically make that not true; it just ensures that the normal map counters it as much as possible. So if you ran the test from that first image using my plugin, the results would, technically, look even better than MikkTSpace's.

    It's just the way normal mapping works. Of course, all those UV splits and hard edges make it a pain in the arse to texture and create more vertices for the engine to cope with, which is why you want to avoid them.

    Aside from that, I'll look into it.
     
    Last edited: Jun 22, 2012
  2. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    About other engines... at least they have some tools that let artists bake normal maps the way they should be baked. We can almost say that Unity has one now, thanks to you.

    But Unity didn't have one all this time. What does that mean? It means our team has a lot of assets in a project that's currently frozen. Why is it frozen? Because we suddenly discovered that these assets don't display normal maps properly, and there was no tool to make proper ones. Now we need to re-bake all the maps to continue the project, and even create another UV layout for some of them. Which is quite a problem, considering all the textures are already painted, so we'll lose texture quality in the map transfer. That's just a real-life example.
    I'm not posting all this because I have nothing better to do. I posted it because we really need a tool for baking normals - one Unity still wouldn't have if you hadn't tried so hard to write this plugin.

    About Autodesk products... I don't even want to talk about them. For the past few years, what they publish as a working product is actually a very buggy, constantly crashing beta of something they promise to eventually finish. If Unity is merely "a little less than perfect", then the latest versions of Maya, MotionBuilder etc. are bad. Very bad. So bad that I can't even call them software; they're just a bunch of very buggy code randomly taken from here and there.
    It's no surprise to me that the tangent bases in Max and Maya don't match.

    You're right, I was, a few pages back in this thread. But now I simply think: "Current normal maps in Unity don't work with any other software. They'll need to fix it anyway. Isn't it better to simply replace the current normal-mapping algorithm with something that's going to become a standard?"

    As for hard and soft edges... It's just simpler to create the UV unwrap with seams at key places (along the edges that are supposed to be hard), and then convert all the edges at these seams to hard. Of course, there'll be some hard edges that aren't strictly necessary. But they don't hurt, right?
     
    Last edited: Jun 22, 2012
  3. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Well, you said you didn't notice a difference before, so I'm not sure why you're freezing your assets. This is simply a technically better way to generate your normal maps; if you've not noticed flaws before, it shouldn't be too big a deal. It's not Unity's "fault". And even then, all you'd have to do is re-bake your normal maps using this plugin rather than MikkTSpace.


    Hard/soft edges... create them as you like. It makes no difference.

    The only "rule" (read: guideline) is that if you have a hard edge on your model, you should have a split in your UVs along that hard edge, with a gap in the UVs to allow for padding.

    The reason for this is that if you have a hard edge and no split in your UVs, you get a seam on the normal map. This can cause issues where the normal map is sampled from pixels that are right on that hard edge's UVs. The result is that on either side of the hard edge, you can get shading flaws as it samples normals that aren't meant for that polygon.

    You do NOT have to create a hard edge wherever you have a split in the UVs. It will make absolutely no difference to the outcome. Only the reverse is true:

    Split in shading (i.e. hard edge)? Split the UVs.
    Split in UVs? Doesn't matter.


    That's just standard practice for UV mapping when you're using normal maps; it doesn't matter what you bake your normals in or what you render them with.
     
  4. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Ok, I've had a look and... well, whatever you're exporting with is feeding in awful normals. It's not an issue with my plugin; it's an issue with your mesh. I get the same bad results using the MikkTSpace plugin as I do using mine.

    Not something that I've broken, I'm afraid.

    Here's the mesh: on the left with your vertex normals, on the right after loading the model into modo and re-exporting. This is simply using the VertexLit shader.

    You can see the massive difference in shading. Your app's giving it very strange normals indeed.
     
    Last edited: Jun 22, 2012
  5. Kuba

    Kuba

    Moderator

    Joined:
    Jan 13, 2009
    Posts:
    416
    Thanks!
     
  6. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    I didn't have any trouble with normal mapping before I started working in Unity. I never even bothered creating a special UV layout for normal maps. Yes, it's standard practice, and yes, it's my fault that I didn't follow it. But I didn't notice any artifacts before either. So when we started work on our (currently frozen) project, I couldn't even imagine we could have such big trouble with normal maps if we didn't create "normal-mapping oriented" UVs. And therefore we didn't try to create UV layouts optimal for normal mapping; we made them optimal for texture painting. I mean, fewer pieces, no hard edges at all.
    It was quite a surprise when we imported our textured assets and found artifacts in the normal mapping. To fix it, we now need to create another UV layout for each "problem" asset and transfer all the textures (not only the normal maps) from the old layouts to the new ones. Which means an obvious loss in texture detail.

    It does make a significant difference. If we have no hard edges, then there's heavy normal interpolation across any two polygons that meet at a steep enough angle, and that's where artifacts start to appear. So there are two solutions: either add support edges (which increases the polycount and is therefore not an option) or make an edge hard wherever the angle between polygons is steep enough (which forces us to add a UV seam and therefore increases the number of UV shells).

    Oops... Looks like you're right. I just applied a script to the mesh that rebuilds it completely (to kill any bugs if they're there), and yes, it looks different even in the Maya viewport. Then I re-baked the normals, imported it into Unity and guess what?... All the artifacts are gone!
    I have to apologize for misleading you with a false bug report. Sorry about that; I ran the test with less attention than I should have.

    Here's the final comparison table from me:



    Well, to be honest, not all the artifacts are gone. There are still some. But they're very difficult to notice, and they appear only in areas where I deliberately created extreme normal interpolation for testing purposes. I also think they may appear because I'm using 8-bit PNG. Maybe 8 bits just isn't enough to represent normal vector variation at extreme angles.
    In a real production example, I think, if you create hard edges at least where they're really supposed to be hard, you shouldn't notice any artifacts at all.

    So I can confirm:
    Hallelujah, we now have a production-ready workflow for using normal maps in Unity!
    As for me, I've been waiting for this moment all these months, and I'm very grateful to you, Farfarer, for the amazing work you've done. I don't want to promise anything, but if our current game (BallBot) brings in more money than I expect, maybe there's a way I can thank you more "practically"? A donation for your plugin or something like that?
     
  7. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Actually, if you keep your UVs and normals unsplit, adding a bevel costs just the same as adding a shading split OR a UV split.

    The engine has to duplicate those vertices anyway. If you keep the normals and UVs unsplit, then it adds exactly the same number of vertices.

    Tri-stripping makes it just as efficient to add a smooth, UVed bevel as it is to make a smoothing split.
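    If you want to sanity-check that vertex math on your own meshes, here's a minimal sketch (a hypothetical helper, not part of the plugin) that compares the vertex count Unity actually stores against the number of unique positions; the difference is exactly the vertices duplicated along normal and UV splits:

    Code (csharp):
    using System.Linq;
    using UnityEngine;

    // Hypothetical helper: report how many vertices the engine stores for a
    // mesh versus how many unique positions it has. The difference is the
    // number of vertices duplicated along shading (normal) and UV splits.
    public static class VertexSplitReport
    {
        public static void Report (Mesh mesh)
        {
            int unique = mesh.vertices.Distinct().Count();
            Debug.Log(mesh.name + ": " + mesh.vertexCount + " engine vertices, "
                      + unique + " unique positions ("
                      + (mesh.vertexCount - unique) + " added by splits)");
        }
    }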



    As for paid compensation, thank you, but I'm contractually forbidden from making any money off things I create, so no worries there :) (unless you want to donate me a Unity Pro license - mine runs out in 56 days :/)
     
    Last edited: Jun 22, 2012
  8. Findus

    Findus

    Joined:
    Jun 23, 2012
    Posts:
    111
    Thanks a lot Farfarer, you rock! :D


    @Unity developers:

    If you ever decide to implement another normal mapping standard, please keep the old one as an easily accessible option.
     
  9. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    Farfarer,
    Again, I don't know how it works under the hood, but I know what I see. The same mesh with only smooth edges (a single UV shell) and with hard edges (obviously, split into several UV shells) renders completely differently in Unity. The second way gives correct shading; the first one does not.
    It was extremely noticeable when I baked with MikkTSpace. Now it's almost imperceptible, but... only almost. It's very small, but it's still there. As I mentioned before, it may be caused by the 8-bit limitation of PNG, but it becomes visible wherever a lot of normal interpolation occurs. I'll try EXR as soon as I have some free time for testing and see what I get.

    A Unity Pro license... I don't know, it's quite a significant price. I won't promise anything right now - I don't know how successful our game will be. Better to have a pleasant surprise if I make much more money from our game than I expect, than a promise that's hard to fulfill if I don't.
     
  10. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Heh, I was joking about the license.

    Yeah, EXR is over the top for normal maps. TGA or BMP will do you fine. Unity's going to compress them for you anyway.
     
  11. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    Hm... if you think 8-bit BMP (or lossless PNG) is enough, then I don't know what's causing the remaining artifacts. You can see them even in the screenshots I posted above. They're difficult to notice, but they're there:
     
  12. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    There's simply a limit to what tangent space normal maps can do. Synced tangent basis helps, a lot, but it's not magical.

    Those errors will be down to mipmapping of the normal map and extreme normal variation.

    Not a lot you can do about it, other than adding a hard edge/UV split or a smoothed support loop.
     
  13. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
  14. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    Thanks a lot for the 64-bit build. Yeah, I know I'm late with this post, but it's never too late to thank someone, right?
    It would be awesome if someday you also made your plugin work with the hxGrid renderer. I know there's no performance benefit, but it would make it much easier to set up baking jobs for xNormal (I mean, export the settings to XML and then call xnormal.exe from a BAT file).

    Off topic:
    Just noticed a mention of me in your blog post. It's surprising, but pleasant. :)

    The last word about artifacts:
    With a completely smooth mesh (no hard edges at all), we still can't get good results, even with your plugin. Yes, it does make normal maps look better, but it doesn't remove the artifacts completely; some of them are always there. The worst thing is that you never know how noticeable these artifacts will be when you import your mesh into Unity. They may be almost invisible, or they may be very noticeable.
    The only solution we've found is the old-school method: adding hard edges wherever they're supposed to be and, correspondingly, adding a UV seam at each hard edge.
    As a result, there's a tremendous number of UV shells, which makes texturing a real pain in the ass. So, for anyone interested...

    Here's the workflow our team has arrived at:
    • We have one UV layout for texturing. It has as few UV shells as possible.
    • Texture artists get the mesh with this UV layout. We also send them a draft normal map and AO map, just to use as reference while texturing.
    • If a texture artist adds details to the normal map, he does it on a separate Photoshop layer.
    • We have a second (in-game) UV layout, which has all the necessary UV splits at each hard edge.
    • After the texture artist has finished, we manually transfer all the maps (except the normal map) from the mesh with the 1st UVs to the mesh with the 2nd UVs (using xNormal).
    • Transferring the normal map details to the mesh with the 2nd UV layout takes a few more steps:
      1. We merge all the normal-map detail layers on top of a flat "blue-violet" (127, 127, 255) background (not on top of the main normal map). This way we get a normal map that contains only the added details.
      2. Then we re-bake this "normal details" map from the 1st UVs to the 2nd UVs in xNormal. The important things to mention here: you really need to
        • import the mesh with the 1st UVs as the highpoly and apply this "normal details" map as the texture to bake;
        • make sure the "Bake texture is tangent-space normal map" checkbox is ON;
        • in the Baking options tab, select "Normal map", not "Bake base texture". This is very important because, once baked, a normal map can't be rotated in any way, which, obviously, is what happens when you move to a completely different UV layout. So you can't just transfer a normal map texture from one UV layout to another; you have to re-bake it.
      3. Obviously, we need to create the main normal map for the 2nd UV layout, using Farfarer's UnityTSpace, of course. If you apply it to your mesh in Unity, it should already look correct (unlike the 1st one), just without the details.
      4. Finally, we combine the "normal details" texture and the main normals texture for the 2nd UV layout. The important thing is that you can't simply place one normal map above another in Photoshop using any of the blending modes. You need to do it as described in this article (see 4.4.2 "Normal maps in photoshop", starting from the words "Another thing you could do is to overlay two normal maps inside photoshop"); there's also a shader-side sketch of the same idea just after this list.
    • As the last step, we combine some textures into one where needed. For example, Unity's default shaders read specular from the main texture's alpha, so we need to put the specular map in the alpha channel of the main texture. We use Nuke for this, since Photoshop breaks the main texture: wherever the specular is completely black, it makes the main RGB black too.
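    If you'd rather combine the two maps in a shader than in Photoshop, here's a minimal Cg sketch of the same renormalized-overlay idea (my own illustration; the sampler names are made up): add the detail map's XY perturbation onto the base normal and renormalize, so a flat detail texel leaves the base unchanged.

    Code (csharp):
    sampler2D _BumpMap;      // main normal map, baked for the 2nd UV layout
    sampler2D _DetailBump;   // re-baked "normal details" map

    // Sketch: combine a base tangent-space normal map with a details-only
    // map that was baked over a flat background.
    fixed3 CombinedNormal (float2 uv)
    {
        fixed3 baseN   = UnpackNormal(tex2D(_BumpMap, uv));
        fixed3 detailN = UnpackNormal(tex2D(_DetailBump, uv));
        // A flat detail texel unpacks to (0, 0, 1), so it adds nothing.
        fixed3 n = fixed3(baseN.xy + detailN.xy, baseN.z);
        return normalize(n);
    }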

    I know that some of the steps described here are standard practice and many of you already know them. I just tried to describe the entire workflow, step by step, for texture artists who don't want to learn all the technical stuff and would prefer a final recipe.
     
    Last edited: Jul 25, 2012
  15. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    I think you're running into the limitations of tangent-space normal maps, not limitations of my plugin or of Unity. Angles greater than 90 degrees are still going to cause issues, regardless of the solution; on a complex surface, even 90 degrees can cause issues. It's just too big a change in surface normal for the normal map to cope with gracefully.

    There's also the mathematical issue of how the vectors are transformed in the shader, which there's nothing I can do about for Unity's built-in shaders.

    Essentially, the light vector (or view vector, or anything else you want to take from object to tangent space, I'm just going to use light vector as the example here) is transformed from object to tangent space per-vertex and the resulting vector is fed through to the pixel shader as an interpolated value between the 3 values at the vertices of the given triangle.

    However, in the baker, the surface vector is transformed from object to tangent space per-pixel.

    Unfortunately, that results in a slight discrepancy, because the two transforms only agree exactly at the vertices.

    If you're going to get that anal about small discrepancies, then you'll have to write custom shaders that feed the normal/tangent/binormal/light/view vectors into the pixel shader and do the transform there per-pixel.
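    Something along these lines: a bare-bones vertex/fragment sketch (mine, not a Unity built-in; it assumes a single directional light and uniform scale) that passes the TBN basis through and transforms the light vector per-pixel, the way the baker does.

    Code (csharp):
    Shader "Debug/PerPixelTangentTransform" {
        Properties {
            _BumpMap ("Normal Map", 2D) = "bump" {}
        }
        SubShader {
            Pass {
                Tags { "LightMode" = "ForwardBase" }
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _BumpMap;
                float4 _BumpMap_ST;

                struct v2f {
                    float4 pos : SV_POSITION;
                    float2 uv  : TEXCOORD0;
                    float3 tWS : TEXCOORD1;   // world-space tangent
                    float3 bWS : TEXCOORD2;   // world-space binormal
                    float3 nWS : TEXCOORD3;   // world-space normal
                };

                v2f vert (appdata_tan v)
                {
                    v2f o;
                    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                    o.uv  = TRANSFORM_TEX(v.texcoord, _BumpMap);
                    // Build the basis per-vertex, but don't transform the
                    // light vector here - pass the basis through instead.
                    o.nWS = normalize(mul((float3x3)_Object2World, v.normal));
                    o.tWS = normalize(mul((float3x3)_Object2World, v.tangent.xyz));
                    o.bWS = cross(o.nWS, o.tWS) * v.tangent.w;
                    return o;
                }

                fixed4 frag (v2f i) : COLOR
                {
                    // Renormalize the interpolated basis, then take the light
                    // vector into tangent space per-pixel, matching the baker.
                    float3x3 tbn = float3x3(normalize(i.tWS),
                                            normalize(i.bWS),
                                            normalize(i.nWS));
                    float3 lightTS = mul(tbn, normalize(_WorldSpaceLightPos0.xyz));
                    fixed3 n = UnpackNormal(tex2D(_BumpMap, i.uv));
                    float d = saturate(dot(n, lightTS));
                    return fixed4(d, d, d, 1);
                }
                ENDCG
            }
        }
    }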


    Also, as evidenced by the models you've sent me, whatever you're using to export is giving really wonky vertex normals, and that's going to create major issues both with baking and with lighting in general. I'd look at fixing up the vertex normals after your model's been built and before exporting; that should fix a lot of your issues.


    Also, "flat" normal map is 128, 128, 255... not 127 (see: http://wiki.polycount.com/NormalMap?action=AttachFile&do=view&target=normalmap_127seam.jpg from http://wiki.polycount.com/NormalMap ). And if you're wanting to properly merge two together, I'd look into CrazyBump (see: http://wiki.polycount.com/NormalMap...t=nrmlmap_blending_methods_RTTNormalMapFX.png from the same page on the PolyCount wiki... Paul Tosca's method - the method you are following - is in the bottom right for comparison).


    Aaaalso, Photoshop does not break anything by putting the spec map into the alpha channel of your texture. Are you certain you're doing it right? Go to the Channels palette; if there isn't already a channel called Alpha 1, hit the Add New Channel button at the bottom of the palette. Then, with the Alpha 1 channel selected, paste your spec map in. To get back to RGB, select the RGB channel.
     
    Last edited: Jul 25, 2012
  16. UNITY3D_TEAM

    UNITY3D_TEAM

    Joined:
    Apr 23, 2012
    Posts:
    720
  17. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    Farfarer,

    First of all, in my last post I didn't mean that there's something wrong with your plugin. Yes, these are limitations of the tangent-space normal-mapping technology itself; I agreed with you the first time you said that, I just didn't think it worth mentioning.
    What I tried to do is share a complete workaround for the proper use of normal maps in Unity, to get the maximum out of the technology (pushing it to its boundary).

    About the shader... Yes, I used your shader to test the normal maps, and we're going to rewrite all our shaders to make them calculate normal maps the same way.
    By the way, is it possible at all to make your plugin calculate normals per-vertex, or at least simulate that behavior? I mean, do it the same way Unity's default shaders interpolate the normal vector across a polygon, so instead of per-pixel normal calculation in Unity we'd use per-vertex calculation in xNormal. As I understand it, your current workaround forces us to use per-pixel calculations for everything in the shader, which is very inefficient.

    As for "broken models"... Yes, there was something very weird in normals when I sent you test meshes. I have confirmed that in the next post after you pointed me in that direction.
    We're exporting from maya. Now, after lowpoly mesh is created and UV-mapped, I launch some script which recreates the entire mesh polygon-by-polygon. To kill some bugs with normals if there are some. And only then I set hard edges where needed.

    As for the links to the Polycount wiki... Looks like you already gave them to me and I lost them in my pile of bookmarks. My fault, sorry about that.
    I'll take a closer look at the article this week and then update my previous post to make the baking instructions more technically correct.

    As for Photoshop vs Nuke... I'm not a Photoshop guy, that's true. When I tried to put the specular into the alpha using PS, what I did was add a mask to the RGB layer and place the specular texture there.
     
  18. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    It just seems a very complex workaround for the sake of texturing. I fear your texture artists might just have to get used to working with seams if you want an easy life; I certainly haven't had to jump through hoops like that to get decent results.

    Bear in mind that an extra edge bevel that lets you get a harder-looking edge but keeps the smoothing and UVs intact is, as far as the game engine is concerned, exactly the same cost as a hard edge and/or a UV split. You might find it easier to just do that, rather than the complex two-UV-set, twice-baked normal map and diffuse map method you're using now simply to get around a few artefacts.

    As for fixing the bake to match vertex-transformed rather than pixel-transformed light vectors? I don't think it's possible, and if it is, the maths involved is waaay beyond me...

    And yeah, a layer mask simply masks the layer (like an alpha map for that layer alone); alpha stuff has to be added to the alpha channel of the image, not the layer.
     
  19. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    Yeah, this workaround is very complex, I don't argue. But it's the only workaround whose results I'm happy with.
    Using extra edge loops? Not a solution. That would raise the polygon count dramatically, and we want to keep polygons as few as possible, not crank them up.
    Using one UV set? We can use one optimized for either texturing or normal mapping. If we use the 1st, texturing becomes an almost impossible task (at least unless you use software like 3D-Coat). If we use the 2nd, I bet you'll get a lot of artifacts, whether you use the UnityTSpace plugin or not.

    This leads to one simple conclusion: if you want good-looking normal maps without raising the polycount, you need two different meshes of the same model: one for texturing, and the other to import into the game engine.

    I'm just trying to get the best normal mapping Unity is capable of. The problem is that Unity just can't render "good enough" normal maps: it renders either a horrible result or a perfect one (if you "jump through hoops" like me).

    The key thing I'm trying to say is that you still have to do extra work to create normal maps that render properly in Unity. This work is the same as if you weren't using your UnityTSpace plugin.
    Your plugin eliminates all the remaining, very small, artifacts, making the model perfect. But it can't replace all that "jumping through hoops". Sad but true.
     
    Last edited: Jul 26, 2012
  20. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Extra edge loops may technically add to the polycount, but if the smoothing and UVs remain unbroken, they don't add to the vertex count, which is what the game engine sees.

    Which is why I said it's just as efficient to have a bevel that keeps the smoothing and UVs contiguous as it is to put in a hard edge and UV split.
     
  21. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Is there any chance you could share the cube used in the screenshot for the xNormal plugin? I'm trying to wrap my head around this; using your shader, I made a similar-looking mesh, but I just can't get the correct results.
     
  22. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Here's a screenshot. I'm using the shader from the previous page (also testing with the built-in and a custom one). I recreated the simple mesh and laid out the UVs the same way. I'm trying to average (unify? a single smoothing group?) all the normals. I export a single smoothing group from Max; in xNormal I've tried a few settings, and I'm using Unity tangent space, but I just can't get the smooth face you get on the previous page.

    $1.jpg


    *edit*

    I just noticed that the top-right-facing face is pretty smooth. All the rest pick up the triangulation. Am I not looking at these properly, or is something different happening in the sample image?

    *edit2*

    Attaching another example, max -> xnormal -> unity

    $2.jpg


    It actually looks very much like what I expected (based on 3Point shaders), but I can't wrap my head around the triangulation. Unfortunately, this is with imported binormals.

    This is what I get when I follow the entire procedure, but I must be going wrong somewhere?

    $3.jpg
     
    Last edited: Jun 20, 2013
  23. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Looks like your modelling program is giving out bad normals and UVs.

    Can you send me the low-poly mesh you've imported into Unity and the high-poly mesh you're giving to xNormal, and I'll double-check?

    jamesohare@gmail.com
     
  24. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Got it closer by using OBJ instead of FBX; it still doesn't look as good as 3Point shaders. $4.jpg
     
  25. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Using the FBX files you sent me, it came out similar to your last picture, which is about as good as it'll get from your low-poly mesh and UV layout.

    Ensure you're using the latest version of the plugin - I updated it maybe 3 weeks ago to fix a bug I found which gives results similar to that one.
    Requires xNormal 3.18.1 (I'll update it to 3.18.2 when I get a chance).
    > Plugin
    > xNormal Version

    Also ensure you've set the mesh's Smoothing Angle to 180. In the Inspector, in this order:

    Normals > Calculate
    Smoothing Angle > 180
    Normals > Import
    Tangents > Calculate

    And if you can, normalize the light and view vectors in the pixel shader (or surface shader lighting function) before using them.
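    A minimal sketch of what that looks like in a custom lighting function (the function name here is just illustrative):

    Code (csharp):
    inline fixed4 LightingNormalsOnlyNrm (SurfaceOutput s, fixed3 lightDir, fixed3 viewDir, fixed atten)
    {
        // Interpolated per-vertex vectors lose unit length across a
        // triangle; renormalize before use to avoid subtle shading errors.
        lightDir = normalize(lightDir);
        viewDir  = normalize(viewDir);

        fixed diff = max (0, dot (s.Normal, lightDir));
        fixed4 c;
        c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2);
        c.a = s.Alpha;
        return c;
    }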
     
  26. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Should you be able to get rid of the mirror seam using this workflow?


    I've tried flipping the .w of the tangents with a script that checks whether the UVs fall outside 0-1, but nothing really happens in terms of shading.
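    The script was something along these lines (a hypothetical sketch; it assumes the mirrored half is the part of the mesh whose UVs sit outside 0-1):

    Code (csharp):
    using UnityEngine;

    // Sketch: flip the tangent's w (handedness) on vertices whose UVs lie
    // outside the 0-1 range, i.e. the mirrored half of the mesh.
    public class FlipMirroredTangents : MonoBehaviour
    {
        void Start ()
        {
            Mesh mesh = GetComponent<MeshFilter>().mesh;
            Vector2[] uvs = mesh.uv;
            Vector4[] tangents = mesh.tangents;

            for (int i = 0; i < tangents.Length; i++)
            {
                if (uvs[i].x < 0f || uvs[i].x > 1f)   // mirrored side
                    tangents[i].w = -tangents[i].w;
            }
            mesh.tangents = tangents;
        }
    }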
     
    Last edited: Jul 9, 2013
  27. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    As far as I know you shouldn't need to do any tricks to fix mirroring...
     
  28. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    I keep getting a seam, but I can't quite track it down. It seems to always be there no matter what shader I try: if there's a normal map, there's a seam, even if it's a flat 128,128,255.
     


  29. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Where it's mirrored, is there a split in the UVs or have you simply folded them over? I think you'll need a split in the UVs along the fold for the best results.

    Also, is this in deferred or forward?
     
  30. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Deferred lighting, and there is a split. Half of my model is in 1-2 UV space. It's quite odd: as soon as there's a normal map, there's a seam, no matter which shader.
     
  31. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    If it's still there when you apply a flat normal map, I'm guessing there's a subtle difference in the vertex normals on either side.
     
  32. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    But why? I've tried to make sure it's the same normal. I can't figure out how to isolate just the vertices that are mirrored (maybe by typing a UV coordinate to the 5th decimal?) to check. But if I get the same normal, what else could be the reason?

    *edit*

    I checked: I tracked down the same vertex, and they do have the same normal: 0, 0.95..., 0.288.

    The tangent, though, has the same values, except that the 0 isn't 0 but a very small number.
     
    Last edited: Jul 9, 2013
  33. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    The tangent may be different on either side of the split because it's based on different vertex UVs/positions, but I don't think it matters whether or not they're the same.

    Are the tangents set to Calculate, the normals set to Import and the smoothing angle set to 180 (it will be grayed out if the tangents are set to Import)?
     
  34. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Yes :(

    I can email you the model; I just need to isolate it and prepare it.
     
  35. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    $tan.jpg

    These two are the same vert and have the same normals. The tangent should be the same, shouldn't it? For some reason, the one belonging to the mirrored side is slightly off.
     
  36. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    If it's the same vert (as in, the same vertex index - I think that's what you're outputting in the debug log) then it references the exact same data... so yeah, it should be identical.

    I'm not sure why that's happening :(

    Out of interest, have you tried Handplane? If you bake out an object-space map, save it in a high-precision file format (16-bit PNG works fine) and feed it into Handplane... do you get the same issue with the resulting map?
     
    Last edited: Jul 9, 2013
  37. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    I just tried it, and the result is even weirder. Gotta get some sleep and then make some sense of all this.
     
  38. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    Quite surprisingly, it worked. And this is straight out of Max, through Handplane, looking great in Unity. No need to use xNormal now, if Handplane does the same conversion?

    The seam is technically still there, but almost impossible to pick up; I had to crank the levels to make it obvious.

    $hp3.jpg $hp4.jpg $hp5.jpg

    This is how I ended up mapping it: $fender.jpg
     
    Last edited: Jul 10, 2013
  39. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Yeah, Handplane's probably the best thing to use these days. Luke's a proper programmer and understands all this stuff far better than I do - I'm an artist really, and pretty much copy/pasted the code Aras gave us into an xNormal plugin :p

    Even I don't use my xNormal plugin these days - I just throw everything through Handplane.

    Still not sure why there's a difference...
     
  40. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    :'(


    (it's a tear)

    I started running some tests from scratch; the success I thought I'd made so far just went down the drain.

    First things first: I'm now trying Handplane, and I've started with a simple box. I'm still using your debug shader, but I've made something more elaborate in the meantime, normalizing the lightDir in all of them. It doesn't work on this any more. What I find most confusing is that even when I omit the line where it gets normalized, it still looks wrong. Do you have any idea why?

    $boxes_shader_handplane01.jpg

    $boxes_shader_handplane02.jpg

    $boxes_shader_handplane03.jpg
     
    Last edited: Jul 11, 2013
  41. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    The previous model:

    $hubcap01.jpg
    $hubcap02.jpg
    $hubcap03.jpg



    the shader (from a few pages back, with added specular):

    Code (csharp):
    Shader "Debug/Normal Map Only" {

        Properties {
            _Color ("Main Color", Color) = (1,1,1,1)
            _SpecColor ("Specular Color", Color) = (0.5, 0.5, 0.5, 1)
            _Shininess ("Shininess", Range (0.03, 1)) = 0.078125
            _MainTex ("Diffuse (RGB) Alpha (A)", 2D) = "white" {}
            _BumpMap ("Normal (Normal)", 2D) = "bump" {}
        }

        SubShader {
            Tags { "RenderType" = "Opaque" "Queue" = "Geometry" }

            CGPROGRAM

//          #pragma surface surf NormalsOnly2 exclude_path:prepass
            #pragma surface surf NormalsOnly2
            #pragma target 3.0

            fixed4 _Color;
            float _Shininess;

            struct Input
            {
                float2 uv_MainTex;
            };

            sampler2D _MainTex, _BumpMap;

            void surf (Input IN, inout SurfaceOutput o)
            {
                o.Albedo = _Color.rgb;
                o.Gloss = _Color.a;
                o.Specular = _Shininess;
                o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_MainTex));
            }

            inline fixed4 LightingNormalsOnly2 (SurfaceOutput s, fixed3 lightDir, fixed3 viewDir, fixed atten)
            {
//              lightDir = normalize(lightDir);
                fixed NdotL = dot(s.Normal, lightDir);
                half3 h = normalize (lightDir + viewDir);
                fixed diff = max (0, dot (s.Normal, lightDir));
                float nh = max (0, dot (s.Normal, h));
                float spec = pow (nh, s.Specular*128.0) * s.Gloss;

                fixed4 c;
                c.rgb = (s.Albedo * _LightColor0.rgb * diff + _LightColor0.rgb * _SpecColor.rgb * spec) * (atten * 2);
                c.a = 1.0;
                return c;
            }

            ENDCG
        }

        FallBack "VertexLit"
    }
    I am going quite nuts over this :( I'm trying to learn; I think I have a good understanding of what's going on, but I'm missing some pieces. I know enough math to know that a cross product gives a perpendicular vector, and I understand there are precision issues when calculating these... but I don't quite understand what happens with normal maps when they get packed into just two channels, and I don't quite understand what actually happens with the normals (a matrix multiplies a pixel value?).
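    (For reference, the two-channel packing works like this: with DXT5nm compression, X is stored in the alpha channel and Y in green, and the shader rebuilds Z from the fact that a normal has unit length. Roughly what Unity's UnpackNormal does for compressed maps; a sketch:)

    Code (csharp):
    // Roughly what UnpackNormal does for a DXT5nm-compressed normal map.
    // Z is never stored; it's reconstructed from x*x + y*y + z*z = 1.
    inline fixed3 UnpackTwoChannelNormal (fixed4 packednormal)
    {
        fixed3 n;
        n.xy = packednormal.wy * 2 - 1;
        n.z  = sqrt(1 - saturate(dot(n.xy, n.xy)));
        return n;
    }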

    I've been going through some tutorials, but I couldn't get a vertex/fragment shader to reproduce the same normals, even though it does kinda work with the normals.

    I'd be happy if I could establish a workflow and some proof of concept: if I map, then bake, then take the model into Marmoset or use 3Point shaders, I get pretty good results, but I can't seem to reproduce them in Unity.
    I don't quite understand the baking/modelling techniques I see being used to fix artifacts, since none of them seem to translate to Unity, i.e. none of the fixes seem to fix anything. On the other hand, when I work in Max I get really good results halfway through using the 3Point shader, so I can't figure out why the "hacks" are being used (like additional loops on harder edges, etc.).

    Every day I'm under the impression that I've figured it out, and then I try something new and undo all the progress I've made so far.

    If any sense can be made of these images and shaders, I'd really appreciate the help. I'm going to move to the Polycount forums, since I just saw that you're pretty active there, and the guys who made Handplane seem to know a lot about this subject.

    I'm not completely stupid; I think I'm trying hard, but I just can't seem to make progress :(
     
  42. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    This is simply a limitation of the forward rendering path. I've been trying to work out why this happens - something to do with the light direction being calculated per-vertex and interpolated, rather than per-pixel as in deferred. Still trying to figure out a reliable (and hopefully easy) way to get around it, though :/

    Unity's built-in shader looks perfect because it's running in deferred rendering (i.e. it has a _PrePass function in the surface shader that lets it be handled by the deferred renderer; the shader code you're using there doesn't, so it's drawn in forward rendering). If you switch to Forward rather than Deferred, you'll see Unity's own shaders have the same issue.
     
  43. pailhead

    pailhead

    Joined:
    Oct 8, 2012
    Posts:
    68
    I'll re-read the entire thread tomorrow.

    I think it's starting to make more sense, the bit with the prepass in the shader i didn't quite understand so i'll dig more into the rendering methods.

    Thank you so much for the feedback, you rule!
     
    Last edited: Jul 11, 2013
  44. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Each shader has the option of having up to 3 different lighting functions...

    One for forward rendering - which has no suffix.
    One for deferred rendering - which has the suffix _PrePass.
    One for directional lightmaps - which has the suffix _DirLightmap.

    So the built-in Lambert shader (regular Diffuse) has lighting functions that look like this:

    Code (csharp):
    inline fixed4 LightingLambert (SurfaceOutput s, fixed3 lightDir, fixed atten)
    {
        fixed diff = max (0, dot (s.Normal, lightDir));

        fixed4 c;
        c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2);
        c.a = s.Alpha;
        return c;
    }


    inline fixed4 LightingLambert_PrePass (SurfaceOutput s, half4 light)
    {
        fixed4 c;
        c.rgb = s.Albedo * light.rgb;
        c.a = s.Alpha;
        return c;
    }

    inline half4 LightingLambert_DirLightmap (SurfaceOutput s, fixed4 color, fixed4 scale, bool surfFuncWritesNormal)
    {
        UNITY_DIRBASIS
        half3 scalePerBasisVector;

        half3 lm = DirLightmapDiffuse (unity_DirBasis, color, scale, s.Normal, surfFuncWritesNormal, scalePerBasisVector);

        return half4(lm, 0);
    }
    Unity will try to match the selected rendering method to one of these functions, falling back to forward if it can't (or to the shader's FallBack if none of them will work)...
    If you're in deferred, it'll use the _PrePass function.
    If you're in deferred but there isn't a _PrePass, it'll use the regular forward function instead and render it in forward mode, after any other deferred stuff has rendered.
    If you're in forward, it'll stick with the regular forward function.
     
  45. StaffanEk

    StaffanEk

    Joined:
    Jul 13, 2012
    Posts:
    380
    This whole thread is ridiculous. Has Unity implemented the MikkTSpace tangent basis yet? That would solve all of this nonsense.
     
  46. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    They have not.
     
  47. Noisecrime

    Noisecrime

    Joined:
    Apr 7, 2010
    Posts:
    2,054
    Can't believe UT haven't implemented MikkTSpace in Unity yet, even as an option; it would seem a no-brainer, especially now that PBR is gaining popularity and requires good normal maps and thus a correct tangent space.

    Anyway, in the meantime, those still having issues might want to take a look at Handplane. Not sure how long it's been around; I just learnt about it from a 92-page Google doc about normal mapping in games, which might also be a useful read.
     
  48. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    The xNormal plugin still works fine. You can use that or Handplane, same results.
     
  49. Lex-DRL

    Lex-DRL

    Joined:
    Oct 10, 2011
    Posts:
    140
    @Farfarer
    Just wanted to clarify one more thing.
    Is your tangent basis used for the ObjSpace <=> TangSpace NM converter? Obviously, assuming it's selected in the xN settings.
     
  50. Farfarer

    Farfarer

    Joined:
    Aug 17, 2010
    Posts:
    2,249
    Yes, it'll use whatever tangent basis you have selected for the main baker.
     