
Direct3D Rasterizer Errors in Unity?

Discussion in 'Shaders' started by drudiverse, Aug 18, 2014.

  1. drudiverse

    drudiverse

    Joined:
    May 16, 2013
    Posts:
    218
    Edit: I figured it out. Graphics hardware can't handle angles involving irrational numbers exactly. Sorry, Unity. Solution given at the end.

    I have fragments missing at triangle seams, even though graphics card specifications guarantee total accuracy of the rasterization pipeline.

    Graphics cards are mathematically coarse and they sometimes round pixels inaccurately at seams. However, it says online that rasterization is totally error-free on today's graphics cards: if triangles share the same edges, the Direct3D implementation guarantees that there are no pixel errors, through the "rasterization rules" specified on MSDN.

    Direct3D uses a top-left rasterization rule. Scanlines sweep across the screen to determine which pixels fall inside a triangle, and when a pixel centre lands exactly on a triangle edge, the pipeline applies an extra tie-break test so that the shared edge can never produce zero or two pixels; exactly one of the two triangles draws it (sketched below).
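    Here's roughly what that tie-break looks like, as I understand it. This is just an illustrative sketch, not an actual Unity or D3D call, and it assumes clockwise winding (the D3D default front face) with screen-space y increasing downward; the sign tests flip with other conventions.

    Code (CSharp):
    using UnityEngine;

    public static class TopLeftRuleSketch
    {
        // A pixel centre lying exactly on a triangle edge is drawn only if that
        // edge is a "top" edge or a "left" edge. The neighbouring triangle walks
        // the same edge in the opposite direction, so exactly one of the two
        // triangles owns the pixel - never zero, never two.
        public static bool OwnsPixelOnEdge(Vector2 edgeStart, Vector2 edgeEnd)
        {
            Vector2 d = edgeEnd - edgeStart;
            bool isTop  = d.y == 0f && d.x > 0f;  // exactly horizontal, interior below it on screen
            bool isLeft = d.y < 0f;               // edge runs "up" the screen
            return isTop || isLeft;
        }
    }

    The catch is that both triangles have to hand the rasterizer bit-identical edge endpoints for this guarantee to hold, which is where the real problem turns out to be.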

    Antialiasing can partially hide the seam problem in Unity, but professional games don't rely on antialiasing to correct pixel errors at mesh seams; the error pixels shouldn't happen in the first place.

    The question is: How can Unity produce pixel rendering errors at seams if it has the same rasterization as other game engines using D3D?

    The mesh has shared vertices that cannot feasibly have wrong/different values: I rounded them to be the same using Mathf.Round, and they come from the same value through the same maths function. I am told that I should print their bit patterns to be more certain it really is the same vertex (a check like the sketch below); that seems crazy to me! The DirectX rasterizer is flawed, how is that possible?
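    For reference, here is what such a bit-level check could look like; a minimal sketch, and the helper names are mine, not a built-in API.

    Code (CSharp):
    using UnityEngine;

    public static class VertexBitCheck
    {
        // Two floats that print the same value can still differ in their last
        // bits, and the watertight rasterization guarantee only holds when the
        // shared vertex positions are exactly equal. Dumping the raw 32-bit
        // patterns removes any doubt.
        public static int FloatBits(float f)
        {
            return System.BitConverter.ToInt32(System.BitConverter.GetBytes(f), 0);
        }

        public static void CompareSharedVertices(Vector3 a, Vector3 b)
        {
            Debug.Log(string.Format(
                "x: {0:X8} / {1:X8}   y: {2:X8} / {3:X8}   z: {4:X8} / {5:X8}",
                FloatBits(a.x), FloatBits(b.x),
                FloatBits(a.y), FloatBits(b.y),
                FloatBits(a.z), FloatBits(b.z)));
        }
    }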

    http://fgiesen.wordpress.com/2011/07/06/a-trip-through-the-graphics-pipeline-2011-part-6/
    Untitled.jpg
     
    Last edited: Aug 19, 2014
  2. drudiverse

    drudiverse

    Joined:
    May 16, 2013
    Posts:
    218
    Yeah, right, I think I have figured this out. There are some forum discussions and rasterizer guidelines that talk about the possibility of graphics errors, and none of them mention that graphics cards can't technically rasterize properly for distances further than about 10 units of world space... at 10 world-space units you get maybe 5 pixels missing every 100 frames, at 50k it's 30 pixels every 10 frames, and at 99k the seams are relatively see-through.

    So where is the error coming from? I think Unity's vertex values are more precise than what the graphics card ends up working with, so in all probability the card is giving back bit-crushed, wobbly versions of our polygon edges. On top of that, the MSDN rasterizer rules (unless there are obscure configuration options to change them) tend to draw the minimum number of pixels; if there were a lenient margin of, say, 10% extra pixels along triangle edges, the seams wouldn't appear.

    Also, the triangles get transformed to every possible angle relative to the camera, so the vertex positions constantly end up at arbitrary fractional values that a 32-bit float can only approximate.

    I don't think single-float precision can even render a seam-free cube at 45 degrees at 50,000 :p world-space units. It's a limitation that should be better known online, but it isn't mentioned; I was believing all that about graphics cards guaranteeing this and that... nonsense :D
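    To put rough numbers on that, here's a tiny sketch (the helper is mine, not an existing API) that measures the gap between a float and the next representable float at a given magnitude:

    Code (CSharp):
    public static class FloatGapSketch
    {
        // Around 10 world units the gap between adjacent 32-bit floats is about
        // 0.000001, but around 50,000 it has grown to roughly 0.004 and around
        // 99,000 to roughly 0.008 - easily enough for transformed edge positions
        // that "should" match to land on slightly different pixels.
        public static float GapToNextFloat(float value)
        {
            int bits = System.BitConverter.ToInt32(System.BitConverter.GetBytes(value), 0);
            float next = System.BitConverter.ToSingle(System.BitConverter.GetBytes(bits + 1), 0);
            return next - value;
        }
    }

    For example, GapToNextFloat(10f) is about 0.00000095, while GapToNextFloat(50000f) is about 0.0039 and GapToNextFloat(99000f) about 0.0078.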

    So the only solution is extra geometry: putting a DX11 tessellated mesh under the map, and tessellating the map itself.

    Sorry about writing a long post claiming that graphics cards guarantee no raster errors; I read it online and it's erroneous :)))) morse code.jpg
     
    Last edited: Aug 20, 2014
  3. drudiverse

    drudiverse

    Joined:
    May 16, 2013
    Posts:
    218
    Did anyone understand the MSDN rasterization rules? Perhaps there are different options for controlling raster interpolation, defined by Microsoft, that are available to game makers?

    The errors are all a single pixel wide, so if graphics card makers or driver makers can expose settings for this, and if it's possible for game designers to pick a different subset of rasterization rules in D3D (if any exist), what we'd want is a more lenient fill rule, where an entire extra pixel is sometimes drawn along triangle edges; then there wouldn't be any visible pixel rounding errors.
     


    Last edited: Aug 20, 2014