
Need help starting out writing a geometry shader

Discussion in 'Shaders' started by 5argon, Jun 15, 2017.

  1. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    I am new to writing shaders. First, I want to confirm: is what I want achievable by writing a geometry shader? If so, I can go on and learn about it.

    First, this is what I want. The model is the white and red thing, but the grey and white outline is currently achieved by duplicating that model, scaling it up a bit, then applying a shader that renders everything in grey and white (with render order behind the main model). The current approach requires three non-batchable draw calls and three times the vertex count.



    I want to reduce the triangle count in the game, so I thought about making a special kind of shader that renders 2 passes on just that main model. The first pass would be the outline, where the mesh expands along the vertex normals. (If possible, I want to do both colors at the same time in this one pass, maybe via a threshold parameter controlling how much of the grey sits on the white. In this picture, it would be about 30%.) The second pass would just be a normal diffuse shader.

    I would like some pointers on what I need in order to achieve this. From what I understand, I have to write a "surface shader" (according to the guide) because the main part interacts with lighting? And with HLSL?
     
    Last edited: Jun 15, 2017
  2. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    A few things:
    • This doesn't need a geometry shader, just a multi-pass shader.
    • Using a geometry shader (or multi-pass shader) doesn't decrease the vertex count. The same number of vertices will be rendered either way, and a geometry shader might even increase that count.
    • Expanding by vertex normal is harder than it sounds. It works well for smooth geometry, but it'll expose holes in hard edged geometry.
    ShaderLab is essentially straight HLSL inside a larger file, with extra information on how to display properties inside the editor and other rendering state not usually present in plain HLSL or GLSL (in the early days it used Cg). If you want, you could write straight GLSL inside the .shader file by using GLSLPROGRAM / ENDGLSL instead of CGPROGRAM / ENDCG, but there's usually no reason to do this: Unity does a very good job of translating HLSL to GLSL and other shader languages, but won't convert GLSL to HLSL.
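    As a rough sketch of the multi-pass idea, something like the shader below could work: a hand-written outline pass that pushes vertices along their normals (with Cull Front so only the inflated shell's back faces show), followed by an ordinary surface shader pass for the lit main surface. Property names like _OutlineWidth are made up for illustration, and the caveat above about hard edges splitting apart still applies.

    ```shaderlab
    Shader "Custom/OutlineThenDiffuse"
    {
        Properties
        {
            _Color ("Main Color", Color) = (1,1,1,1)
            _MainTex ("Texture", 2D) = "white" {}
            _OutlineColor ("Outline Color", Color) = (1,1,1,1)
            _OutlineWidth ("Outline Width", Float) = 0.02
        }
        SubShader
        {
            Tags { "RenderType"="Opaque" }

            // Pass 1: inflate the mesh along its normals and draw a flat color.
            Pass
            {
                Cull Front
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                float4 _OutlineColor;
                float _OutlineWidth;

                struct appdata { float4 vertex : POSITION; float3 normal : NORMAL; };
                struct v2f { float4 pos : SV_POSITION; };

                v2f vert (appdata v)
                {
                    v2f o;
                    // Push the vertex outward in object space before projection.
                    v.vertex.xyz += v.normal * _OutlineWidth;
                    o.pos = UnityObjectToClipPos(v.vertex);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target { return _OutlineColor; }
                ENDCG
            }

            // Pass 2: ordinary lit shading via a generated surface shader pass.
            CGPROGRAM
            #pragma surface surf Lambert
            sampler2D _MainTex;
            fixed4 _Color;
            struct Input { float2 uv_MainTex; };
            void surf (Input IN, inout SurfaceOutput o)
            {
                o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb * _Color.rgb;
            }
            ENDCG
        }
    }
    ```

    Mixing a hand-written Pass with a surface shader block in the same SubShader is fine; Unity appends the generated passes after the explicit one.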
     
    5argon likes this.
  3. 5argon

    5argon

    Joined:
    Jun 10, 2013
    Posts:
    1,555
    Thank you. I have tried something, and indeed it is not equal to "scaling up" at all. For one, the vertices became disconnected; the shader sees each vertex separately and has no idea about the whole model, after all. If I can't find a way to emulate scaling up via shader, I will probably have to stick with my current approach.
    Screenshot 2017-06-16 02.03.04.png
     
  4. bgolus

    bgolus

    Joined:
    Dec 7, 2012
    Posts:
    12,343
    Scaling up isn't too difficult to do in a shader, but Unity's dynamic batching makes it harder: you can't guarantee the shader knows the mesh's center, because batching combines multiple meshes on the CPU and sets the "mesh center" to the world origin. Expanding by the "normals" can be solved by storing an averaged normal in the mesh tangents or an extra UV channel. There are a couple of tools on the Asset Store that do this, usually for toon outlines or displacement, both of which need to do "normal" pushing without showing seams.
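    The vertex function of the outline pass would then read the baked data instead of the true normal. A sketch, assuming an averaged normal has been baked into the tangent channel's xyz by an offline tool or editor script (the channel choice is an assumption, not from the thread):

    ```hlsl
    struct appdata
    {
        float4 vertex  : POSITION;
        float4 tangent : TANGENT; // xyz = averaged normal, baked offline
    };

    v2f vert (appdata v)
    {
        v2f o;
        // Push along the averaged normal instead of the face-split normal,
        // so hard-edged geometry stays sealed when the shell is inflated.
        v.vertex.xyz += normalize(v.tangent.xyz) * _OutlineWidth;
        o.pos = UnityObjectToClipPos(v.vertex);
        return o;
    }
    ```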

    However, I would suggest going in a completely different direction and building your objects as a single mesh with the outlines built into the mesh. You could either use multiple materials on the mesh, or a single shader with information stored in the UVs or vertex colors that changes how it behaves; that would let you reduce the game object count and draw calls. Unless you're rendering tens of thousands of these objects, I expect you aren't hitting vertex count limits even on old Android hardware (though I don't know what the rest of your scene looks like). For example, first-generation Gear VR titles had an approximate limit of somewhere around 200k vertices, and that's to keep a constant 60 fps on a mobile phone.
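    As one way the vertex-color approach could look: mark the outline vertices in a color channel when authoring the mesh, then branch on it in the fragment shader. A minimal sketch, assuming the red channel holds 1 for outline vertices and 0 for the main surface (that assignment is my invention, not from the thread):

    ```hlsl
    fixed4 frag (v2f i) : SV_Target
    {
        fixed4 mainCol = tex2D(_MainTex, i.uv) * _Color;
        // i.color is the interpolated vertex color passed through from vert.
        return lerp(mainCol, _OutlineColor, i.color.r);
    }
    ```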

    The other benefit of using a custom mesh for the outlines is that you could use smoothed normals, and then the "push" would work if you ever need dynamic outline widths!
     
    5argon likes this.