Cloud System: Simple Billboard Shader Position Issue (Matrix/MVP Woes)

Discussion in 'Shaders' started by Matthew Scott, Dec 9, 2013.

  1. Matthew Scott
    Hi all! :D I've been developing an atmosphere system and so far I've managed to create some decent-looking clouds, generated entirely procedurally using my own particle system.

    [Attached image: Cloud.jpg]
    :-o

    Originally, the generated billboards (which all share the same material/shader and procedural texture from a texture atlas, so they are batched) were oriented to face the camera CPU-side via a script, which also set the vertex colours. I really wanted a screen-space-based orientation to solve the axis-flipping problems you get when looking down over the top of the clouds while moving. Eventually, I decided to move these calculations to the GPU and let a shader take care of it.

    I started by using the ready-made Unity Particles/PreMultiply shader. Then I followed this tutorial:

    http://en.wikibooks.org/wiki/Cg_Programming/Unity/Billboards

    I began to integrate this logic into the premultiply shader, but I seem to be having some issues with the MVP transform: you can see where the particles are located, but the actual imagery is drawn elsewhere.

    [Attached image: Issue.jpg]

    Before I get to the actual code: I originally assumed this was a problem with scaling the particles before they are sent to the GPU, which was done after mesh creation, when each mesh was turned into a prefab. I removed this code and have now realized that the scale must be set in the shader, if I am not mistaken. However, this has not resolved the issue. :|

    Another theory was that it was due to the fact that each particle was set as a child of a single empty GameObject, and that this was somehow confusing the shader. So I removed the code that does this, and each particle is now simply its own object again; this, however, has still not resolved the issue. :(

    My last guess, which after much research I can only assume to be the issue, is that dynamic batching has a part to play in this. It's obviously vital that the particles remain batched, or there really is no point to the whole thing.

    Other than that, it must have something to do with "unity_Scale.w"... I don't entirely understand where that's pulling its values from. :confused:

    I feel as though I am on the right track here, but I need a helping hand. Does anybody have an (easy to understand) explanation of why this is occurring, and how I might go about resolving it? Admittedly, I still have a ways to go in shader experience, but I understand a fair amount. :rolleyes:

    I'm not too worried about the scaling itself: I know that I can either keep the original mesh scale and change it dynamically in the shader by multiplying v.vertex.x and y by a _Scale amount, or I can just scale the original mesh during its creation. I think the first option is better, if I'm honest; I'd rather not have to rebuild an array of meshes every time I want to change the scale.
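
    For the record, option one would only need a small change in the vertex function, something like this (an untested sketch, using the _Scale uniform that is already declared in the shader below; it would also need a matching entry in the Properties block):

    Code (csharp):

    // Untested sketch: expose a scale property...
    _Scale ("Billboard Scale", Float) = 1.0   // added to the Properties block

    // ...and multiply the quad-corner offset by it in vert():
    o.vertex = mul(UNITY_MATRIX_P,
                mul(UNITY_MATRIX_MV, float4(0.0, 0.0, 0.0, 1.0)) +
                float4(v.vertex.x * _Scale, v.vertex.y * _Scale, 0.0, 0.0));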

    Any help or guidance here would be greatly appreciated; I've been pulling my hair out for days battling with this. It seems crazy to me that I'm having such a big problem with such a small modification to such a simple shader! :D

    ==================================================================

    Here is the shader code, with any irrelevant parts //commented out. This all looks correct to me, so I'm a bit confused...

    Code (csharp):
    Shader "Particles/UA Cloud Shader" {
        Properties {
            _MainTex ("Particle Texture", 2D) = "white" {}
            _InvFade ("Soft Particles Factor", Range(0.01,3.0)) = 1.0
        }

        Category {
            Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
            Blend One OneMinusSrcAlpha
            ColorMask RGB
            Cull Off Lighting Off ZWrite Off Fog { Mode Off }

            SubShader {
                Pass {

                    CGPROGRAM
                    #pragma vertex vert
                    #pragma fragment frag
                    #pragma multi_compile_particles

                    #include "UnityCG.cginc"

                    sampler2D _MainTex;
                    //fixed4 _TintColor;
                    float _Scale;

                    struct appdata_t {
                        float4 vertex : POSITION;
                        fixed4 color : COLOR;
                        float2 texcoord : TEXCOORD0;
                    };

                    struct v2f {
                        float4 vertex : POSITION;
                        fixed4 color : COLOR;
                        float2 texcoord : TEXCOORD0;
                        //#ifdef SOFTPARTICLES_ON
                        //float4 projPos : TEXCOORD1;
                        //#endif
                    };

                    float4 _MainTex_ST;

                    v2f vert (appdata_t v)
                    {
                        v2f o;
                        // Billboard: take the object's origin in view space and add the
                        // vertex's x/y as a screen-aligned offset, then project.
                        o.vertex = mul(UNITY_MATRIX_P,
                                    mul(UNITY_MATRIX_MV, float4(0.0, 0.0, 0.0, 1.0)) +
                                    float4(v.vertex.x, v.vertex.y, 0.0, 0.0));
                        //#ifdef SOFTPARTICLES_ON
                        //o.projPos = ComputeScreenPos (o.vertex);
                        //COMPUTE_EYEDEPTH(o.projPos.z);
                        //#endif
                        o.color = v.color;
                        o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
                        return o;
                    }

                    //sampler2D _CameraDepthTexture;
                    //float _InvFade;

                    fixed4 frag (v2f i) : COLOR
                    {
                        //#ifdef SOFTPARTICLES_ON
                        //float sceneZ = LinearEyeDepth (UNITY_SAMPLE_DEPTH(tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(i.projPos))));
                        //float partZ = i.projPos.z;
                        //float fade = saturate (_InvFade * (sceneZ-partZ));
                        //i.color.a *= fade;
                        //#endif

                        return i.color * tex2D(_MainTex, i.texcoord) * i.color.a;
                    }
                    ENDCG
                }
            }
        }
    }
     
  2. Matthew Scott
    Well, dynamic batching is definitely the culprit here: disabling it causes the particles to be drawn in the correct position. But this can't be an impossible feat; after all, doesn't Unity's own particle system dynamically batch its particles? :confused:
     
  3. McDev02
    Well, this is about where my shader knowledge stops, but I guess the problem is as follows.

    What you do is project the object's origin into MV space (camera space?) and then offset it by the vertices' object positions. Then you project that into screen space.

    The problem with batching is that your world space basically becomes object space, because all of the planes are now just one mesh. They all use the same root but different object positions. That is why the final image has no depth, and you only see a difference when you alter the x and y position of a plane.

    I would guess the root of the batched objects is just zero, because if you put one plane at (0,0,0), that one will work. One solution would be to know the world position of the individual particle and pass it to the shader as a constant value (see the rough sketch below). But maybe this can be done more easily with some other projection magic.
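
    A very rough, untested sketch of that idea, as a drop-in replacement for vert() in the shader above (with a hypothetical _ParticleCenter property holding the particle's world position, set from a script; note that a per-particle material constant would give each particle its own material and so break the batching again, so this only illustrates the projection, not a final answer):

    Code (csharp):

    // Rough sketch (untested): billboard around a pivot passed in explicitly,
    // instead of relying on the model matrix. Assumes the quads are built in
    // the XY plane, centred on the pivot, with translation-only transforms,
    // and that UNITY_MATRIX_V is available (otherwise pass in
    // Camera.worldToCameraMatrix from a script).
    float4 _ParticleCenter;   // particle pivot in world space

    v2f vert (appdata_t v)
    {
        v2f o;
        // With batching, v.vertex already arrives in world space, so the
        // corner offset is simply the vertex position minus the pivot.
        float3 corner = v.vertex.xyz - _ParticleCenter.xyz;
        float4 pivotView = mul(UNITY_MATRIX_V, float4(_ParticleCenter.xyz, 1.0));
        o.vertex = mul(UNITY_MATRIX_P, pivotView + float4(corner.xy, 0.0, 0.0));
        o.color = v.color;
        o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
        return o;
    }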
     
  4. Matthew Scott
    That was my theory too, but it seems a bit excessive to have to constantly send the original world position of every particle to the shader; I'm sure there has to be an easier workaround than that... I have billboarding working great on the CPU at the moment, but I really would like to figure out how to get it working here.

    By the way, does anybody know whether performing vertex movements/vertex color lerps on the GPU is generally faster than doing them on the CPU?
     
  5. McDev02
    Anything like that is generally faster on the GPU; performing vertex and pixel operations is what it was made for. :)
     
  6. Matthew Scott
    Dammit Jim! We need a solution! I hate writing code when I KNOW there's a more efficient way of doing it...

    Would anybody else with some serious shader know-how care to shed some light on this situation?
     
  7. Ferb
    I'm having this problem too. It seems that when dynamic batching, Unity pre-multiplies the vertices by the model matrix before sending them to the shader, which can cause various problems if, for whatever reason, you don't want that done (in my case, because I found it easier to do the math for certain dynamically created meshes in world space). There's no way to undo it once you're in the shader: with dynamic batching in use, the model matrix is just an identity matrix, as are _Object2World and _World2Object.

    In the case above, I'd suggest uploading the position coordinates as a second texture coordinate - I don't think texture coordinates get pre-multiplied, just positions, tangents and normals.

    Sending the position as a texture coordinate worked for me, but my normals will still be getting corrupted, so I might just do the math and put the mesh back into object space, I guess. But it's still annoying to know that Unity is doing all this work every frame, which could be avoided if there were just some shader tag to switch this behaviour off while keeping dynamic batching.
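
    For the billboard case above, a minimal sketch of that kind of approach might look like the following (untested, and with some assumptions on top of the original shader: since a Unity UV set only holds two floats per vertex, the quad-corner offset is stored in the second UV set (mesh.uv2) when the mesh is built, rather than the full position; the quads are centred on the particle pivot; and the particle transforms are translation-only):

    Code (csharp):

    // Minimal sketch of the texcoord workaround. Under dynamic batching
    // v.vertex arrives in world space, so subtracting the corner offset
    // (written CPU-side into mesh.uv2) recovers the particle pivot; without
    // batching the pivot comes out as the object-space origin, so the same
    // code works in both cases.
    struct appdata_t {
        float4 vertex : POSITION;
        fixed4 color : COLOR;
        float2 texcoord : TEXCOORD0;
        float2 corner : TEXCOORD1;   // quad-corner offset, survives batching
    };

    v2f vert (appdata_t v)
    {
        v2f o;
        float4 pivot = float4(v.vertex.xyz - float3(v.corner, 0.0), 1.0);
        o.vertex = mul(UNITY_MATRIX_P,
                    mul(UNITY_MATRIX_MV, pivot) +
                    float4(v.corner.x, v.corner.y, 0.0, 0.0));
        o.color = v.color;
        o.texcoord = TRANSFORM_TEX(v.texcoord, _MainTex);
        return o;
    }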