Please help me with my transformations

Discussion in 'Shaders' started by Alex_curiscope, Aug 9, 2017.

  1. Alex_curiscope

    Alex_curiscope

    Joined:
    Apr 4, 2017
    Posts:
    55
    Hello!

    I have a compute shader that computes a volumetric fog volume (following this idea).

    I pass in the inverse of the view-projection matrix from the script, then compute the clip-space position of each cell, multiply it by that matrix, and divide by w. It's not coming out right and I'm not sure where I'm going wrong. Can anyone help?

    Here's the relevant bit of the script:
    Code (CSharp):
    Matrix4x4 invProj = (Camera.main.projectionMatrix * Camera.main.worldToCameraMatrix).inverse;
    shader.SetVector("m0", invProj.GetRow(0));
    shader.SetVector("m1", invProj.GetRow(1));
    shader.SetVector("m2", invProj.GetRow(2));
    shader.SetVector("m3", invProj.GetRow(3));
    shader.Dispatch(kernelIndex, 128 / 4, 128 / 4, 128 / 4);
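    (As an aside, on Unity versions that expose ComputeShader.SetMatrix, the four SetVector calls can be collapsed into one. A minimal sketch, assuming the shader declares a matching float4x4 uniform named _InvVP, which is not a name from the code above:)

    ```csharp
    // Sketch: send the whole inverse view-projection matrix in one call.
    // Assumes the compute shader declares: float4x4 _InvVP;
    Matrix4x4 invVP = (Camera.main.projectionMatrix * Camera.main.worldToCameraMatrix).inverse;
    shader.SetMatrix("_InvVP", invVP);
    ```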
    Here's the relevant bit of the compute shader:
    Code (csharp):
    float4 m0;
    float4 m1;
    float4 m2;
    float4 m3;

    [numthreads(4,4,4)]
    void CSMain (uint3 id : SV_DispatchThreadID)
    {
        float texZSize = 128;
        float texXSize = 128;
        float texYSize = 128;

        float4x4 _ViewProjectInverse;
        _ViewProjectInverse[0] = m0;
        _ViewProjectInverse[1] = m1;
        _ViewProjectInverse[2] = m2;
        _ViewProjectInverse[3] = m3;

        // get the cell position in normalised device coordinates
        float3 cellPosition = float3(
            (((id.x + 0.5) / texXSize) - 0.5) * 2,
            (((id.y + 0.5) / texYSize) - 0.5) * 2,
            (id.z + 0.5) / texZSize
            );

        // compute world space position of cell centre from clip pos
        float4 wp = mul(_ViewProjectInverse, float4(cellPosition, 1.0));
        wp /= wp.w;
     
    Last edited: Aug 9, 2017
  2. jvo3dc

    jvo3dc

    Joined:
    Oct 11, 2013
    Posts:
    1,520
    Yeah, that doesn't work. Model and view matrices can simply be inverted and used, but the projection matrix is a bit different because of the perspective divide.

    If you have the distance to the camera, it's fairly easy to create a ray from the camera with that distance in world space and use it to determine the world-space position. That is how I typically do it, though in a fragment shader rather than a compute shader.

    For the ray directions, it's best to use a script to send the directions to the corners of the view frustum at a fixed z offset of 1 metre, for example. You can then interpolate those and normalize them.
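    (A minimal sketch of the script side of this, with assumed uniform names that aren't from the thread: build vectors from the camera to the four frustum corners at 1 m depth and hand them to the shader, which then bilinearly interpolates per cell, normalizes, and offsets from the camera position.)

    ```csharp
    // Sketch: world-space vectors to the frustum corners at 1 m depth.
    // Uniform names (_CamPos, _RayBL, ...) are illustrative assumptions.
    Camera cam = Camera.main;
    Vector3 origin = cam.transform.position;
    Vector3 bl = cam.ViewportToWorldPoint(new Vector3(0, 0, 1f)) - origin;
    Vector3 br = cam.ViewportToWorldPoint(new Vector3(1, 0, 1f)) - origin;
    Vector3 tl = cam.ViewportToWorldPoint(new Vector3(0, 1, 1f)) - origin;
    Vector3 tr = cam.ViewportToWorldPoint(new Vector3(1, 1, 1f)) - origin;
    shader.SetVector("_CamPos", origin);
    shader.SetVector("_RayBL", bl); // shader side: lerp these by the cell's
    shader.SetVector("_RayBR", br); // (u, v), normalize the result, then
    shader.SetVector("_RayTL", tl); // worldPos = _CamPos + dir * distance
    shader.SetVector("_RayTR", tr);
    ```

    Interpolating the unnormalized corner vectors and only normalizing afterwards keeps the interpolation linear across the frustum face.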
     
    Last edited: Aug 15, 2017
  3. Alex_curiscope

    Alex_curiscope

    Joined:
    Apr 4, 2017
    Posts:
    55
    Awesome, thanks. I'll try it that way. I'm doing pretty much that for an image effect anyway.
     
  4. Alex_curiscope

    Alex_curiscope

    Joined:
    Apr 4, 2017
    Posts:
    55
    This worked like a charm, btw. It's even easy to support single-pass stereo VR if you use two frustums; in non-stereo modes, just send the centre of each frustum edge for the inner frustum corners.

    One tricky part that stumped me for a bit was that ComputeShader.SetFloats expects your float arrays padded out to float4: each array element in a constant buffer is padded to 16 bytes. If you declare float3 vecArray[4]; in your compute shader, you'll get bad data if you send it tightly packed xyzxyzxyzxyz data. Just use float4s on both sides and you're good.
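    (A minimal sketch of the padding gotcha above, with assumed names: pack each Vector3 into a 4-float stride before calling SetFloats, matching a float4 vecArray[4]; declaration on the HLSL side.)

    ```csharp
    // Sketch: pack Vector3 data with a float4 stride for SetFloats.
    // "vecArray" and "dirs" are illustrative names, not from the thread.
    Vector3[] dirs = new Vector3[4];
    float[] packed = new float[4 * 4];   // 4 floats per element, not 3
    for (int i = 0; i < dirs.Length; i++)
    {
        packed[i * 4 + 0] = dirs[i].x;
        packed[i * 4 + 1] = dirs[i].y;
        packed[i * 4 + 2] = dirs[i].z;
        packed[i * 4 + 3] = 0f;          // padding to a 16-byte stride
    }
    shader.SetFloats("vecArray", packed); // HLSL side: float4 vecArray[4];
    ```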