
Appdata's v.normal is changing?

Discussion in 'Shaders' started by crushy, Jul 25, 2014.

  1. crushy

    crushy

    Joined:
    Jan 31, 2012
    Posts:
    35
    I'm using appdata's v.normal to dynamically compute UVs, but in some circumstances (namely when zooming out very far) it seems to get turned into a world-space normal. Am I missing something? I can't even figure out what could be causing this.

    In code terms, this:
    Code (CSharp):
    void vert (inout appdata_tan v, out Input o) {
      o.vertex = v.vertex;
      o.customNormal = v.normal;
    }
    turns into this:
    Code (CSharp):
    o.customNormal = mul(_Object2World, float4(v.normal, 0.0)).xyz;

    Here's a video of the problem in action. When the mouse leaves the video, I'm setting the shader to use world normals by hand.

    Things I've considered:
    • I don't have a fallback shader so it shouldn't be an LOD issue.
    • Unity's lighting doing funky things with normals? This issue doesn't seem to affect fragment shaders...
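    In case the full context helps, here's roughly what my setup looks like, as a minimal sketch rather than my exact shader; the shader name, the _MainTex property, and the axis-based UV pick in surf are placeholders just to make it self-contained:
    Code (CSharp):
    Shader "Custom/ObjectSpaceUVs" {
        Properties {
            _MainTex ("Texture", 2D) = "white" {}
        }
        SubShader {
            Tags { "RenderType" = "Opaque" }
            CGPROGRAM
            #pragma surface surf Lambert vertex:vert

            sampler2D _MainTex;

            struct Input {
                float3 vertex;       // object-space position passed from vert
                float3 customNormal; // object-space normal passed from vert
            };

            void vert (inout appdata_tan v, out Input o) {
                UNITY_INITIALIZE_OUTPUT(Input, o);
                o.vertex = v.vertex.xyz;
                o.customNormal = v.normal;
            }

            void surf (Input IN, inout SurfaceOutput o) {
                // Placeholder: pick a UV plane from the object-space position
                // based on the dominant axis of the object-space normal.
                float3 n = abs(IN.customNormal);
                float2 uv;
                if (n.x >= n.y && n.x >= n.z)  uv = IN.vertex.yz;
                else if (n.y >= n.z)           uv = IN.vertex.xz;
                else                           uv = IN.vertex.xy;
                o.Albedo = tex2D(_MainTex, uv).rgb;
            }
            ENDCG
        }
    }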
     
  2. crushy

    crushy

    Joined:
    Jan 31, 2012
    Posts:
    35
  3. mouurusai

    mouurusai

    Joined:
    Dec 2, 2011
    Posts:
    350
    Could it be that the object becomes part of dynamic batching?
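    If that's what's going on, one thing to try, assuming your Unity version supports the tag, is opting the shader out of batching with the DisableBatching SubShader tag, so object-space data is left alone. Just a sketch of where it goes; the rest of the shader stays as it is:
    Code (CSharp):
    SubShader {
        // "DisableBatching" = "True" asks Unity not to dynamically batch objects
        // drawn with this SubShader, so object-space positions/normals are preserved.
        Tags { "RenderType" = "Opaque" "DisableBatching" = "True" }
        // CGPROGRAM block unchanged
    }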
     
  4. crushy

    crushy

    Joined:
    Jan 31, 2012
    Posts:
    35
    That could be the case; duplicating materials and/or disabling nearby objects seems to work. Thanks! :)

    I assume they get messed up because Unity reuses the _Object2World matrix when batching is done... Are there any alternatives to this for determining the world normal (preferably ones that wouldn't suffer from the same problem)?
    Code (CSharp):
    o.customNormal = mul((float3x3)_Object2World, v.normal);
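    One alternative I've seen, though I don't think it sidesteps the batching issue (batched vertex data is already pre-transformed), is going through the inverse transpose instead of _Object2World, which also keeps normals correct under non-uniform scale; a rough sketch:
    Code (CSharp):
    // mul(rowVector, M) is the same as mul(transpose(M), columnVector), so this
    // applies the inverse transpose of the model matrix, which keeps normals
    // correct under non-uniform scale. The normalize() is just for safety.
    o.customNormal = normalize(mul(v.normal, (float3x3)_World2Object));

    // Newer versions of UnityCG.cginc also ship a helper that does the same:
    // o.customNormal = UnityObjectToWorldNormal(v.normal);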