I'm using appdata's v.normal to dynamically compute UVs, but in some circumstances (namely zooming out very far) it seems to get turned into a world-space normal. Am I missing something? I can't even figure out what could be causing this. In code terms, this:

Code (CSharp):
void vert (inout appdata_tan v, out Input o) {
    o.vertex = v.vertex;
    o.customNormal = v.normal;
}

turns into this:

Code (CSharp):
o.customNormal = mul(_Object2World, float4(v.normal, 0.0)).xyz;

Here's a video of the problem in action. When the mouse leaves the video, I'm setting the shader to use world normals by hand.

Things I've considered:
- I don't have a fallback shader, so it shouldn't be an LOD issue.
- Unity's lighting doing funky things with normals?
- The issue doesn't seem to affect fragment shaders...
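For anyone trying to reproduce this, a minimal surface shader along these lines should show the same setup (the shader name and the way the normal is used in surf are my guesses at the scenario, not the actual project code):

Code (CSharp):
Shader "Custom/ObjectSpaceNormalRepro" {
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf Lambert vertex:vert

        struct Input {
            float3 customNormal;
            float3 vertex;
        };

        void vert (inout appdata_tan v, out Input o) {
            UNITY_INITIALIZE_OUTPUT(Input, o);
            o.vertex = v.vertex.xyz;
            // Expected: an object-space normal. Under the conditions
            // described above it can arrive already in world space.
            o.customNormal = v.normal;
        }

        void surf (Input IN, inout SurfaceOutput o) {
            // Visualize the normal so the flip is easy to spot.
            o.Albedo = abs(IN.customNormal);
        }
        ENDCG
    }
}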
Update: narrowed it down. I'm thinking of submitting a bug report; can anyone confirm this? http://forum.unity3d.com/threads/shaderlab-v-normals-bug.259058/
That could be the case: duplicating materials and/or disabling nearby objects seems to work around it. Thanks! I assume the normals get messed up because Unity reuses the _Object2World matrix when batching is done... Are there any alternatives for determining the world normal (preferably ones that wouldn't suffer from the same problem)?

Code (CSharp):
o.customNormal = mul((float3x3)_Object2World, v.normal);
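If batching really is the culprit, one sketch of a workaround (assuming a Unity version that supports the "DisableBatching" SubShader tag, which arrived around Unity 5) is to opt the shader out of batching so v.normal stays in object space, and to transform normals by the transpose of the inverse model matrix so non-uniform scale is handled correctly. Shader and variable names here are illustrative:

Code (CSharp):
Shader "Custom/UnbatchedWorldNormal" {
    SubShader {
        // "DisableBatching" keeps this shader out of dynamic batching,
        // so vertices/normals are not pre-transformed to world space.
        Tags { "RenderType" = "Opaque" "DisableBatching" = "True" }

        CGPROGRAM
        #pragma surface surf Lambert vertex:vert

        struct Input {
            float3 customNormal;
        };

        void vert (inout appdata_tan v, out Input o) {
            UNITY_INITIALIZE_OUTPUT(Input, o);
            // mul with the vector on the left multiplies by the
            // transposed matrix, i.e. transpose(_World2Object) here,
            // which is the inverse-transpose of _Object2World.
            // Unity 5's UnityCG.cginc wraps this same math as
            // UnityObjectToWorldNormal(v.normal).
            o.customNormal = normalize(mul(v.normal, (float3x3)_World2Object));
        }

        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = abs(IN.customNormal);
        }
        ENDCG
    }
}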