What happens to normals and tangents between Unity and the shader?

I’m packing three floats into one as per http://forum.unity3d.com/threads/can-i-send-data-as-w-via-vertex-data.114111/#post-769621.
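For reference, the scheme from that thread quantizes three [-1, 1] components to 8 bits each and concatenates them as base-256 "digits" of a fraction. A minimal Python sketch of the pack side and its matching unpack (the function names are mine; the unpack mirrors the shader's `UnPackVector3` below, and Python doubles gloss over float32 precision, though three bytes do fit in a float32 mantissa):

```python
import math

def pack_vector3(x, y, z):
    """Pack three components in [-1, 1] into one float,
    quantizing each to 8 bits (the scheme from the linked thread)."""
    # remap each component from [-1, 1] to an integer 0..255
    qx = int((x * 0.5 + 0.5) * 255)
    qy = int((y * 0.5 + 0.5) * 255)
    qz = int((z * 0.5 + 0.5) * 255)
    # lay the three bytes out as successive base-256 digits of a fraction
    return (qx * 65536 + qy * 256 + qz) / 16777216.0

def frac(v):
    return v - math.floor(v)

def unpack_vector3(src):
    """Mirror of the shader's UnPackVector3:
    frac(float3(1, 256, 65536) * src) * 2 - 1"""
    return tuple(frac(m * src) * 2.0 - 1.0 for m in (1.0, 256.0, 65536.0))

packed = pack_vector3(0.5, -0.25, 1.0)
x, y, z = unpack_vector3(packed)
# each component round-trips to within roughly 1/128 of the original
```

Multiplying by 256 or 65536 shifts the wanted byte into the integer part's neighbour position, and `frac` discards everything above it, which is why any upstream change to the float's low-order bits scrambles the result.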

If I store the resulting float in normal.x or tangent.x, the data comes out screwy. I suspect some kind of bit-shifting issue, since the result clearly contains something derived from my original data, but I can't tell exactly what's happening. If I put it in uv.x or tangent.w instead, for example, it works great.

I already turned off auto-normalisation of normals.

Something must be happening to the values in normal and tangent on their way to the shader that’s affecting the maths of unpacking the floats. Anyone know what it might be?

Here’s my shader:

```
Shader "Starboretum/Flora2" {

	Properties {
	}
	SubShader {
		Tags {
			"Queue" = "Geometry"
			"RenderType" = "Opaque"
		}

		Cull Off
		CGPROGRAM

		#pragma surface surf NoLighting vertex:vert addshadow
		#pragma glsl_no_auto_normalization
		#include "UnityCG.cginc"

		float _IterationProgress;

		struct appdata {
			float4 vertex : POSITION;
			float3 normal : NORMAL;
			float4 tangent : TANGENT;
			float4 texcoord : TEXCOORD0;
			float4 texcoord1 : TEXCOORD1;
			fixed4 color : COLOR;
		};

		struct Input
		{
			float4 color : COLOR;
		};

		// pass-through "lighting": just emit the albedo unmodified
		fixed4 LightingNoLighting(SurfaceOutput s, fixed3 lightDir, fixed atten)
		{
			fixed4 c;
			c.rgb = s.Albedo;
			c.a = s.Alpha;
			return c;
		}

		void surf (Input IN, inout SurfaceOutput o)
		{
			o.Albedo = (half3)IN.color.rgb;
		}

		// unpack three 8-bit components from one float: frac() isolates
		// successive byte-sized chunks, each then remapped from [0,1] to [-1,1]
		float3 UnPackVector3(float src)
		{
			return (frac(float3(1.0f, 256.0f, 65536.0f) * src) * 2) - 1;
		}

		void vert (inout appdata_full v)
		{
			// get the world-space vertex position
			float4 position = mul(_Object2World, v.vertex);

			// obtain the direction of this segment
			float3 dir = UnPackVector3(v.texcoord1.x);

			float3 camDir = _WorldSpaceCameraPos - position.xyz;

			// widen the stem from prev width -> width
			float width = lerp(v.texcoord.y, v.texcoord1.y, _IterationProgress);

			// get a vector perpendicular to the direction to the camera
			float3 left = normalize(cross(dir, camDir));

			left *= v.texcoord.x; // reverse it if necessary

			// expand the line segment by the width
			v.vertex.xyz += left * width;

			v.color = float4(lerp(v.color.rgb, v.tangent.xyz, _IterationProgress), 1);
		}

		ENDCG
	}

	FallBack "VertexLit"
}
```

Apparently this is due to dynamic batching, though the actual reason the data gets mangled remains unclear.

http://forum.unity3d.com/threads/passing-extra-vertex-data-to-a-shader.190448/

I have found before that when objects are dynamically batched, their vertices all get multiplied by their individual model matrices before being sent to the shader, so that the shader can treat them as a single model (with the model matrix set to the identity). That naturally mangles the position, normal and tangent data, but texture coordinates are left untouched. So you can either use UV channels to send your packed data, or do the dynamic batching manually for anything affected by this shader (i.e. write a script that combines all the affected meshes into a single mesh yourself).
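To illustrate why that transform destroys packed data: the unpack step multiplies the float by 256 and 65536 and keeps only the fractional part, so any multiplication applied to the "normal" during batching shifts the low-order bits and scrambles the decoded bytes. A self-contained Python sketch (pack/unpack mirror the shader's scheme; the 0.9 factor is a stand-in for whatever scale the baked model matrix applies):

```python
import math

def frac(v):
    return v - math.floor(v)

def pack(x, y, z):
    # quantize each [-1, 1] component to 8 bits and concatenate the bytes
    q = [int((c * 0.5 + 0.5) * 255) for c in (x, y, z)]
    return (q[0] * 65536 + q[1] * 256 + q[2]) / 16777216.0

def unpack(src):
    # mirror of the shader's UnPackVector3
    return tuple(frac(m * src) * 2.0 - 1.0 for m in (1.0, 256.0, 65536.0))

packed = pack(0.5, -0.25, 0.75)
ok = unpack(packed)             # round-trips to roughly (0.5, -0.25, 0.75)
mangled = unpack(packed * 0.9)  # batching-style transform applied first: garbage out
```

The first component survives a uniform scale roughly proportionally, but the second and third, which live in the lower bits, come out completely unrelated to the inputs, which matches the "contains something relating to my original data, but not sure what" symptom.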