Why are my geometry shader output positions not rotated by object rotation?

Hello,

I’ve been struggling for the last couple of days with my geometry shader. It’s meant to take a vertex position, and from that produce a square facing a particular direction in object space. It works great until I rotate my game objects. The squares always have the same orientation in world space when I’d like them to rotate with my object.

The vertex and geometry programs as well as their inputs are shown below.
Can anyone spot the problem?
Thanks.

/* --------------- */
/* Data structures */
/* --------------- */
struct VSData
{
	half4 Pos	    : POSITION;
	half2 UVStart	: TEXCOORD0; 
	half2 UVLen	  : TEXCOORD1;
	half2 TileLen	: TEXCOORD2; 
	float2 SizeDir   : TEXCOORD3; 
};

struct GSData
{
	half4 Pos   : POSITION;
	half4 UV    : TEXCOORD0;
	half4 Face  : TEXCOORD1;
};

// Geometry shader output, consumed by the fragment shader.
struct FSData
{
	half4 Pos   : SV_POSITION;
	half4 UV    : TEXCOORD0;
	half4 Len   : TEXCOORD1;
};

/* ------------- */
/* Vertex Shader */
/* ------------- */
GSData VS_Main(VSData v)
{
	GSData output = (GSData)0;

	output.Pos = v.Pos; // <- No Transform. Transform applied in geometry shader.
	output.UV = half4(v.UVStart.xy, v.UVLen.xy);
	output.Face = half4(v.TileLen.xy, v.SizeDir.xy);

	return output;
}

/* --------------- */
/* Geometry Shader */
/* --------------- */
[maxvertexcount(4)]
void GS_Main(point GSData p[1], inout TriangleStream<FSData> triStream)
{
	half4 pos = p[0].Pos; // <- this is the position passed in by the vertex shader.
	float halfSize = 2.0;
	half3 right = half3(1,0,0);
	half3 up = half3(0,1,0);

	// define the corner positions of this square
	half4 v[4];
	v[0] = half4(pos.xyz - (halfSize * right) - (halfSize * up), 1.0f); // 1 left bottom
	v[1] = half4(pos.xyz - (halfSize * right) + (halfSize * up), 1.0f); // 2 left top
	v[2] = half4(pos.xyz + (halfSize * right) - (halfSize * up), 1.0f); // 3 right bottom
	v[3] = half4(pos.xyz + (halfSize * right) + (halfSize * up), 1.0f); // 4 right top

	// create the vertices, passing in everything the fragment shader needs to calculate the fragment UV position
	FSData o;
	o.Pos = mul(UNITY_MATRIX_MVP, v[0]); // left bottom <- THIS POSITION IS NOT ROTATED WITH OBJECT. WHY?
	o.UV  = p[0].UV;						
	o.Len = half4(p[0].Face.xy, 0, 0);	
	triStream.Append(o);

	o.Pos = mul(UNITY_MATRIX_MVP, v[1]); // left top <- THIS POSITION IS NOT ROTATED WITH OBJECT. WHY?
	o.UV  = p[0].UV;
	o.Len = half4(p[0].Face.xy, 0, 1);
	triStream.Append(o);

	o.Pos = mul(UNITY_MATRIX_MVP, v[2]); // right bottom <- THIS POSITION IS NOT ROTATED WITH OBJECT. WHY?
	o.UV  = p[0].UV;					
	o.Len = half4(p[0].Face.xy, 1, 0);
	triStream.Append(o);

	o.Pos = mul(UNITY_MATRIX_MVP, v[3]); // right top <- THIS POSITION IS NOT ROTATED WITH OBJECT. WHY?
	o.UV  = p[0].UV;
	o.Len = half4(p[0].Face.xy, 1, 1);
	triStream.Append(o);
}

It turns out the problem is related to batching.
The shader works as expected when the object is the only one being drawn and no batching occurs, but fails as soon as the object is batched.

My first thought was that batching affects the projection matrix (part of UNITY_MATRIX_MVP), but what is really affected is the concept of object space. When objects are batched, their vertices are pre-transformed into one shared space, so the object-to-world part of UNITY_MATRIX_MVP no longer carries each object’s individual rotation. The corner offsets added in object space by the geometry shader are therefore never rotated.

Anyway, batching is the issue!
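
If the shader genuinely needs per-object space, one workaround is to opt the shader out of batching with the DisableBatching SubShader tag. A sketch (the shader name is hypothetical and the pass body, containing the programs above, is elided):

```shaderlab
Shader "Custom/PointToQuad" // hypothetical name
{
    SubShader
    {
        // Keep object space intact: with batching disabled,
        // UNITY_MATRIX_MVP contains this object's own rotation again.
        Tags { "DisableBatching" = "True" }

        // ... passes with the vertex/geometry/fragment programs above ...
    }
}
```

This trades the draw-call savings of batching for a correct object-space transform on those objects.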