Geometry shader is drawing triangles at the wrong vertex positions

Hi,

After learning about shaders over the past couple of weeks, I wanted to try something on my own, so I tried to draw one face of a cube using a compute shader and a geometry shader.

The compute shader generates 16 points (2x2 thread groups of 2x2 threads each), equally spaced 1 unit apart, with the bottom-left vertex at the origin. Below is the result without the geometry shader.

When I use the geometry shader to generate a triangle at each point, it shifts the vertices slightly along the x and y axes, as you can see below. There is a gap between the rows of triangles.

78259-trngs-1.png

Here is my shader (vertex / geometry / fragment) code, followed by my compute shader.

Shader "Custom/CubeShader"
{
	SubShader
	{
		Pass
	{
		CGPROGRAM
		#pragma target 5.0
		#pragma vertex vert
		#pragma geometry GS_Main
		#pragma fragment frag
		#include "UnityCG.cginc"
		StructuredBuffer<float3> square_points;
		struct ps_input {
			float4 pos : POSITION;
		};
		struct gs_input {
			float4 pos : POSITION;
		};
		ps_input vert(uint id : SV_VertexID)
		{
			ps_input o;
			float3 worldPos = square_points[id];
			o.pos = mul(UNITY_MATRIX_MVP, float4(worldPos,1.0f));
			return o;
		}
		[maxvertexcount(3)]
		void GS_Main(point gs_input p[1], inout TriangleStream<gs_input> triStream)
		{
			float4 v[3];
			v[0] = float4(p[0].pos.x, p[0].pos.y, p[0].pos.z, 1);
			v[1] = float4(p[0].pos.x + 1.0f, p[0].pos.y + 0.0f, p[0].pos.z + 0.0f, 1);
			v[2] = float4(p[0].pos.x + 1.0f, p[0].pos.y + 1.0f, p[0].pos.z + 0.0f, 1);
			float4x4 vp = mul(UNITY_MATRIX_VP, _World2Object);
			gs_input pIn;
			pIn.pos = mul(vp, v[2]);
			triStream.Append(pIn);
			pIn.pos = mul(vp, v[1]);
			triStream.Append(pIn);
			pIn.pos = mul(vp, v[0]);
			triStream.Append(pIn);
		}

		float4 frag(ps_input i) : COLOR
		{
			return float4(1,0,0,1);
		}
		ENDCG
		}
	}
	Fallback Off
}

#pragma kernel CSMain
#define thread_group_size_x 2
#define thread_group_size_y 2
#define thread_group_size_z 1
#define group_size_x 2
#define group_size_y 2
#define group_size_z 1
struct PositionStruct
{
	float3 pos;
};
RWStructuredBuffer<PositionStruct> output;

// I know this function is unnecessary. But only for now :) I would like to add more things here.
float3 GetPosition(float3 p, int idx)
{
	return p;
}
[numthreads(thread_group_size_x, thread_group_size_y, thread_group_size_z)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
	// flatten the 3D dispatch thread ID into a linear index into the output buffer
	int idx = id.x + (id.y * thread_group_size_x * group_size_x) +
		(id.z * thread_group_size_x * group_size_y * group_size_z);
	float3 pos = float3(id.x, id.y, id.z);
	pos = GetPosition(pos, idx);
	output[idx].pos = pos;
}

Please help me debug it. Thanks in advance. :)

I'm actually surprised that you can see any triangles ^^. The problem is most likely that you use the MVP matrix in your vertex shader. This transforms the incoming vertices directly from local space to screen space. However, inside your geometry shader you use _World2Object, which of course won't work, since the incoming positions are in screen space and not world space.
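These are the two lines from your shader that contradict each other:

// vertex shader: already outputs clip / screen space positions
o.pos = mul(UNITY_MATRIX_MVP, float4(worldPos, 1.0f));

// geometry shader: but this expects the incoming positions to still be in world space
float4x4 vp = mul(UNITY_MATRIX_VP, _World2Object);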

You might want to only use “_Object2World” inside your vertex shader.
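So roughly something like this (untested, just a sketch of the idea):

// vertex shader: local -> world
o.pos = mul(_Object2World, float4(worldPos, 1.0f));

// geometry shader: the positions are now world space, so only apply
// the view-projection matrix and drop the _World2Object part
pIn.pos = mul(UNITY_MATRIX_VP, v[2]);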

However, keep in mind that when your gameobject is scaled or rotated you will again get the wrong result. The positions you generated are in local space, one unit apart. When you transform them into world space they go through the scale / rotation / translation of your gameobject to reach their world space position. If you then feed world space coordinates into your geometry shader, you are actually adding "1" in world space, so when the object is rotated or scaled you will get the wrong positions.

The actual solution here would be to not transform your vertices at all inside the vertex shader; you can simply pass along the local space coordinates. Inside the geometry shader you would then create the other corners of the triangle, also in local space. Finally you would use the MVP matrix to transform them into their final screen space position.
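Roughly like this, keeping your names (untested, so treat it as a sketch):

ps_input vert(uint id : SV_VertexID)
{
	ps_input o;
	// just pass the local space point through, no transform here
	o.pos = float4(square_points[id], 1.0f);
	return o;
}

[maxvertexcount(3)]
void GS_Main(point gs_input p[1], inout TriangleStream<gs_input> triStream)
{
	// build the triangle in local space, 1 unit apart like your compute output
	float4 v[3];
	v[0] = float4(p[0].pos.x, p[0].pos.y, p[0].pos.z, 1);
	v[1] = float4(p[0].pos.x + 1.0f, p[0].pos.y, p[0].pos.z, 1);
	v[2] = float4(p[0].pos.x + 1.0f, p[0].pos.y + 1.0f, p[0].pos.z, 1);

	gs_input pIn;
	// only here transform from local space all the way to clip space
	pIn.pos = mul(UNITY_MATRIX_MVP, v[2]);
	triStream.Append(pIn);
	pIn.pos = mul(UNITY_MATRIX_MVP, v[1]);
	triStream.Append(pIn);
	pIn.pos = mul(UNITY_MATRIX_MVP, v[0]);
	triStream.Append(pIn);
}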

Btw: you should be able to do something like this:

v[0] = v[1] = v[2] = float4(p[0].pos.xyz, 1.0f);
v[1].xy += float2(1.0f, 0.0f);
v[2].xy += float2(1.0f, 1.0f);

Not sure if the syntax is right ^^. Also keep in mind that Unity works clockwise by default. You arranged v[0] / v[1] / v[2] counter-clockwise and then emit them in reverse order, which makes them clockwise again. It's just a bit harder to read / understand.
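If you want the same clockwise triangle but in a more readable emit order, you could append them like this (again untested, using the MVP version from above):

pIn.pos = mul(UNITY_MATRIX_MVP, v[0]);
triStream.Append(pIn);
pIn.pos = mul(UNITY_MATRIX_MVP, v[2]);
triStream.Append(pIn);
pIn.pos = mul(UNITY_MATRIX_MVP, v[1]);
triStream.Append(pIn);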

Oh, and the so-called "swizzle" operators allow much crazier stuff, like:

float4 a = pos.zxxy;

This will be the same as:

float4 a = float4(pos.z, pos.x, pos.x, pos.y);

See the Nvidia Cg documentation

@srimanth_d @Bunny83 I tried to compile this code (the corrected version) but it just gives syntax errors, regardless of whether I use a ".shader" or ".compute" file extension. I can't find any explanation in the manual for using a shader that combines a compute, geometry, vertex and fragment shader.