I’m using a vertex shader that offsets geometry based on its distance from the camera, to create the path-bending effect you see in most 3D endless runners. I’m getting two different offset results depending on whether the shader is applied to mesh geometry or to a particle system, and I can’t figure out why.
Here’s the base vertex function:
void vert (inout appdata_full v)
{
    float4 vPos = mul(_Object2World, v.vertex);
    float3 camRelativePos = _WorldSpaceCameraPos - vPos.xyz; // .xyz so the float3/float4 types match
    float zOff = camRelativePos.z / _Dist;
    vPos.xyz += _QOffset.xyz * zOff * zOff; // offset only xyz, leaving w untouched
    vPos = mul(_World2Object, vPos);
    v.vertex = vPos;
    v.vertex.xyz *= unity_Scale.w;
}
_QOffset is a global vector specifying the base amount of X and Y offset to apply, and _Dist is a global float specifying the distance from the camera over which _QOffset is ramped in: the closer a vertex is to the camera, the less _QOffset contributes to it.
When I applied that vertex function to a particle system using the Stretched Billboard render mode, the particles initially stopped rendering entirely. I had to tweak the particle shader to get them to show up again. Here’s that version’s vertex function:
vertexOutput vert(vertexInput input)
{
    vertexOutput output;
    float4 vPos = mul(_Object2World, input.vertex);
    float3 camRelativePos = _WorldSpaceCameraPos - vPos.xyz; // .xyz so the float3/float4 types match
    float zOff = camRelativePos.z / _Dist;
    vPos.xyz += _QOffset.xyz * zOff * zOff; // offset only xyz, leaving w untouched
    vPos = mul(_World2Object, vPos);
    output.pos = vPos;
    output.pos.xyz *= unity_Scale.w;
    output.pos = mul(UNITY_MATRIX_MVP, output.pos); // <-- THIS IS NEW
    output.texc = input.texcoord.xy;
    output.color = input.color * _Color;
    return output;
}
There are a few other differences here (the return value, the texture coordinates, and so on) because the particle shader uses a fragment function while the original shader uses a surface function. As you can see, though, the actual vertex transformation code is identical except for the matrix multiply I called out.
With that modification the particles show up again, but they’re not quite in the right position. I have a mesh (using the first shader) and a particle system (using the second shader) placed at exactly the same position as each other. Close to the camera everything lines up, but the further they get from the camera, the more they appear to diverge from one another.
I’m sure this is something simple and stupid, but I barely understand shaders in the first place. For example, I don’t understand why I needed to add that matrix multiply to get the particles to show up at all, and I obviously don’t understand why I’m getting two totally different results depending on whether I’m rendering geometry or particles.
Any insight would be greatly appreciated! <3