Degenerate triangles are fine, and are indeed skipped by the GPU during rasterization (i.e., the fragment shader won’t run on them), but all other shader stages still run. Also, a vertex shader has no knowledge of triangles, only of individual vertices, and each vertex may be shared by one or more triangles. If the other “non-zero” vertices you’re setting to zero are shared with neighboring triangles, you may be stretching those triangles while the one you intended to hide is indeed being hidden.

If you want to skip triangles that have one vertex that is “bad”, then you can modify the mesh from script, or use a geometry shader or …

… that. Any triangle with a NaN vertex position will not be rendered, just like a degenerate triangle.

So, how do you set a NaN? You can’t just do `o.pos = NaN;`, as there’s no NaN constant in HLSL or GLSL. However, you can create one with `0.0/0.0` or `sqrt(-1)`.
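
Both expressions rely on standard IEEE 754 float behavior, which you can sanity-check on the CPU side in plain C# (unlike integer division, float division by zero doesn’t throw):

C#

```
float nan1 = 0.0f / 0.0f;                    // NaN, no exception thrown
float nan2 = (float)System.Math.Sqrt(-1.0);  // also NaN
bool selfUnequal = nan1 != nan1;             // true: NaN compares unequal to everything, itself included
bool isNan = float.IsNaN(nan2);              // true
```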

```
v2f vert (appdata v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    // all() is needed here: a vector == vector comparison is per-component
    if (all(v.vertex.xyz == float3(0,0,0)))
        o.pos.w = 0.0 / 0.0;
    return o;
}
```

Both will produce warnings when compiling your shader, though. These can be safely ignored, but note that some platforms may optimize the NaNs away (mainly old mobile GPUs at this point). If the warnings or old GLES 2.0 platforms are a problem for you, use a C# script to set a global NaN shader value and use that instead.

C#

```
Shader.SetGlobalFloat("_NaN", System.Single.NaN);
```

HLSL

```
float _NaN;

v2f vert (appdata v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    // all() is needed here: a vector == vector comparison is per-component
    if (all(v.vertex.xyz == float3(0,0,0)))
        o.pos.w = _NaN;
    return o;
}
```
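
Alternatively, if you’d rather go the “modify the mesh from script” route mentioned at the top, here’s a minimal C# sketch. The “bad vertex” test is a hypothetical placeholder (here, any vertex sitting at the origin); it rebuilds the index buffer, dropping every triangle that touches a bad vertex:

C#

```
using System.Collections.Generic;
using UnityEngine;

public class SkipBadTriangles : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] verts = mesh.vertices;
        int[] tris = mesh.triangles;
        var kept = new List<int>(tris.Length);

        for (int i = 0; i < tris.Length; i += 3)
        {
            // Hypothetical "bad" test: any vertex of the triangle at the origin.
            bool bad = verts[tris[i]] == Vector3.zero
                    || verts[tris[i + 1]] == Vector3.zero
                    || verts[tris[i + 2]] == Vector3.zero;
            if (!bad)
            {
                kept.Add(tris[i]);
                kept.Add(tris[i + 1]);
                kept.Add(tris[i + 2]);
            }
        }

        mesh.triangles = kept.ToArray();
    }
}
```

Note this permanently edits the runtime mesh instance, so keep a copy of the original index array if you need to restore the hidden triangles later.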