Hello there,
Recently I’ve had an issue where a procedural mesh I create isn’t correctly changing color based on a color array I provide. I currently have
Mesh mesh = GetComponent<MeshFilter>().sharedMesh; // cache the mesh rather than fetching it every iteration
var newColors = new List<Color>();
for (int i = 0; i < mesh.vertexCount; i++)
{
    newColors.Add(Color.red);
}
mesh.colors = newColors.ToArray(); // one Color per vertex
which assigns Color.red to every vertex in the procedural mesh. However, even though the mesh registers that it now has colors assigned, it remains gray. The Shader Graph I’m using is
which should just set the material’s albedo to the vertex color. To troubleshoot, I tried making another example mesh, just a triangle with every vertex set to red (in the same way as the code above), and the color showed up fine. It might be worth mentioning that neither mesh has any UVs set, since I don’t plan on applying a texture, only vertex colors. Would anyone happen to know what I’m doing wrong here? I can’t seem to pinpoint why the color shows on one procedural mesh and not the other.
Other images:
Procedural mesh that remains gray
Triangle example that uses the same material (and works):
If you need any more details I’ll be happy to provide them.
Maybe the mesh has 32-bit colors and the Shader Graph prioritizes those? Sorry, I’m not super familiar with how Shader Graph works; your code snippet looks OK to me though. Try setting colors32 on the mesh instead and see if that works.
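Something along these lines (untested sketch, just your loop adapted to Color32):
Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
var colors32 = new Color32[mesh.vertexCount]; // Color32 stores 4 bytes per vertex instead of 4 floats
for (int i = 0; i < colors32.Length; i++)
{
    colors32[i] = new Color32(255, 0, 0, 255); // opaque red
}
mesh.colors32 = colors32;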
Already tried using 32-bit colors instead, no dice. I don’t see any difference whatsoever between the triangle I generated and the actual procedural mesh.
Well, not sure, maybe the geo construction? Perhaps normals, tangents, smoothing, or culling?
Does the mesh render “inside-out” once it’s bigger than a triangle? Does it triangulate correctly? Does it just sht-the-bed when sending v2frag data?
Perhaps try adding a UV channel and see if something changes (rough snippet at the end of this post).
Procedural quad? Procedural pentagon? Are the results the same?
Try giving that PBR shader some metal/shiny/emissive and a light; is the geo even working as expected?
Does the vertex color work with an even more basic shader? Try a simple unlit shader.
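For the UV test, something like this might do (untested; just a planar projection to rule UVs out):
Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
Vector3[] verts = mesh.vertices;
var uvs = new Vector2[verts.Length];
for (int i = 0; i < verts.Length; i++)
{
    uvs[i] = new Vector2(verts[i].x, verts[i].z); // flat projection onto the XZ plane, the exact values shouldn't matter here
}
mesh.uv = uvs;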
Normals are calculated using Mesh.RecalculateNormals, and the other mesh data is filled out roughly as shown below (the actual vertex/triangle generation is omitted):
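// Outline only; `vertices` and `triangles` stand in for the procedurally generated data
Mesh mesh = new Mesh();
mesh.vertices = vertices;
mesh.triangles = triangles;
mesh.RecalculateNormals();
mesh.RecalculateBounds();
GetComponent<MeshFilter>().sharedMesh = mesh;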
I’ve tried calculating the UVs; nothing changed. I don’t believe it’s the material itself, because when I apply other standard materials, none of them show up on the mesh either. I’m not exactly sure what you mean by “does it just sht-the-bed when sending v2frag data?”, but it does indeed triangulate correctly; the wireframe view looks fine. I got the same results with an unlit shader, and switching to either quads or pentagons would break my current system of generation.
However, it would seem they were using the built-in render pipeline to allow this change, and as far as I know LWRP currently doesn’t have any support for deferred rendering. Is there a different solution or fix for this?
I initially started the project with the URP template, so I’m assuming all of that setup was already done beforehand. I can’t exactly just apply the material to any primitive object, because it would just show up gray (it uses the vertex color as the albedo, which is provided via script). The triangle mesh I created, however, did appear to stay red no matter what angle I looked at it, so I’m at a bit of a loss.
After doing this, the procedural mesh still remained gray, while the other procedural triangle example changed properly to that albedo color. The effect where the color only shows up when the camera is very close to the mesh at a certain angle remains for the large procedural mesh. I think it’s more of a problem with the camera/rendering settings than with the actual shader properties.