I have C# code that generates a pair of procedural meshes which are, by design, very low-poly. There can be a lot of instances of these pairs in a scene, and the two meshes always use the same Material but with different per-GameObject parameters passed to the shader (which is a custom shader I’ve created) via a MaterialPropertyBlock.
To improve performance, I procedurally combine these two meshes during initialization or any time the input meshes change at runtime (which is rare but possible). I create a single combined mesh with two submeshes, and I assign the same Material to each submesh. I can’t (currently) allow the submeshes to be merged, as I’ll explain below.
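For anyone who wants to see the shape of that combine step, here's a minimal sketch. The names (`filterA`, `filterB`) are placeholders; the key detail is `mergeSubMeshes: false`, which keeps each source mesh as its own submesh:

```csharp
// Combine two source meshes into one mesh with two submeshes.
// filterA and filterB are illustrative MeshFilter references.
var combine = new CombineInstance[]
{
    new CombineInstance { mesh = filterA.sharedMesh, transform = filterA.transform.localToWorldMatrix },
    new CombineInstance { mesh = filterB.sharedMesh, transform = filterB.transform.localToWorldMatrix },
};

var combined = new Mesh();
// mergeSubMeshes: false keeps each input as a separate submesh, which is
// what makes a per-submesh MaterialPropertyBlock possible later.
combined.CombineMeshes(combine, mergeSubMeshes: false, useMatrices: true);
GetComponent<MeshFilter>().sharedMesh = combined;   // combined.subMeshCount == 2
```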
In order to have a separate MaterialPropertyBlock for each submesh, I’ve found that I have to treat my single Material as if it were two different ones, i.e. something like the following (approximate; I don’t have my source in front of me right now):
myMeshRenderer.sharedMaterials = new Material[] { myMaterial, myMaterial };
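Fleshed out, the per-submesh setup looks roughly like this. The property names (`_PrimaryColor`, `_SecondaryColor`) are placeholders for whatever the custom shader actually declares:

```csharp
// Duplicate the single Material so each submesh gets its own material slot,
// then bind a distinct MaterialPropertyBlock to each slot by index.
myMeshRenderer.sharedMaterials = new Material[] { myMaterial, myMaterial };

var blockA = new MaterialPropertyBlock();
blockA.SetColor("_PrimaryColor", primary);
blockA.SetColor("_SecondaryColor", secondary);
myMeshRenderer.SetPropertyBlock(blockA, 0);   // material index 0 -> submesh 0

var blockB = new MaterialPropertyBlock();
blockB.SetColor("_PrimaryColor", secondary);  // colors swapped for the second submesh
blockB.SetColor("_SecondaryColor", primary);
myMeshRenderer.SetPropertyBlock(blockB, 1);   // material index 1 -> submesh 1
```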
This is because the second parameter of Renderer.SetPropertyBlock() is the material index – in other words, there isn’t a way within the mesh itself to indicate which Material applies to particular triangles.
All of the above works perfectly, but I’m getting more SetPass() calls than I’d like. The shader already has two passes because I need a depth-only pass (I’m working to eliminate that requirement, but that’s a different topic). With two submeshes, I now get four SetPass() calls per procedural object. Fully combining the submeshes would cut that from four to two, and if I can also solve the depth-pass problem (which I believe I can), I’d be down to just one SetPass() call.
My custom shader receives (among other parameters) two Color values, one for the “primary color” and one for the “secondary color”. Every triangle is rendered using these two colors only, but the details of which color goes where (within the triangle) are exactly opposite between the two submeshes. Other parameters govern some additional cosmetics, but those other parameters are common to both submeshes. In other words, the fragment portion of the shader could be given the same primary and secondary colors for both submeshes – as long as there is a way for it to determine which submesh it is currently rendering.
If I could do that, then I wouldn’t even need two submeshes – I could combine the meshes fully, have just one Material and one MaterialPropertyBlock (because the colors are still local to each GameObject).
To work around this, the only method I’ve come up with so far is to stash some extra data in an otherwise-unused channel of the mesh. My custom shader is unlit and textureless, so the UVs aren’t used now and won’t be in the future; I’m therefore considering using one of the higher-numbered UV channels. Alternatively, I could use the vertex colors. In either case, the extra data per vertex would simply be the answer to “which submesh index contains this vertex?”
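A sketch of the vertex-color variant, assuming the two source meshes are fully merged afterward (the helper name `TagAndMerge` is made up for illustration):

```csharp
// Tag every vertex with the index of the source mesh it came from, using
// the red channel of the vertex colors, then merge into a single submesh.
// The shader can branch on that value instead of on submesh identity.
static Mesh TagAndMerge(Mesh meshA, Mesh meshB)
{
    var tagged = new[] { Object.Instantiate(meshA), Object.Instantiate(meshB) };
    for (int i = 0; i < tagged.Length; i++)
    {
        var colors = new Color[tagged[i].vertexCount];
        for (int v = 0; v < colors.Length; v++)
            colors[v] = new Color(i, 0, 0, 1);   // r = source-mesh index (0 or 1)
        tagged[i].colors = colors;
    }

    var combine = new CombineInstance[]
    {
        new CombineInstance { mesh = tagged[0] },
        new CombineInstance { mesh = tagged[1] },
    };
    var merged = new Mesh();
    // mergeSubMeshes: true -> one submesh, one Material slot, one property block.
    merged.CombineMeshes(combine, mergeSubMeshes: true, useMatrices: false);
    return merged;
}
```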
If you’re still with me after all of this lengthy context, thank you! Here, finally, are my questions:
- I haven’t found such a thing yet in the documentation, but is there an existing helper function in the shader language that I can use to obtain the integer submesh index? If I can do that, problem solved, and I don’t need to stuff any additional data into the mesh.
- Failing #1, is there a helper function for the fragment shader to obtain the triangle number (that is, the subscript of the current triangle in the Mesh.triangles array)? Again, if I can interrogate that data, I can solve the problem easily.
- Is it a really terrible idea to put values into a secondary UV channel, or the vertex color channel, that my custom shader interprets in unusual ways? It occurs to me that weird values in a UV channel could cause problems if my Material somehow got switched to a different shader. In theory, though, any arbitrary RGBA data is allowed in a color channel; at worst it might produce some really ugly colors, but it wouldn’t actually break anything. So I’m thinking that using a color channel would be safer, if I really must go down this road.
- Does anyone have an alternative method I might not have considered?
- Am I more worried about the number of SetPass() calls than I should be? In particular, just how much real-world performance impact will I see from a depth-only pass?
I suspect #1 and #2 are not viable, because the member properties of the Mesh class are specific to Unity and don’t necessarily correspond directly to anything the shader program knows about.
With respect to #3, it’s extremely unlikely my Material would get assigned to the wrong shader, because I create the Material (a shared static member) procedurally during initialization.
Any suggestions or comments welcome. Thanks!