2 (or more) Layer Shader Combining Diffuse, Ambient, Spec...

Today I started messing around with ShaderLab to try to cook up some shaders I need, and it’s all a bit overwhelming. I’m willing to mess around and try to learn, but I need some help figuring out whether something is possible and how I would start going about it.

What I want to do is something like the TerrainTwoLayer shader on the UnifyCommunity wiki, but blending between all the attributes of a typical shader (diffuse color, ambient color, specular color, emission color, shininess, and a texture with transparency based on its alpha). I was looking at using the “combine” operator with the “Material” or “UsePass” constructs, but I’m pretty sure that’s not possible.

Eventually I want to do bump-mapping and 3 or even 4 layers, but those aren’t that important right now.

From my understanding of OpenGL shading, I’m thinking that either the main lighting pass will have to do nothing and the texturing pass will have to redo what the lighting pass would have done, based on the layer mask; or the lighting pass will have to do everything in full and the texturing pass will use the results of that to different degrees. But it’s probably not even possible without a whole lot of custom Cg code.

Any help will be greatly appreciated.

Ok, so what you want now would be this (just to check whether I understood it correctly):

A VertexLit shader with two Main Colors, two Spec Colors, two Emissive Colors, two Shininess values, and two textures, interpolating between each pair based on the first texture’s alpha (or on some additional texture?).

Now, the “proper” way of doing that - actually blending the specular color between the two before any calculations are done, and so on - is not really possible (in a VertexLit shader). The calculations are done at the vertices, and you’d need the texture’s alpha value before those calculations!

What could work, though, is to compute all the lighting with the first set of parameters, compute it again with the second set, and then interpolate between the final results based on the texture’s alpha. This can easily be done in two passes: one uses the first set of parameters and multiplies its result by alpha; the other uses the second set of parameters, multiplies its result by (1 - alpha), and adds it onto whatever the first pass has rendered. I’d imagine Blend commands would be the best way to do that in both passes.
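Just as a rough, untested sketch of what I mean (fixed-function ShaderLab only; the second-layer property names like _Color2 and _MainTex2 are placeholders I made up, and I’m assuming the blend mask sits in the first texture’s alpha channel), it could look something like this:

    Shader "Examples/TwoLayerVertexLitBlend" {
        Properties {
            // Layer 1
            _Color ("Main Color (layer 1)", Color) = (1,1,1,1)
            _SpecColor ("Spec Color (layer 1)", Color) = (1,1,1,1)
            _Emission ("Emissive Color (layer 1)", Color) = (0,0,0,0)
            _Shininess ("Shininess (layer 1)", Range (0.01, 1)) = 0.7
            _MainTex ("Layer 1 (RGB), blend mask (A)", 2D) = "white" {}
            // Layer 2
            _Color2 ("Main Color (layer 2)", Color) = (1,1,1,1)
            _SpecColor2 ("Spec Color (layer 2)", Color) = (1,1,1,1)
            _Emission2 ("Emissive Color (layer 2)", Color) = (0,0,0,0)
            _Shininess2 ("Shininess (layer 2)", Range (0.01, 1)) = 0.7
            _MainTex2 ("Layer 2 (RGB)", 2D) = "white" {}
        }
        SubShader {
            // Pass 1: full vertex lighting with the first parameter set,
            // written to the framebuffer scaled by the mask alpha
            // (framebuffer = result1 * alpha).
            Pass {
                Material {
                    Diffuse [_Color]
                    Ambient [_Color]
                    Specular [_SpecColor]
                    Emission [_Emission]
                    Shininess [_Shininess]
                }
                Lighting On
                SeparateSpecular On
                Blend SrcAlpha Zero
                SetTexture [_MainTex] {
                    combine texture * primary DOUBLE, texture
                }
            }
            // Pass 2: full vertex lighting with the second parameter set,
            // scaled by (1 - alpha) and added on top
            // (framebuffer += result2 * (1 - alpha)).
            Pass {
                Material {
                    Diffuse [_Color2]
                    Ambient [_Color2]
                    Specular [_SpecColor2]
                    Emission [_Emission2]
                    Shininess [_Shininess2]
                }
                Lighting On
                SeparateSpecular On
                Blend OneMinusSrcAlpha One
                SetTexture [_MainTex2] {
                    combine texture * primary DOUBLE
                }
                SetTexture [_MainTex] {
                    // keep the lit color from the previous stage,
                    // take alpha from the first texture's blend mask
                    combine previous, texture
                }
            }
        }
        Fallback "VertexLit"
    }

The idea is that Blend SrcAlpha Zero makes the first pass write its lit color multiplied by the mask alpha, and Blend OneMinusSrcAlpha One makes the second pass add its lit color multiplied by (1 - alpha), so the two passes sum to the interpolated result.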

Yes, this sounds exactly like what I need. I seem to remember a thing or two about blend modes from doing OpenGL/GLUT programming, so I’ll take a crack at it. I guess if I can’t figure it out at the moment, I can probably use multiple materials on the object, despite the rendering cost. Thanks.

Actually, if you can get by with two materials in this case (i.e. you don’t have to smoothly blend between the two), go for it. Rendering two materials on a single object and rendering one object in two passes cost almost the same: the first is a bit heavier on the CPU, the second may be heavier on the GPU.

I definitely need to smoothly blend between the materials; I thought that was possible using multiple materials. Wouldn’t setting the alpha on the main texture of most materials (“Alpha/Glossy”, for example) allow the second material to show through only as much as its alpha says? Thanks.

Alpha materials are meant for partially transparent objects. So yes, they do show only as much of the material as the alpha channel tells them to, but they show the background as “the other part”, not another material on the same object.
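For example, a typical alpha-blended material ends up using something like

    Blend SrcAlpha OneMinusSrcAlpha

so the part that alpha doesn’t take from the material’s own color comes from whatever is behind the object, not from a second material on it. The two-pass setup sketched above (SrcAlpha Zero plus OneMinusSrcAlpha One) is what makes “the other part” come from the second layer instead.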