Before I get to my specific questions, I'll first describe what I'm going for in case there is a better approach: for a stylized "low-poly" aesthetic, I currently use Unity's built-in Legacy Vertex Lit rendering path with a "LightMode" = "Vertex" shader in which I calculate the lighting manually in the vertex shader, write it into nointerpolation fields, and read those in the fragment shader. This works and looks good, giving a nice faceted look; however, I would like to add baked shadows (per fragment, of course). For convenience, and since my experiments with baking shadows to a texture in Blender were unsatisfying, I would like to (ab)use Unity Lightmaps, specifically Subtractive ones. I still want to calculate lighting at runtime and only extract shadow and ambient-occlusion information from the Lightmap (I don't care about bounce lighting), which is why I want to average the Lightmap's RGB channels into a sort of colorless "brightness value" that I can multiply my albedo by.
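For context, here is a trimmed sketch of what my "LightMode" = "Vertex" pass does (names abbreviated; the lighting loop mirrors the ShadeVertexLights helper from UnityCG.cginc, so treat the details as illustrative rather than my exact shader):

```hlsl
// Inside a Pass with Tags { "LightMode" = "Vertex" }
sampler2D _MainTex;

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;
    nointerpolation fixed3 light : COLOR0; // flat per-face shading when vertices are split
};

v2f vert(appdata_base v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    o.uv  = v.texcoord;

    // Manual per-vertex lighting in view space
    float3 viewPos = UnityObjectToViewPos(v.vertex);
    float3 viewN   = normalize(mul((float3x3)UNITY_MATRIX_IT_MV, v.normal));
    float3 light   = UNITY_LIGHTMODEL_AMBIENT.rgb;
    for (int i = 0; i < 4; i++)
    {
        // w == 0 for directional lights, so toLight is just the light direction
        float3 toLight = unity_LightPosition[i].xyz - viewPos * unity_LightPosition[i].w;
        float  lenSq   = dot(toLight, toLight);
        float  atten   = 1.0 / (1.0 + lenSq * unity_LightAtten[i].z);
        light += unity_LightColor[i].rgb * max(0.0, dot(viewN, normalize(toLight))) * atten;
    }
    o.light = light;
    return o;
}

fixed4 frag(v2f i) : SV_Target
{
    return fixed4(tex2D(_MainTex, i.uv).rgb * i.light, 1.0);
}
```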
To do all that, I changed my shader's LightMode from Vertex to VertexLM so I can sample Lightmaps. That part works, but I can no longer access the brightest/most important light source (via unity_LightColor[0]; e.g. when I have three directional lights, only the first two array entries are populated with non-black colors). I believe this is because, according to Unity - Manual: Forward rendering path, the "brightest directional light is always per-pixel", which I assume in this Legacy Vertex Lit / VertexLM case means it isn't passed to the shader via the unity_Light… variables and is instead expected to be handled via the Lightmap.
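For reference, this is roughly how I sample the Lightmap and flatten it to a brightness value in the VertexLM pass (again a sketch; the vertex lighting itself is elided since it's the same as in the Vertex pass):

```hlsl
// Inside a Pass with Tags { "LightMode" = "VertexLM" }
sampler2D _MainTex;

struct v2f
{
    float4 pos  : SV_POSITION;
    float2 uv   : TEXCOORD0;
    float2 lmUV : TEXCOORD1;
    nointerpolation fixed3 light : COLOR0;
};

v2f vert(appdata_full v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    o.uv  = v.texcoord;
    // Lightmap UVs live in the second UV channel, scaled/offset by unity_LightmapST
    o.lmUV = v.texcoord1.xy * unity_LightmapST.xy + unity_LightmapST.zw;
    o.light = fixed3(1, 1, 1); // placeholder: manual vertex lighting as in the Vertex pass
    return o;
}

fixed4 frag(v2f i) : SV_Target
{
    // Decode the (dLDR/RGBM-encoded) lightmap, then average its RGB channels
    // into a colorless brightness to modulate the albedo with
    fixed3 lm = DecodeLightmap(UNITY_SAMPLE_TEX2D(unity_Lightmap, i.lmUV));
    fixed shadow = dot(lm, 1.0 / 3.0);
    return fixed4(tex2D(_MainTex, i.uv).rgb * i.light * shadow, 1.0);
}
```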
Is there an easy way to get access to that light source in this setup?
My current dubious workaround is to add an extra, extremely bright light marked Important (to ensure it is ranked as the most important) with a Culling Mask set to only affect the objects using the special shader. This dummy light is then ignored in the shader, and I can access the "real" most important light again.