I’ve been struggling to work out how to include SH (spherical harmonics) calculations in a custom lighting model for our project.
First, let me explain what is going on. As a disclaimer, I’m pretty much a beginner when it comes to shaders, and most of the logic is being done in ShaderForge (we plan on rewriting the shaders later on, but that should do for now; besides, I don’t believe this is an asset-related question, so please bear with me).
So, we’re working on a project that is trying to emulate a flat-ish lighting style, and after some tweaks we got a custom lighting model we were happy with. However, I later found out that light probe contribution needs to be added manually in the shader, so I did some research and found out about the ShadeSH9 function in UnityCG.cginc, which you can also read about in the Unity Docs.
I did get it to work in a test shader in ShaderForge by adding a custom Code node with the object normal direction as input.
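In case it helps to see it, the Code node body is essentially just a ShadeSH9 call. This is a minimal sketch: `normalDir` is just what I named the node's normal input, and if I'm reading the docs right, ShadeSH9 actually expects the world-space normal in xyz with 1 in w:

```hlsl
// Minimal ShaderForge Code-node body sketch.
// ShadeSH9 is declared in UnityCG.cginc and evaluates the baked L0-L2
// spherical-harmonics (light probe / ambient) contribution for a normal.
float3 Evaluate(float3 normalDir)
{
    return ShadeSH9(float4(normalDir, 1.0));
}
```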
This was definitely what I was looking for! However, the light probe contribution works in a kind of Lambertian way, so our custom lighting would get ruined if we simply added that result on top of our current lighting calculation.
The worst part is that, from the looks of it, ShadeSH9 already returns a float3 (RGB) result, meaning I can’t split the light and color information into two separate passes, use the light contribution as a float1 value, mix it with the other tweaks, and then multiply by the light probe color contribution at the end. At least, I have no idea how to do that.
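To make the split I'm after a bit more concrete, here is a rough sketch of what I imagine; this is purely hypothetical, `Luminance` is the helper from UnityCG.cginc, and `ApplyOurRamp` is just a stand-in name for our custom lighting tweaks:

```hlsl
// Hypothetical split of the SH result into intensity and chroma.
float3 sh = ShadeSH9(float4(worldNormal, 1.0));

float  shIntensity = Luminance(sh);               // the float1 I'd like to feed through our tweaks
float3 shColor     = sh / max(shIntensity, 1e-4); // normalized color, reapplied at the end

float  tweaked = ApplyOurRamp(shIntensity);       // stand-in for our custom lighting math
float3 result  = tweaked * shColor;
```

Whether splitting SH like this is even a sane idea is exactly what I'm unsure about.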
Can anyone help me solve this dilemma? I would love to use light probes with our custom lighting, since otherwise it would be a pain to bake the lightmaps, set the lights back to realtime, cull them so they only light the character, and redo all of that every time we need a new bake.
