# Fake Directional Lights for Deferred Surface Shader

I’m looking for ways to optimize the rendering in my game, and one of the big costs comes from using deferred rendering along with several directional light sources.

I was expecting it to be a fairly straightforward effort to get ‘fake’ directional lights working by simply passing a few vectors and colors into the shader as global variables, and for the most part I have it working. However, it’s not quite working the way I expected and I’m not familiar enough with the mathematics involved to understand what is happening.

I have a simple set of functions for calculating the fake lights:

```hlsl
float4 GlobalLight_0;
float4 GlobalLight_1;
float4 GlobalLight_2;

float3 GlobalLight_0_Dir;
float3 GlobalLight_1_Dir;
float3 GlobalLight_2_Dir;

half3 GlobalLight0(half3 norm, half3 diffuse)
{
    // alpha stores the light's intensity
    half nl = max(0, dot(norm.xyz, -GlobalLight_0_Dir));
    return diffuse * (GlobalLight_0.rgb * GlobalLight_0.a * nl);
}

half3 GlobalLight1(half3 norm, half3 diffuse)
{
    // alpha stores the light's intensity
    half nl = max(0, dot(norm.xyz, -GlobalLight_1_Dir));
    return diffuse * (GlobalLight_1.rgb * GlobalLight_1.a * nl);
}

half3 GlobalLight2(half3 norm, half3 diffuse)
{
    // alpha stores the light's intensity
    half nl = max(0, dot(norm.xyz, -GlobalLight_2_Dir));
    return diffuse * (GlobalLight_2.rgb * GlobalLight_2.a * nl);
}
```
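For reference, these globals can be pushed from script with `Shader.SetGlobalVector`. A minimal sketch, assuming the property names match the shader above (the component and the inspector-assigned light references are my own additions):

```csharp
using UnityEngine;

// Hypothetical helper: pushes up to three scene lights into the shader
// globals declared above. Attach to any object and assign the lights
// in the inspector.
public class FakeGlobalLights : MonoBehaviour
{
    public Light light0, light1, light2;

    void Update()
    {
        Apply("GlobalLight_0", "GlobalLight_0_Dir", light0);
        Apply("GlobalLight_1", "GlobalLight_1_Dir", light1);
        Apply("GlobalLight_2", "GlobalLight_2_Dir", light2);
    }

    static void Apply(string colorName, string dirName, Light l)
    {
        if (l == null) return;
        // rgb = light color, a = intensity, matching the shader's convention.
        Color c = l.color;
        Shader.SetGlobalVector(colorName, new Vector4(c.r, c.g, c.b, l.intensity));
        // A directional light shines along its forward axis; the shader
        // negates this to get the surface-to-light direction.
        Shader.SetGlobalVector(dirName, l.transform.forward);
    }
}
```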

My initial thought was to add this to the Albedo output, but when I tried that the tint and intensity were way off. On a hunch, I tried using the Emission value of the output instead, and the results turned out pretty close to the default look.

```hlsl
void surf (Input IN, inout SurfaceOutputStandard o)
{
    // Albedo comes from a texture tinted by color
    fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
    o.Albedo = c.rgb;

    // Metallic and smoothness come from slider variables
    o.Metallic = _Metallic;
    o.Smoothness = _Glossiness;
    o.Alpha = c.a;

    // no good
    //o.Albedo += GlobalLight0(o.Normal, c.rgb) + GlobalLight1(o.Normal, c.rgb) + GlobalLight2(o.Normal, c.rgb);

    // this works pretty well
    o.Emission = GlobalLight0(o.Normal, c.rgb) + GlobalLight1(o.Normal, c.rgb) + GlobalLight2(o.Normal, c.rgb);
}
```

So my questions are:

A) What would be the most technically correct way to accomplish the lighting effect I’m going for? And am I going to find I’ve shot myself in the foot if I use the emission property as a way of applying lights?

B) I’ve noticed that the colors are a bit off in some cases when compared to an actual light with the same intensity and color. The fake lights also tend to be more intense than the real lights at low intensity values, and less intense at higher intensities. I can only assume this is because Unity uses a more complicated lighting calculation, most likely involving PBR?

A - it’s fine. This is how ambient lighting works in deferred already. It might be more efficient to do this as a custom deferred multi-light pass using a command buffer, or by modifying the built-in deferred lighting shader.
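The command-buffer route could be sketched roughly like this; it is untested, and the `CameraEvent` choice and `fakeLightMaterial` (a hypothetical shader doing the `nl * color` math as a fullscreen pass against the G-buffer normals) are assumptions:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Sketch only: injects one fullscreen pass after Unity's deferred lighting,
// so the fake directional lights run once per frame instead of per material.
[RequireComponent(typeof(Camera))]
public class FakeLightPass : MonoBehaviour
{
    public Material fakeLightMaterial;
    CommandBuffer buffer;

    void OnEnable()
    {
        buffer = new CommandBuffer { name = "Fake directional lights" };
        // Additively blit over the lighting buffer; the additive blend mode
        // would live in the material's shader (e.g. Blend One One).
        buffer.Blit(BuiltinRenderTextureType.CameraTarget,
                    BuiltinRenderTextureType.CameraTarget,
                    fakeLightMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.AfterLighting, buffer);
    }

    void OnDisable()
    {
        GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.AfterLighting, buffer);
    }
}
```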

B - because you’re probably not calculating the intensity the same way. By default, Unity’s light colors and intensities are in gamma space. If you pass a “color” as a raw vector or array value, Unity doesn’t apply any of the usual color space correction it does to single color values, so you have to do it yourself.

```csharp
Vector4 lightColorVector = (light.color * light.intensity).linear;
```
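Wired into a setter, that correction might look something like this (the function and property names are assumptions, not Unity API beyond `Color.linear` and `Shader.SetGlobalVector`):

```csharp
using UnityEngine;

// Hypothetical setter applying the gamma-to-linear correction: fold the
// intensity into the color first, then convert the result to linear space
// before handing it to the shader as a raw vector.
static void SetFakeLight(string colorName, string dirName, Light l)
{
    Color linearColor = (l.color * l.intensity).linear;
    // Intensity is already baked into rgb here, so alpha is set to 1
    // (the shader multiplies by .a as its intensity).
    Shader.SetGlobalVector(colorName,
        new Vector4(linearColor.r, linearColor.g, linearColor.b, 1f));
    Shader.SetGlobalVector(dirName, l.transform.forward);
}
```

Note that because `x^2.2 < x` for values below 1 and `x^2.2 > x` above 1, skipping this conversion makes fake lights look too bright at low intensities and too dim at high ones, matching the behavior described in question B.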


My initial thought was to embed this into the already-modified deferred light shader I’m using; however, it appears that it only applies to the already-calculated light volumes, which is what I’m trying to avoid in the first place. Is there another way to apply this effect globally in a custom deferred shader? It would certainly be nice if I could apply these fake lights globally to any shader rather than having to use a specific shader for each material that needs it.