# Light Direction Lookup from SH

As I understand it, SH lighting creates a map of incoming diffuse lighting and stores the value as a color (which is then projected into SH as a means of low-pass filtering).

When reconstructing from the SH, you use a world normal to fetch the right color value. At this point, however, you have no light direction, correct?

So, my question is how can you get light direction from the SH? This is important for simulating the surface scattering properties of the material.

I've got my own method which fits in shader model 2.0, but it's rather expensive. I also know that you can use the methods outlined in the Stupid Spherical Harmonics Tricks paper, but those are prohibitively expensive and also require multiple lookups (or even iterative lookups).

Anyone out there doing similar work?

Besides this, I’m looking into alternative formulations for SH construction (there’s more than one way to skin a cat.) Anyone messing around with this?

I want the answer too - a clear, full, step-by-step explanation of how SH is built, how it works, and how it's refreshed.

Lulucifer - I don’t understand that sentence - try again?
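If part of the question is how the SH data gets built in the first place: projecting a lighting environment into SH just means integrating the incoming light against each SH basis function. Here's a minimal Python sketch of that projection (Monte Carlo over the sphere, bands 0 and 1 only - the function names and sample count are my own choices, not any engine's API):

```python
import math
import random

# Real SH basis functions for bands 0 and 1 (4 coefficients)
def sh_basis(d):
    x, y, z = d
    return [
        0.282095,      # Y(0,0): constant term
        0.488603 * y,  # Y(1,-1)
        0.488603 * z,  # Y(1,0)
        0.488603 * x,  # Y(1,1)
    ]

def project_to_sh(radiance, n_samples=50000, seed=1):
    """Monte Carlo projection of a spherical function onto 4 SH coefficients."""
    rng = random.Random(seed)
    coeffs = [0.0, 0.0, 0.0, 0.0]
    for _ in range(n_samples):
        # Uniformly distributed direction on the unit sphere
        z = rng.uniform(-1.0, 1.0)
        phi = rng.uniform(0.0, 2.0 * math.pi)
        r = math.sqrt(max(0.0, 1.0 - z * z))
        d = (r * math.cos(phi), r * math.sin(phi), z)
        value = radiance(d)
        for i, b in enumerate(sh_basis(d)):
            coeffs[i] += value * b
    # Each uniform sample covers 4*pi/N steradians of the integral
    scale = 4.0 * math.pi / n_samples
    return [c * scale for c in coeffs]

# A single "light" shining down the +z axis: radiance = max(0, z)
coeffs = project_to_sh(lambda d: max(0.0, d[2]))
```

Run it on max(0, z) - a light from +z - and the Y(1,0) coefficient dominates while the x and y terms come out near zero; that low-frequency summary is all that gets stored, which is the low-pass-filtering the OP mentions.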

```
//Look up the dominant light direction using the zonal harmonic components
float4 worldN = float4(mul((float3x3)_Object2World, v.normal * unity_Scale.w), 1.0f);
float dx = dot(worldN, unity_SHAr);
float dy = dot(worldN, unity_SHAg);
float dz = dot(worldN, unity_SHAb);
float3 d = mul((float3x3)_World2Object, float3(dx, dy, dz));
float3 n = v.normal * unity_Scale.w;
o.dir = mul(rotation, 0.5f * (n + d));
```

…and so it’s done.
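A related single-lookup trick (what the Stupid SH Tricks paper discusses as the optimal linear direction) is to take the luminance-weighted linear (l=1) coefficients and normalise them. A minimal Python sketch, assuming the coefficients are packed like Unity's unity_SHAr/g/b vectors (linear x, y, z terms in .xyz, constant in .w) - the helper name and the Rec. 601 luminance weights are my choices:

```python
import math

def dominant_light_direction(sh_r, sh_g, sh_b):
    """Estimate the dominant light direction from the linear (l=1) SH band.

    sh_r, sh_g, sh_b are 4-tuples packed like Unity's unity_SHAr/g/b:
    (x, y, z, constant) for each colour channel.
    """
    # Combine the per-channel linear terms with luminance weights (Rec. 601)
    d = [0.3 * sh_r[i] + 0.59 * sh_g[i] + 0.11 * sh_b[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in d))
    if length == 0.0:
        return (0.0, 0.0, 1.0)  # constant environment: no meaningful direction
    return tuple(c / length for c in d)

# A white light from +y shows up in the y linear term of every channel:
direction = dominant_light_direction((0.0, 1.0, 0.0, 0.5),
                                     (0.0, 1.0, 0.0, 0.5),
                                     (0.0, 1.0, 0.0, 0.5))
```

No extra texture fetches and no iteration - just three multiply-adds and a normalise - which is why it's popular for faking subsurface-style directional response.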

Would you explain more? I don't understand exactly.

This might help! Use this shader on a sphere and you can tweak the coefficients to see what they do.

```
Shader "Spherical Harmonic" {
    Properties {
        _SHAr ("First Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)
        _SHAg ("First Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)
        _SHAb ("First Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)

        _SHBr ("Second Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)
        _SHBg ("Second Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)
        _SHBb ("Second Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)

        _SHC ("Third Order Harmonic", Vector) = (0.0,0.0,0.0,0.0)
        _A ("Alpha", Float) = 0.5
    }

    SubShader {
        Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        Cull Off
        Lighting Off
        ZWrite On

        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma target 2.0

            #include "UnityCG.cginc"
            #include "Lighting.cginc"

            uniform float4 _SHAr;
            uniform float4 _SHAg;
            uniform float4 _SHAb;

            uniform float4 _SHBr;
            uniform float4 _SHBg;
            uniform float4 _SHBb;

            uniform float4 _SHC;

            uniform float _A;

            struct appdata_t {
                float4 vertex : POSITION;
                fixed4 color : COLOR;
                float3 normal : TEXCOORD0;
                float4 tangent : TEXCOORD1;
            };

            struct v2f {
                float4 pos : SV_POSITION;
                fixed4 color : COLOR;
            };

            v2f vert (appdata_t v)
            {
                v2f o;

                UNITY_DIRBASIS;

                //This is the spherical harmonic lookup: it takes a direction (here the world normal) and returns the reconstructed value in that direction
                //It comes from appendix A10 of Peter-Pike Sloan's paper "Stupid Spherical Harmonics (SH) Tricks"
                //It's a bit more complex than the simplest transform so it can be more efficient
                //http://www.ppsloan.org/publications/StupidSH36.pdf

                //It takes the coefficients (the numbers in the _SHA, _SHB and _SHC vectors) and combines them with polynomials of the direction to reconstruct the harmonic
                //Each coefficient weights one basis function over the sphere (axis by axis)
                //Each band (_SHA, _SHB, _SHC) adds higher frequency detail to the final reconstruction

                //It can be thought of as sound:
                //each separate frequency is a different pitch, and the final sound is all these pieces blended together

                float4 wN = v.vertex; //Here I'm using the vertex position since this is for a sphere around the origin
                //It's very similar to a Fourier transform, but across the surface of a sphere
                half3 x1, x2, x3;

                // Linear + constant polynomial terms
                x1.r = dot(_SHAr, wN);
                x1.g = dot(_SHAg, wN);
                x1.b = dot(_SHAb, wN);

                // 4 of the quadratic polynomials
                half4 vB = wN.xyzz * wN.yzzx;
                x2.r = dot(_SHBr, vB);
                x2.g = dot(_SHBg, vB);
                x2.b = dot(_SHBb, vB);

                // Final quadratic polynomial
                float vC = wN.x * wN.x - wN.y * wN.y;
                x3 = _SHC.rgb * vC;

                float3 shC = x1 + x2 + x3;

                //Setup the vertices
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);

                //The rgb color values also work as xyz offsets in space (color is a vector space, after all...)
                //i.e. positive and negative red translates verts left and right along the x axis
                //o.pos = mul(UNITY_MATRIX_MVP, float4(shC.rgb, 1.0f)); //Comment this in to see the shape of the spherical harmonic

                //Set the out color
                //o.color = float4(mul(shC, unity_DirBasis), _A); //What was I doing with this?!
                o.color = float4(shC, _A);

                return o;
            }

            fixed4 frag (v2f i) : COLOR
            {
                return i.color;
            }
            ENDCG
        }
    }
}
```
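For anyone who'd rather poke at the math outside a shader, the same polynomial evaluation can be sketched in a few lines of Python (one colour channel only; the function name is mine, the polynomial layout mirrors the vB/vC construction in the vertex shader):

```python
def eval_sh9(n, sh_a, sh_b, sh_c):
    """Evaluate one colour channel of third-order SH in polynomial form.

    n    -- unit direction (x, y, z)
    sh_a -- linear terms + constant, packed (x, y, z, constant)
    sh_b -- four quadratic coefficients
    sh_c -- the final quadratic coefficient
    """
    x, y, z = n
    # Linear + constant polynomial terms: dot(sh_a, (x, y, z, 1))
    value = sh_a[0] * x + sh_a[1] * y + sh_a[2] * z + sh_a[3]
    # Four of the quadratic polynomials: dot(sh_b, (x*y, y*z, z*z, z*x))
    value += sh_b[0] * x * y + sh_b[1] * y * z + sh_b[2] * z * z + sh_b[3] * z * x
    # Final quadratic polynomial: sh_c * (x^2 - y^2)
    value += sh_c * (x * x - y * y)
    return value

# With only the constant coefficient set, every direction returns that constant:
ambient = eval_sh9((0.0, 0.0, 1.0), (0.0, 0.0, 0.0, 0.7), (0.0, 0.0, 0.0, 0.0), 0.0)
```

Setting only the sh_c term and moving verts by the result (like the commented-out o.pos line in the shader) shows the x²−y² "cloverleaf" lobe shape.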

OK, my question is: can we use ShadeSH9 in the camera's VertexLit rendering path, or only in ForwardBase passes / surface shaders?
It seems that in a Pass with "LightMode" = "Vertex", ShadeSH9(float4(worldN, 1.0)) returns 0.0.

I baked two sets of light probes. How should I switch between them at run time? Currently I reference them in a script, but it does not work:

```
public LightProbes lp1;
public LightProbes lp2;
public Rect[] rect;

void OnGUI()
{
    if (GUI.Button(rect[0], "Switch to LightProbes 1"))
    {
        LightmapSettings.lightProbes = lp1; // = Instantiate(lp1) as LightProbes;
    }
    if (GUI.Button(rect[1], "Switch to LightProbes 2"))
    {
        LightmapSettings.lightProbes = lp2; // = Instantiate(lp2) as LightProbes;
    }
}
```