Sampling Lightmap in Surface Shader

I just want to sample the lightmap in a surface shader, but the compiler complains if I don’t declare unity_Lightmap, and if I do declare it, Unity complains that it’s declared twice!

I’ve searched a lot, but found very little help/discussion/examples around the forums about sampling Unity’s Beast lightmaps. This is about the most useful thread I’ve found: http://forum.unity3d.com/threads/135495-Sampling-Lightmap-in-a-Shader and I’ve seen others use unity_Lightmap and unity_LightmapST in custom vertex/fragment shaders, but not surface shaders.

Anyway, if I define:

sampler2D unity_Lightmap;
float4 unity_LightmapST;

I get Program ‘frag_surf’, declaration of “unity_Lightmap” conflicts with previous declaration

and if I don’t define it and just try to sample it (using any uv coords)

DecodeLightmap ( tex2D ( unity_Lightmap, IN.uv_MainTex) );

I get Program ‘SurfShaderInternalFunc’, undefined variable “unity_Lightmap”

I don’t understand how that’s even possible. It seems to be declared by Unity internally, but somehow not visible from my surface function?

Maybe you need an #ifdef/#else guard. Try #pragma debug and look at the generated code to see how Unity does it.
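For example, something like this; then open the compiled shader from the shader’s inspector (the exact button name varies by Unity version) to read what got generated:

        CGPROGRAM
        // ask the surface shader compiler to emit the generated
        // vertex/fragment code (with comments) into the compiled shader
        #pragma debug
        #pragma surface surf Ramp
        // ... rest of the shader unchanged ...
        ENDCG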

Well, if I wrap the code that uses the unity_Lightmap sampler in #ifdef FRAGMENT directives inside the surf function, it doesn’t give me errors, but it also doesn’t work, because FRAGMENT is never defined for it. If I declare the variables outside the surf function, I’m back where I was before.
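To be concrete, roughly what I tried (the #ifdef just swallows the line, so it compiles but never actually samples anything):

        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
            #ifdef FRAGMENT
                // never compiled in: FRAGMENT isn't defined for the code surf ends up in
                fixed3 lightmap = DecodeLightmap ( tex2D ( unity_Lightmap, IN.uv_MainTex) );
            #endif
            half rim = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal));
            o.Emission = _RimColor.rgb * pow (rim, _RimPower);
        }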

Here’s the full shader if anyone wants to take a crack at it. I just want to modulate the emissive rim so it doesn’t shine in areas shadowed by the lightmap.

Shader "Custom/Toon Rim" {    Properties {
        _MainTex ("Texture", 2D) = "white" {}
        _Ramp ("Shading Ramp", 2D) = "gray" {}
        _RimColor ("Rim Color", Color) = (0.26,0.19,0.16,0.0)
        _RimPower ("Rim Power", Range(0.1,8.0)) = 3.0      
    }
    
    SubShader {
    
        Tags { "RenderType" = "Opaque" }


        CGPROGRAM
        //#pragma debug
        #pragma surface surf Ramp
    
        sampler2D _Ramp;
        sampler2D _MainTex;
        //sampler2D unity_Lightmap;
        //float4 unity_LightmapST;
        float4 _Tint;
        float4 _RimColor;
        float _RimPower;      


        half4 LightingRamp (SurfaceOutput s, half3 lightDir, half atten) {
            half NdotL = dot (s.Normal, lightDir);
            half diff = NdotL * 0.5 + 0.5;
            half3 ramp = tex2D (_Ramp, float2(diff)).rgb;
            half4 c;
            half3 light = _LightColor0.rgb * ramp * (atten * 2);
            c.rgb = s.Albedo * light;
            c.a = 0;
            return c;
        }
        
        struct Input {
            float2 uv_MainTex;
            float3 viewDir;
        };
        
        void surf (Input IN, inout SurfaceOutput o) {
            //fixed3 lightmap = DecodeLightmap ( tex2D ( unity_Lightmap, IN.uv_MainTex) );
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
            half rim = 1.0 - saturate(dot (normalize(IN.viewDir), o.Normal));
            o.Emission = _RimColor.rgb * pow (rim, _RimPower);
        }
        ENDCG
    }


    Fallback "Diffuse"
}

Feels like every time I try to use Unity’s surface shaders I hit odd walls like this and end up having to redo everything in a fragment/vertex shader. I know Aras added them to save time and make things easier, but all they seem to have done for me over the past few years is obscure what’s actually happening and waste a lot of time. Seems a bit backwards to me.

fixed4 instead of float4?
That is what I am using.

No, it’s not that. The surf function seems to get compiled into the forward add pass as well, but the lightmaps are only declared and used in the forward base pass (where the lighting function is also used). It’s super confusing that we write surface shaders as if they’re a single pass with shared state, but when it’s compiled, Unity breaks it up, and each function sees different environment state because they end up in separate passes.

I’m thinking this is actually a bug. If Unity is going to be clever and automagically split the code into separate passes for surface/lighting-model shaders, then it should also be smart enough not to complain about duplicate definitions in one state/pass that don’t exist in another state/pass; otherwise we end up in odd situations like this.

Unless there’s some easy way around this I’m not aware of?
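The only thing I can come up with is keying everything off Unity’s lightmap keyword, something like the sketch below, but it’s a pure guess; whether the generated base pass declares unity_Lightmap before or after the point where my code lands is exactly what I can’t tell:

        // untested guess: only declare these in variants where the generated code
        // presumably hasn't already declared them (LIGHTMAP_ON should only be set
        // in variants that actually use a lightmap)
        #ifndef LIGHTMAP_ON
        sampler2D unity_Lightmap;
        float4 unity_LightmapST;
        #endif

        // ...and inside surf, only touch the sampler in those same variants
        #ifdef LIGHTMAP_ON
        fixed3 lightmap = DecodeLightmap ( tex2D ( unity_Lightmap, IN.uv_MainTex) );
        #endif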

Hey @Cameron
Did you get anywhere with this? Just trying to achieve the same thing here.

After much stochastic fiddling, I’m suspecting the only answer is to write vertex and fragment shaders (instead of a surface shader), perhaps copying the generated output from a version where I’d assigned the lightmap as an actual texture property on the shader (which produced a working result; it simply calculated and then ignored my own lightmap stuff, but still bound the lightmap texture an extra time).
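Roughly the shape I have in mind, in case it’s useful to anyone (completely untested sketch, forward base only, no real-time lights; the shader name is just a placeholder, and unity_Lightmap / unity_LightmapST are declared by hand the way the custom vert/frag examples linked above do it):

Shader "Custom/LightmapSampleSketch" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }

        Pass {
            Tags { "LightMode" = "ForwardBase" }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #pragma multi_compile LIGHTMAP_OFF LIGHTMAP_ON
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            // declared by hand, as in the custom vert/frag shaders mentioned earlier
            sampler2D unity_Lightmap;
            float4 unity_LightmapST;

            struct v2f {
                float4 pos  : SV_POSITION;
                float2 uv   : TEXCOORD0;
                #ifdef LIGHTMAP_ON
                float2 lmuv : TEXCOORD1;
                #endif
            };

            v2f vert (appdata_full v) {
                v2f o;
                o.pos = mul (UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX (v.texcoord, _MainTex);
                #ifdef LIGHTMAP_ON
                // lightmap UVs live in the second UV set, scaled/offset per renderer
                o.lmuv = v.texcoord1.xy * unity_LightmapST.xy + unity_LightmapST.zw;
                #endif
                return o;
            }

            fixed4 frag (v2f i) : COLOR {
                fixed4 c = tex2D (_MainTex, i.uv);
                #ifdef LIGHTMAP_ON
                c.rgb *= DecodeLightmap (tex2D (unity_Lightmap, i.lmuv));
                #endif
                return c;
            }
            ENDCG
        }
    }
    Fallback "Diffuse"
}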

Cheers, Rupert.

P.S. I just came across http://cheese.barkingmousestudio.com/post/54880102337/using-unity-lightmap-data-in-shaders which suggests it can’t be done as a surface shader. Confirmation from Unity welcome? :)

Anyone?

It’s probably not very easy to do. Using light maps bypasses any real-time lighting (and therefore defeats the purpose of the surface shader).

LightShape has a surface shader that uses lightmaps. I use the lightmap for spec occlusion. Maybe there’s something in there you can use?

There’s a download link near the bottom of the page to a file that has the shaders in it.
Note: I have not tried it in >4.1 Unity yet. Been busy.

http://cherubartist.com/lightshape/about-the-lightshape-shaders/