Why (atten * 2) ?

Just curious as to why the surface shaders all multiply the result by (atten * 2) for shadows/light attenuation?

atten is already in the 0-1 range, so multiplying it by 2 just means that everything not in shadow gets twice as bright. The net result is that light brightness is only really useful up to 0.5, or stuff starts to get really bleached.

What’s the logic there?

Unity doesn’t do HDR yet.

http://www.quaddicted.com/software-vs-glquake/software-vs-glquake-overbright-lighting/

This is not unique to surface shaders; Unity has done this from the beginning, except for non-Beast lightmapped shaders, for who knows what reason.

I see, so it is actually there to fake bleaching/overexposure.

Fair enough, I was just wondering :slight_smile:

I thought this was for the conversion from gamma to linear light.

No, this has nothing to do with linear lighting. It’s just a multiplication by two so that a single light can actually brighten an object beyond its albedo colour. This factor predates the addition of light intensity (up to 8).
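As a rough sketch of what that buys you (hypothetical names, not the actual generated shader code): with atten in the 0..1 range, a fully lit, unshadowed pixel can reach twice its albedo.

```c
#include <assert.h>

/* Hypothetical sketch of the forward per-light term.
 * The (atten * 2) lets a single light push a surface past its
 * albedo colour: albedo 0.5 can reach 1.0 when fully lit. */
float light_term(float albedo, float ndotl, float light_color, float atten)
{
    return albedo * light_color * ndotl * (atten * 2.0f);
}
```

With light_color clamped to 1 (in the days before the intensity slider), this was the only way a light could over-brighten a surface.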

Linear lighting is achieved primarily by using gamma-correct reads and writes for source textures, frame and light buffers.

No, that’d be pow(color, gamma) then pow(color, 1/gamma) :slight_smile:

That’s the function, yes, but it’s built right into the hardware read/write operations. This is why it’s called sRGB sampling, and why you don’t have to write custom shaders.

Ok, makes sense, but it's not just a multiply by 2:

// Point-light attenuation: quadratic falloff from the squared
// distance to the light (unity_LightAtten[0].z holds the falloff factor).
float lengthSq = dot(i.lightDir, i.lightDir);
float atten = 1.0 / (1.0 + lengthSq * unity_LightAtten[0].z);

Wow, someone brought a thread up from the dead!

Short answer to “why multiply by two?” - because in the EarlyDays, it was a cheap way to “fake” somewhat overbright light in fixed function shaders. And then it stuck, and it kind of dragged along.

We’d like to kill this concept. But that would mean possibly breaking a lot of existing shaders that all of you wrote (suddenly everything would become twice as bright). So yeah, a tough one… worth killing an old stupid concept or not?

worth:smile:

Yeah, I’d say go for it.

TheMetaMorph: That’s the code for calculating the attenuation of a point light. The atten variable already has the light’s attenuation in it, multiplied by the shadow value for that pixel.
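Spelled out (hypothetical sketch; falloff stands in for unity_LightAtten[0].z, and the real generated code inlines all of this):

```c
#include <assert.h>

/* atten as handed to the lighting function: distance attenuation
 * already multiplied by the shadow term for that pixel. */
float combined_atten(float length_sq, float falloff, float shadow)
{
    float dist_atten = 1.0f / (1.0f + length_sq * falloff);
    return dist_atten * shadow;
}
```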

You mean it would become half as bright, as I understand it (removing the *2 on atten)? Then do it, I hate this behaviour of overlit objects.
I don’t understand the concept anyway; you can set the light’s intensity to 2 and get the same result.

The situation only exists for shaders compiled for the forward rendering path, not the deferred lighting path, so…

Uh, no, deferred has it too. Otherwise the render paths would look massively different…

Really? But I can’t find any code related to a *2 on atten in the deferred-related code: not in Internal-PrepassLighting.shader, nor in the PrePassBase or PrePassFinal code generated by surface shaders.

For deferred lighting we multiply the *2 into the light color constant.
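Which keeps the two paths equivalent; conceptually (hypothetical sketch):

```c
#include <assert.h>

/* Forward applies the 2 per pixel; deferred folds it into the
 * light colour constant before upload. Same result either way. */
float forward_term(float light_color, float atten)
{
    return light_color * (atten * 2.0f);
}

float deferred_term(float light_color_constant, float atten)
{
    return light_color_constant * atten; /* constant = light_color * 2 */
}
```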

Yeah go for it please !

Farfarer : ok :slight_smile: .

For consistency I’d agree to kill it, but I fear this should have been done for a major Unity release (i.e. 4.0); doing so in a dot release may be a bit too annoying, unless it can be achieved easily, without overhead, via some compile directive at build time?

On a similar note, it’s always bugged me that on (I think it’s) GUI textures, either the RGB or the alpha, or maybe both, has its ‘normal’ level at (127,127,127,127); never did understand why.

It’s easy to fix in custom shaders, so yeah I’d say dooooooo itttttttt!

“Easy” is a very relative term. For someone who knows shaders, “everything too bright, remove the *2 in your lighting code” – they go, find the *2 and remove it, done. For someone who has no clue what a shader is, but grabbed some shaders from friends/the internet/the Asset Store, that might be more challenging.

Very similar reason, to be able to “overbright” the UI. Tint is put into a vertex color, which is clamped to 0…1 range. So by making 0.5 be the “neutral”, you can overbright it.
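A sketch of that tint (hypothetical names): with 0.5 as the neutral value, a vertex colour of 1.0 doubles the texture.

```c
#include <assert.h>

/* Vertex colours are clamped to 0..1, so 0.5 is treated as
 * "no tint" and the *2 allows up to 2x overbright. */
float gui_tint(float texel, float vertex_color)
{
    return texel * vertex_color * 2.0f;
}
```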