In my custom CG code, how can I get the scene’s fog color (the one set in Render Settings) and a grayscale fog mask, calculated according to the selected “fog mode”?
I’m writing an additive-multiply shader. So far, I haven’t found a way to make ShaderLab’s native fog (the one specified via the Fog statement) work correctly.
So I’d like to disable it completely and apply fog manually in my CG code.
AFAIK there are no built-in shader variables that expose the fog color the way the ambient color is exposed; you have to feed it to the shader yourself using SetColor. Even the GlobalFog screen effect from the Pro packages does this.
You can read the color in a script via RenderSettings.fogColor.
EDIT: WRONG! Read the comment below.
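For completeness, the shader side of that manual route would look like this (a sketch; _SceneFogColor is a made-up uniform name, not a built-in):

// Declare your own uniform and feed it from a script whenever
// the fog settings change, e.g.:
//   myMaterial.SetColor("_SceneFogColor", RenderSettings.fogColor);
uniform fixed4 _SceneFogColor;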
It’s only mentioned in passing, but it’s in the documentation:
uniform half4 unity_FogColor;
Thanks guys.
There are also unity_FogStart and unity_FogEnd in the same example. Is there any explanation of these two variables? Why are they half4? The “start” and “end” values should be simple scalars, shouldn’t they?
There’s no mention of these variables in any of the default .cginc files.
I think that’s the only place they’re mentioned in the manual. My guess is that they are actually scalar values, which is why only the first component gets used in that example. You could test the other ones yourself pretty easily.
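If you want to check, a quick way is to dump the extra components to the screen from a minimal unlit shader (a sketch, untested):

uniform half4 unity_FogStart;
// Maps the .y/.z/.w components to RGB so you can see
// whether they actually hold anything.
fixed4 frag (v2f i) : COLOR
{
    return fixed4(unity_FogStart.yzw, 1.0);
}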
Finally, I got time to test these variables.
I found the definitions (formulas) of the fog modes, and successfully guessed the variable that stores the “Density” value for the “Exponent” modes.
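For reference, these are the standard fixed-function fog factors (f = 1 means unfogged, f = 0 means full fog; z is the distance from the camera):

Linear: f = (end - z) / (end - start)
Exp:    f = 1 / exp(z * density)
Exp2:   f = 1 / exp((z * density)^2)

The final color is then lerp(fogColor, fragmentColor, saturate(f)).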
But as soon as I wrote the actual code, the shader stopped working. For some reason, it fails to compile if I use HLSL’s exp() function. Could anyone help, please?
I did it!
In case someone faces the same task in the future, here’s the code of a shader implementing Exp2 fog:
Shader "DRL/testFog-My" {
Properties {
_MainTex ("Base (RGB)", 2D) = "black" {}
}
SubShader {
Tags { "RenderType" = "Opaque" }
Cull off
Lighting Off
Blend SrcAlpha OneMinusSrcAlpha
Fog {Mode Off}
Pass {
CGPROGRAM
#pragma target 2.0
#pragma vertex vert
#pragma fragment frag
//#pragma exclude_renderers d3d11 xbox360 ps3 flash d3d11_9x
#include "UnityCG.cginc"
uniform fixed3 unity_FogColor;
uniform half unity_FogDensity;
sampler2D _MainTex;
struct vertexInput {
float4 vertex : POSITION;
float2 texcoord : TEXCOORD0;
};
struct v2f {
float4 scrPos : SV_POSITION;
half2 srcUVs: TEXCOORD0;
half2 fogDepth: TEXCOORD1; // linear depth in x and depth multiplied by density in y
};
v2f vert (vertexInput v)
{
v2f o;
o.scrPos = mul(UNITY_MATRIX_MVP, v.vertex);
//o.srcUVs = TRANSFORM_TEX(v.texcoord, _MainTex);
o.srcUVs = v.texcoord;
o.fogDepth.x = length(mul (UNITY_MATRIX_MV, v.vertex).xyz);
o.fogDepth.y = o.fogDepth.x * unity_FogDensity;
return o;
}
fixed4 frag (v2f i) : COLOR
{
fixed3 clr = tex2D(_MainTex, i.srcUVs).rgb;
// Exp2 mode:
float fogAmt = i.fogDepth.y * i.fogDepth.y;
fogAmt = exp(-fogAmt);
clr = lerp(unity_FogColor, clr, fogAmt);
return fixed4(
clr,
1.0
);
}
ENDCG
}
}
}
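The other fog modes only differ in how fogAmt is computed, so replacing the Exp2 block in frag should be enough (a sketch; the Linear variant assumes the distances sit in the .x components of half4 unity_FogStart / unity_FogEnd, which I haven’t verified):

// Exp mode: fogAmt = 1 / exp(distance * density)
float fogAmt = exp(-i.fogDepth.y);

// Linear mode: fogAmt = (end - distance) / (end - start)
// (needs: uniform half4 unity_FogStart; uniform half4 unity_FogEnd;)
//float fogAmt = saturate((unity_FogEnd.x - i.fogDepth.x)
//                      / (unity_FogEnd.x - unity_FogStart.x));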
I need to mention that unity_FogColor is declared here as fixed3, not fixed4. This is done on purpose, to make it slightly more efficient, since this shader is opaque. Keep that in mind if you’re going to use it in a transparent shader.
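For a transparent variant, a minimal sketch (assuming you simply want the texture’s alpha to pass through to the SrcAlpha blend):

uniform fixed4 unity_FogColor; // full fixed4 this time

fixed4 frag (v2f i) : COLOR
{
    fixed4 clr = tex2D(_MainTex, i.srcUVs);
    float fogAmt = exp(-i.fogDepth.y * i.fogDepth.y);
    clr.rgb = lerp(unity_FogColor.rgb, clr.rgb, fogAmt);
    return clr; // alpha untouched, used by the blend
}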
Also, the trouble mentioned above is still there: for some reason, the shader fails to compile if I replace the last return statement with this:
return fixed4(
    fogAmt,
    fogAmt,
    fogAmt,
    1.0
);
So the “debug code” (visualising the fog mask itself) won’t work. But the actual code works, and that’s fine for me.
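If you do need the debug view, one workaround that might dodge the issue (an untested guess) is to keep the return statement untouched and overwrite clr instead:

// replace the lerp() line with:
clr = fixed3(fogAmt, fogAmt, fogAmt); // grayscale mask; white = unfogged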