Unity 5 Beta 13 Procedural Skybox Shader

Hi there,

Does anybody know how we can change the color of the sky with the new procedural Skybox Shader?

The new scattering effect is cool, but now we are stuck with a plain blue sky. I looked at the shader code but could not find any color value in it other than “front color”, which is set in a for loop I don’t understand.

What if we want to create a new planet that has a red sky? Is it possible with the new shader? The one from Beta 9 supported custom colors.

You can modify the wavelength values to change the sky color:

        // RGB wavelengths
        #define WR 0.68
        #define WG 0.55
        #define WB 0.44
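For example, to push the sky toward red you could invert the relationship so the red channel scatters the most. These values are untested guesses that just swap the roles of the channels, not tuned numbers:

```hlsl
        // RGB wavelengths for a reddish sky (untested guess:
        // smallest value scatters most, so red now dominates)
        #define WR 0.44
        #define WG 0.55
        #define WB 0.68
```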

Where do you put the defines? The Beta 13 sky is a lot darker than the Beta 12 sky; I’d like to lighten it up to look closer to Beta 12.

I’ve noticed too that Beta 13 changed the procedural sky material. The new one feels rather limited, unfortunately.

You can still change the material for the whole scene, though. In the Lighting window, go to the Scene tab and, under Environment Lighting, select Gradient. This is more like the previous sky material, but set directly inside the Lighting window.

Thanks, I saw that earlier. That doesn’t affect the brightness of the actual sky, though. Beta 12 had a really brilliant, bright sky which looked great to me. Now it’s just too dark up high, almost like twilight.

The actual brightness of the sky is set below that with the “Ambient Intensity” slider.
It’s also a good idea to have a directional light in the scene, which places the sun in the sky.

All those settings do is change the lighting of the scenery; they don’t change the blue sky color at all, as far as I can see.

It seems you are correct. Please file a bug report if you think this is the wrong behaviour.
I agree that this is not optimal at the moment.

Download the built-in shaders and find the shader Skybox-Procedural. Copy it into your project, edit it, and change the name (inside the shader, not just the file, though you can rename that too), removing the Procedural/ prefix at the front (not sure why, but it wouldn’t show up in the list if I left it).
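For instance, the first line of the copied shader might end up like this (“Skybox/MyProcedural” is just an arbitrary example name, not anything official):

```hlsl
// First line of the copied Skybox-Procedural shader, renamed
// ("Skybox/MyProcedural" is an arbitrary example name)
Shader "Skybox/MyProcedural" {
```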

Make a new material, apply your edited shader to it, set that material as a skybox.

Now in the shader file, you can tweak all the defines (including the ones Luckymouse mentioned) and put in values that you like.

These are interesting too:

#define kRAYLEIGH 0.0025 // Rayleigh constant
#define kMIE 0.0010 // Mie constant
#define kSUN_BRIGHTNESS 20.0 // Sun brightness
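The wavelength defines feed a 1/λ⁴ term (kInvWavelength in the shader), which is the Rayleigh scattering falloff: shorter wavelengths scatter more strongly, and that is what makes the default sky blue. A quick sketch of the effect in plain Python, using the same values as the defines above, just to illustrate the ratio:

```python
# Rayleigh scattering strength is proportional to 1 / wavelength^4,
# which is what the shader's kInvWavelength encodes.
def inv_wavelength4(w):
    return 1.0 / (w ** 4)

# RGB wavelengths as in the shader defines (WR, WG, WB)
red, green, blue = 0.68, 0.55, 0.44

scatter = {c: inv_wavelength4(w) for c, w in
           [("R", red), ("G", green), ("B", blue)]}

# Blue scatters several times more strongly than red -> blue sky.
ratio = scatter["B"] / scatter["R"]
print(round(ratio, 2))  # roughly 5.7
```

Swapping the values so red gets the shortest “wavelength” inverts the ratio, which is the trick behind recoloring the sky via the defines.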

I think the new procedural sky is pretty great. The implementation is kinda basic, but it looks OK, you can edit the shader to improve it, and with the new skybox model the gradient is done per vertex, which makes it a lot faster than a per-pixel skybox.


This might help

Shader "Skybox/ProceduralUpgrade" {
Properties {
    _HdrExposure("HDR Exposure", Float) = 1.3
    _GroundColor ("Ground", Color) = (.369, .349, .341, 1)
    _RL("Rayleigh", Float) = 0.0025
    _MIE ("MIE", Float) = 0.0010
    _SUN("Sun brightness", Float) = 20.0

}

SubShader {
    Tags { "Queue"="Background" "RenderType"="Background" "PreviewType"="Skybox" }
    Cull Off ZWrite Off

    Pass {
       
        CGPROGRAM
        #pragma vertex vert
        #pragma fragment frag

        #include "UnityCG.cginc"
        #include "Lighting.cginc"


        uniform half _HdrExposure, _RL, _MIE, _SUN;   // HDR exposure, Rayleigh, Mie, sun brightness
        uniform half3 _GroundColor;


        // RGB wavelengths
        #define WR 0.65
        #define WG 0.57
        #define WB 0.475
        static const float3 kInvWavelength = float3(1.0 / (WR*WR*WR*WR), 1.0 / (WG*WG*WG*WG), 1.0 / (WB*WB*WB*WB));
        #define OUTER_RADIUS 1.025
        static const float kOuterRadius = OUTER_RADIUS;
        static const float kOuterRadius2 = OUTER_RADIUS*OUTER_RADIUS;
        static const float kInnerRadius = 1.0;
        static const float kInnerRadius2 = 1.0;

        static const float kCameraHeight = 0.0001;

        //#define kRAYLEIGH 0.0025        // Rayleigh constant
        //#define kMIE 0.0010              // Mie constant
        //#define kSUN_BRIGHTNESS 20.0     // Sun brightness
        #define kRAYLEIGH _RL        // Rayleigh constant
        #define kMIE _MIE             // Mie constant
        #define kSUN_BRIGHTNESS _SUN     // Sun brightness

        static const float kKrESun = kRAYLEIGH * kSUN_BRIGHTNESS;
        static const float kKmESun = kMIE * kSUN_BRIGHTNESS;
        static const float kKr4PI = kRAYLEIGH * 4.0 * 3.14159265;
        static const float kKm4PI = kMIE * 4.0 * 3.14159265;
        static const float kScale = 1.0 / (OUTER_RADIUS - 1.0);
        static const float kScaleDepth = 0.25;
        static const float kScaleOverScaleDepth = (1.0 / (OUTER_RADIUS - 1.0)) / 0.25;
        static const float kSamples = 2.0; // THIS IS UNROLLED MANUALLY, DON'T TOUCH

        #define MIE_G (-0.990)
        #define MIE_G2 0.9801


        struct appdata_t {
            float4 vertex : POSITION;
        };

        struct v2f {
                float4 pos : SV_POSITION;
                half3 rayDir : TEXCOORD0;    // Vector for incoming ray, normalized ( == -eyeRay )
                half3 cIn : TEXCOORD1;         // In-scatter coefficient
                half3 cOut : TEXCOORD2;        // Out-scatter coefficient
           };

        float scale(float inCos)
        {
            float x = 1.0 - inCos;
            return 0.25 * exp(-0.00287 + x*(0.459 + x*(3.83 + x*(-6.80 + x*5.25))));
        }

        v2f vert (appdata_t v)
        {
            v2f OUT;
            OUT.pos = mul(UNITY_MATRIX_MVP, v.vertex);

            float3 cameraPos = float3(0,kInnerRadius + kCameraHeight,0);     // The camera's current position
       
            // Get the ray from the camera to the vertex and its length (which is the far point of the ray passing through the atmosphere)
            float3 eyeRay = normalize(mul((float3x3)_Object2World, v.vertex.xyz));

            OUT.rayDir = half3(-eyeRay);

            float far = 0.0;
            if(eyeRay.y >= 0.0)
            {
                // Sky
                // Calculate the length of the "atmosphere"
                far = sqrt(kOuterRadius2 + kInnerRadius2 * eyeRay.y * eyeRay.y - kInnerRadius2) - kInnerRadius * eyeRay.y;

                float3 pos = cameraPos + far * eyeRay;
               
                // Calculate the ray's starting position, then calculate its scattering offset
                float height = kInnerRadius + kCameraHeight;
                float depth = exp(kScaleOverScaleDepth * (-kCameraHeight));
                float startAngle = dot(eyeRay, cameraPos) / height;
                float startOffset = depth*scale(startAngle);
               
           
                // Initialize the scattering loop variables
                float sampleLength = far / kSamples;
                float scaledLength = sampleLength * kScale;
                float3 sampleRay = eyeRay * sampleLength;
                float3 samplePoint = cameraPos + sampleRay * 0.5;

                // Now loop through the sample rays
                float3 frontColor = float3(0.0, 0.0, 0.0);
                // WTF BBQ: WP8 and desktop FL_9_1 do not like the for loop here
                // (but an almost identical loop is perfectly fine in the ground calculations below)
                // Just unrolling this manually seems to make everything fine again.
//                for(int i=0; i<int(kSamples); i++)
                {
                    float height = length(samplePoint);
                    float depth = exp(kScaleOverScaleDepth * (kInnerRadius - height));
                    float lightAngle = dot(_WorldSpaceLightPos0.xyz, samplePoint) / height;
                    float cameraAngle = dot(eyeRay, samplePoint) / height;
                    float scatter = (startOffset + depth*(scale(lightAngle) - scale(cameraAngle)));
                    float3 attenuate = exp(-scatter * (kInvWavelength * kKr4PI + kKm4PI));

                    frontColor += attenuate * (depth * scaledLength);
                    samplePoint += sampleRay;
                }
                {
                    float height = length(samplePoint);
                    float depth = exp(kScaleOverScaleDepth * (kInnerRadius - height));
                    float lightAngle = dot(_WorldSpaceLightPos0.xyz, samplePoint) / height;
                    float cameraAngle = dot(eyeRay, samplePoint) / height;
                    float scatter = (startOffset + depth*(scale(lightAngle) - scale(cameraAngle)));
                    float3 attenuate = exp(-scatter * (kInvWavelength * kKr4PI + kKm4PI));

                    frontColor += attenuate * (depth * scaledLength);
                    samplePoint += sampleRay;
                }



                // Finally, scale the Mie and Rayleigh colors and set up the varying variables for the pixel shader
                OUT.cIn.xyz = frontColor * (kInvWavelength * kKrESun);
                OUT.cOut = frontColor * kKmESun;
            }
            else
            {
                // Ground
                far = (-kCameraHeight) / (min(-0.00001, eyeRay.y));

                float3 pos = cameraPos + far * eyeRay;

                // Calculate the ray's starting position, then calculate its scattering offset
                float depth = exp((-kCameraHeight) * (1.0/kScaleDepth));
                float cameraAngle = dot(-eyeRay, pos);
                float lightAngle = dot(_WorldSpaceLightPos0.xyz, pos);
                float cameraScale = scale(cameraAngle);
                float lightScale = scale(lightAngle);
                float cameraOffset = depth*cameraScale;
                float temp = (lightScale + cameraScale);
               
                // Initialize the scattering loop variables
                float sampleLength = far / kSamples;
                float scaledLength = sampleLength * kScale;
                float3 sampleRay = eyeRay * sampleLength;
                float3 samplePoint = cameraPos + sampleRay * 0.5;
               
                // Now loop through the sample rays
                float3 frontColor = float3(0.0, 0.0, 0.0);
                float3 attenuate;
                for(int i=0; i<int(kSamples); i++)
                {
                    float height = length(samplePoint);
                    float depth = exp(kScaleOverScaleDepth * (kInnerRadius - height));
                    float scatter = depth*temp - cameraOffset;
                    attenuate = exp(-scatter * (kInvWavelength * kKr4PI + kKm4PI));
                    frontColor += attenuate * (depth * scaledLength);
                    samplePoint += sampleRay;
                }
           
                OUT.cIn.xyz = frontColor * (kInvWavelength * kKrESun + kKmESun);
                OUT.cOut.xyz = clamp(attenuate, 0.0, 1.0);
            }


            return OUT;

        }


        // Calculates the Mie phase function
        half getMiePhase(half eyeCos, half eyeCos2)
        {
            half temp = 1.0 + MIE_G2 - 2.0 * MIE_G * eyeCos;
            // A somewhat rough approx for :
            // temp = pow(temp, 1.5);
            temp = smoothstep(0.0, 0.01, temp) * temp;
            temp = max(temp,1.0e-4); // prevent division by zero, esp. in half precision
            return 1.5 * ((1.0 - MIE_G2) / (2.0 + MIE_G2)) * (1.0 + eyeCos2) / temp;
        }

        // Calculates the Rayleigh phase function
        half getRayleighPhase(half eyeCos2)
        {
            return 0.75 + 0.75*eyeCos2;
        }

        half4 frag (v2f IN) : SV_Target
        {
            half3 col;
            if(IN.rayDir.y < 0.0)
            {
                half eyeCos = dot(_WorldSpaceLightPos0.xyz, normalize(IN.rayDir.xyz));
                half eyeCos2 = eyeCos * eyeCos;
                col = getRayleighPhase(eyeCos2) * IN.cIn.xyz + getMiePhase(eyeCos, eyeCos2) * IN.cOut * _LightColor0.xyz;
            }
            else
            {
                col = IN.cIn.xyz + _GroundColor * IN.cOut;
            }
            //Adjust color from HDR
            col *= _HdrExposure;
            return half4(col,1.0);

        }
        ENDCG
    }
}    


Fallback Off

}

Thanks a lot, guys. I’ll give that a try when I get a chance.

FWIW, I much preferred the procedural skybox shader when all the colours (sky/land/horizon/sun) were easily definable; for game jam settings it makes it significantly easier to throw in a random sky. It’s gone from being wonderful to being, I guess, not so wonderful. For me it’s now less flexible than just using an old-style preset skybox and colouring the material. (I used the colourable one for several short projects in the time it was included.)

Thanks for the help, guys. One question:

I got to this part, but am hung up on the last step: set that material as a skybox. How do I do that exactly? I have created the material and assigned it the “Skybox” shader I created as instructed, but I don’t see how to set that material as the skybox. If I fiddle with the HDR exposure setting I can see it change in the inspector, but it’s not actually carrying over to the game itself. It’s like it’s still using the default skybox and ignoring this new one.

Oh, nevermind, I just had to drag it onto the sky in the preview window. Thanks again for the help, guys. Much appreciated. :slight_smile:

For those interested: It seems like Beta 15 added support for a Tint color as well as an Atmospheric density value to the procedural Skybox. That was fast.


Beta 16 will apparently have more control again.
One of my bugtracker tickets got an update with a note from the QA team that b16 will have this resolved.

Thank you, Unity. :slight_smile:


yaaaay - gonna wait for the stable release, but yaaaay