[Question] How do you make textures animated in shader code?


I found this example:

Shader "Mattatz/TextureAnimation"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _Color ("Color", Color) = (1, 1, 1, 1)

        _Cols ("Cols Count", Int) = 5
        _Rows ("Rows Count", Int) = 3
        _Frame ("Per Frame Length", Float) = 0.5
    }

    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
           
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;

            fixed4 _Color;

            uint _Cols;
            uint _Rows;

            float _Frame;
           
            // Sample one "shot" (cell) of the sprite sheet: dx/dy are the size of a
            // cell in UV space and frame selects which column/row to offset into.
            fixed4 shot (sampler2D tex, float2 uv, float dx, float dy, int frame) {
                return tex2D(tex, float2(
                    (uv.x * dx) + fmod(frame, _Cols) * dx,       // column = frame % _Cols
                    1.0 - ((uv.y * dy) + (frame / _Cols) * dy)   // row = frame / _Cols, flipped so frame 0 is the top-left cell
                ));
            }
           
            v2f vert (appdata v) {
                v2f o;
                o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }
           
            fixed4 frag (v2f i) : SV_Target {
                int frames = _Rows * _Cols;                    // total frames in the sheet
                float frame = fmod(_Time.y / _Frame, frames);  // advance one frame every _Frame seconds, wrapping around
                int current = floor(frame);
                float dx = 1.0 / _Cols;                        // cell width in UV space
                float dy = 1.0 / _Rows;                        // cell height in UV space

                // not lerping to next frame
                // return shot(_MainTex, i.uv, dx, dy, current) * _Color;

                // cross-fade between the current frame and the next one
                int next = floor(fmod(frame + 1, frames));
                return lerp(shot(_MainTex, i.uv, dx, dy, current), shot(_MainTex, i.uv, dx, dy, next), frame - current) * _Color;
            }

            ENDCG
        }
    }
}

I am by no means a code expert, but from what I gather this takes variables and adds them to the shader as properties. It still takes a single main texture sheet, but has functions which are called later by code (I'm guessing a script placed on the same component as this material/shader) to animate the images cut out of the sheet, using the

v2f vert (appdata v) {
    v2f o;
    o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    return o;
}

fixed4 frag (v2f i) : SV_Target {
    int frames = _Rows * _Cols;
    float frame = fmod(_Time.y / _Frame, frames);
    int current = floor(frame);
    float dx = 1.0 / _Cols;
    float dy = 1.0 / _Rows;

    // not lerping to next frame
    // return shot(_MainTex, i.uv, dx, dy, current) * _Color;

    int next = floor(fmod(frame + 1, frames));
    return lerp(shot(_MainTex, i.uv, dx, dy, current), shot(_MainTex, i.uv, dx, dy, next), frame - current) * _Color;
}

functions?

I'm not 100% on all of this, but I am wondering if this is the normal way. Apparently this shader uses structs with UV information to sample from _MainTex, while calling functions which determine what part of the texture to read a "shot" from during rendering. I am trying to study it more so I can learn it all.
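
If I try to walk through the math with the default values (these numbers are my own, just to check my understanding), I think it works out something like this:

// my own worked example, assuming the defaults: _Cols = 5, _Rows = 3, _Frame = 0.5
// that gives 15 frames total, advancing one frame every 0.5 seconds
// at _Time.y = 3.7 seconds:
float frame = fmod(3.7 / 0.5, 15);   // 7.4
int current = floor(frame);          // frame 7
float dx = 1.0 / 5;                  // 0.2   (each cell is 1/5 of the sheet wide)
float dy = 1.0 / 3;                  // 0.333 (each cell is 1/3 of the sheet tall)
// inside shot(): column = fmod(7, 5) = 2, row = 7 / 5 = 1 (integer division)
// so the UV is offset to the 3rd column, 2nd row of the sheet, counted from the top-left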

Does anyone know where in the documentation you can learn how to animate textures in your shaders? I didn't see any examples, and I am basically trying to make animated texture materials within my multi-layered shaders for special effects and other things (also for testing purposes).

The main idea I keep coming across in my research is that whatever information is in the shader code to allow the textures to animate will be called on later via a component script, which controls the rendering on the object that uses that specific shader/material. Is that correct?

Thank you for your time to anyone who read this, and have a great day.

I would suggest you go through some beginner tutorials or YouTube videos on shaders. They'll answer most of your questions, or at least get you thinking about the individual parts of the shader better.

These two are good places to start.

The short, high-level explanation of the above shader is that it takes _Time.y, a global shader value set by Unity that is equivalent to C#'s Time.timeSinceLevelLoad, and uses a bit of math to convert that into a frame number. It then uses a bit more math to calculate a UV offset to select that frame from a "flipbook" texture atlas.
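
Stripped down to just that idea, the core of the math looks roughly like this (a rough sketch reusing the property names from the shader above, not a drop-in replacement for it):

int frames = _Rows * _Cols;                      // total cells in the flipbook atlas
float frame = fmod(_Time.y / _Frame, frames);    // advance one frame every _Frame seconds, wrapping around
int current = floor(frame);                      // whole frame index
float2 cell = float2(1.0 / _Cols, 1.0 / _Rows);  // size of one cell in UV space
float2 offset = float2(fmod(current, _Cols), current / _Cols) * cell;  // column/row of the current frame
float2 flipUV = float2(i.uv.x * cell.x + offset.x,
                       1.0 - (i.uv.y * cell.y + offset.y));            // flipped so frame 0 is the top-left cell
fixed4 col = tex2D(_MainTex, flipUV) * _Color;

Note that nothing in C# has to call these functions. Unity updates _Time every frame on its own, so the material animates by itself once the shader is assigned to it.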


Thanks! Great video!