Rotate/UV transform a projection shader

I’m sorry if this has been asked before, but I’ve only been able to find similar threads thus far and nothing that gets me exactly what I need.

I need a shader that can both project onto a surface via a Unity projector and rotate UVs. The goal is a sort of radar ‘blip’ animation that gets projected onto my environment. Thus far I’ve had success either projecting the texture or rotating it, but not both (see comments in the code block below).

I’m very new to shader scripting and most of this is cobbled together from other shaders I’ve seen. I’m sure it could be optimized a bit.

Thanks in advance for any insights!

Shader "Custom/RadarBlip" {
    Properties {
        _Color ("Tint Color", Color) = (1,1,1,1)
        _Attenuation ("Falloff", Range(0.0, 1.0)) = 1.0
        _RotationTex ("Rotation Texture", 2D) = "gray" {}
        _RotationSpeed("Rotation Speed", Float) = 0.0
    }
    Subshader {
        Tags {"Queue"="Transparent"}

        Pass {
            ZWrite Off
            ColorMask RGB
            Blend SrcAlpha One
            Offset -1, -1

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            sampler2D _RotationTex;
            float _RotationSpeed;

            float4x4 unity_Projector;

            fixed4 _Color;
            float _Attenuation;

            struct appdata {
                float4 vertex : POSITION;
                float4 texcoord : TEXCOORD0;
            };
            struct v2f {
                float4 pos : SV_POSITION;
                float4 uvShadow : TEXCOORD0;
                float2 uv : TEXCOORD1;
            };

            v2f vert (appdata v) {

                float s = sin ( _RotationSpeed * _Time);
                float c = cos ( _RotationSpeed * _Time);
                float2x2 rotationMatrix = float2x2( c, -s, s, c);
                float offsetX = .5;
                float offsetY = .5;
                float x = v.texcoord.x - offsetX;
                float y = v.texcoord.y - offsetY;

                v2f o;
                o.uv = mul (float2(x, y), rotationMatrix ) + float2(offsetX, offsetY);
                o.pos = UnityObjectToClipPos (v.vertex);
                o.uvShadow = mul (unity_Projector, v.vertex);

                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                    
                // This returns a rotating texture, but it's not projected correctly -- it's giant and fills the entire surface of anything beneath it.
                return tex2D(_RotationTex, i.uv);

                // This returns a correct projection, but it doesn't rotate anymore.
                // I need to merge these two commands.

                fixed4 texCookie = tex2Dproj (_RotationTex, UNITY_PROJ_COORD(i.uvShadow));
                fixed4 outColor = _Color * texCookie.a;
                float depth = i.uvShadow.z;
                return outColor * clamp(1.0 - abs(depth) + _Attenuation, 0.0, 1.0);
            }
            ENDCG
        }
    }
}

Still stuck on this – does anyone have any ideas?

You need to rotate the projected UV. The easiest way to go about that would be to rotate the projected UV after the perspective divide.

Explaining why would take a while; if you’re curious, look up how coordinates are transformed from camera (view) space to homogeneous clip space to normalized device space. The short version is that tex2Dproj(_tex, i.uvShadow)* can be rewritten like this:

tex2D(_tex, i.uvShadow.xy / i.uvShadow.w);

That divide by w is the perspective divide, but the result of that divide in this case is the usual float2 texture UV you might be more used to. The easiest solution for you is to do the rotation in the fragment shader on that projected UV.

// _Time is a float4; .y is the unscaled time in seconds
float s = sin(_RotationSpeed * _Time.y);
float c = cos(_RotationSpeed * _Time.y);
float2x2 rotationMatrix = float2x2(c, -s, s, c);

float2 uv = i.uvShadow.xy / i.uvShadow.w;
uv = mul(uv - float2(0.5, 0.5), rotationMatrix) + float2(0.5, 0.5);
fixed4 col = tex2D(_RotationTex, uv);

And obviously remove the rotation code from the vertex shader.

  • Note: I skip the UNITY_PROJ_COORD here because it only does something when compiling the shader for the PS Vita.
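Putting the pieces together, the merged fragment shader might look something like this. This is just a sketch combining the rotated projected UV with the projection sampling and attenuation from the original post; I haven’t run it, so treat it as a starting point:

```hlsl
fixed4 frag (v2f i) : SV_Target {
    // Perspective divide: turns the projector coordinate into a plain 2D UV.
    float2 uv = i.uvShadow.xy / i.uvShadow.w;

    // Rotate that UV around the center of the projection.
    // _Time is a float4; .y is the unscaled time in seconds.
    float s = sin(_RotationSpeed * _Time.y);
    float c = cos(_RotationSpeed * _Time.y);
    float2x2 rotationMatrix = float2x2(c, -s, s, c);
    uv = mul(uv - float2(0.5, 0.5), rotationMatrix) + float2(0.5, 0.5);

    // Sample the cookie with the rotated UV and apply the original falloff.
    fixed4 texCookie = tex2D(_RotationTex, uv);
    fixed4 outColor = _Color * texCookie.a;
    float depth = i.uvShadow.z;
    return outColor * clamp(1.0 - abs(depth) + _Attenuation, 0.0, 1.0);
}
```

The uv and i.uv members of v2f are no longer needed once the rotation moves here, so they can be removed along with the vertex-shader rotation code.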

Thank you so much!! This is extremely helpful.

I’m really interested in learning this stuff but have had a hard time finding good API documentation out there. Something that explains basic HLSL data types and expected values for noobs like me. Anyways, thanks again for the help.

Microsoft has HLSL documentation on their site. Like all things on MSDN it’s slow and painful to navigate, but there’s a lot of good data.

Nvidia’s Cg documentation is nicer for getting a list of useful parts. Note that Unity uses HLSL and not Cg these days, regardless of Unity’s own documentation on the topic. For the most part they’re equivalent, but it’s not 100% identical. If something isn’t working for you when working from the Cg documentation, double check the HLSL documentation.

But that’s just the basics of the shader language itself. For ShaderLab, Unity’s own syntax for defining shader states, and the shader code, there are many resources out there, but Unity’s own documentation can get you started:

Otherwise I usually send people towards Alan Zucconi and CatLike Coding:

For the math of what I did above, that’s not really specific to HLSL, or rendering, that’s just matrix math.
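If it helps to see the pivot trick in isolation: rotating about a point other than the origin is just "translate the pivot to the origin, rotate, translate back." As a small helper function (the name RotateUV is mine, not from the thread):

```hlsl
// Rotate a 2D coordinate about an arbitrary pivot.
// Equivalent to: shift the pivot to the origin, apply the
// 2x2 rotation matrix, then shift back.
float2 RotateUV(float2 uv, float2 pivot, float angle)
{
    float s = sin(angle);
    float c = cos(angle);
    float2x2 rotation = float2x2(c, -s, s, c);
    return mul(uv - pivot, rotation) + pivot;
}
```

With a helper like that, the rotation line in the fragment shader would read uv = RotateUV(uv, float2(0.5, 0.5), _RotationSpeed * _Time.y);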
