Point FilterMode

Hi there,

I am working on a shader to create a post effect.
It should zoom or refract the screen in some way.
To achieve this I want to use a texture which contains the distortion information.
I thought that this way I can save GPU time, because the effect is precalculated into the texture.
Also, the effects can be “drawn” without coming up with math functions.

So I encode x distortion in the red channel and y in green.
I added a property to control the strength of the effect (the sampled values are simply multiplied by it).
But when I set the FilterMode of the distortion texture to Point, I get artifacts, especially when I increase the strength.
My projects are mostly 2D and pixel graphics, so in some cases I need the distortion with Point filtering.
For debugging purposes I focused only on an x stretch with a linear function.

Here are some images:
The distortion map
Result: no effect applied

Result: effect applied with distortion map FilterMode Bilinear

Result: effect applied with distortion map FilterMode Point

Debug result showing the color/UV taken from the distortion map with FilterMode Point (contrast increased to make the steps more visible)

The artifacts look like a sort of line dithering transitioning between the color steps.
I do not understand why these artifacts appear.

Here is the shader code:

Shader "Custom/Zoom"
{
    Properties
    {
        _TestTex ("Test (RGB)", 2D) = "white" {}
        _DistortionTex ("Distortion RG -> XY", 2D) = "black" {}
        _DistortionStrength ("Distortion strength", Range(0, 1)) = 0.01
        [Toggle(DISTORTION_FROM_TEXTURE)] _DistortionFromTexture ("Distortion from Texture", Float) = 0
        [Toggle(DEBUG_UV)]_DebugUV ("Debug UV", Float) = 0
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200

        Pass
        {
            Name "Zoom"
            HLSLPROGRAM
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/SurfaceInput.hlsl"
            #include "Packages/com.unity.render-pipelines.core/ShaderLibrary/Color.hlsl"

            #pragma vertex vert
            #pragma fragment frag

            #pragma shader_feature DISTORTION_FROM_TEXTURE
            #pragma shader_feature DEBUG_UV

            TEXTURE2D(_TestTex);
            SAMPLER(sampler_TestTex);
            TEXTURE2D(_DistortionTex);
            SAMPLER(sampler_DistortionTex);

            CBUFFER_START(UnityPerMaterial)
            float _DistortionStrength;
            CBUFFER_END

            struct Attributes
            {
                float4 positionOS       : POSITION;
                float2 uv               : TEXCOORD0;
            };

            struct Varyings
            {
                float2 uv        : TEXCOORD0;
                float4 vertex : SV_POSITION;
                UNITY_VERTEX_OUTPUT_STEREO
            };

            Varyings vert(Attributes input)
            {
                Varyings output = (Varyings)0;

                VertexPositionInputs vertexInput = GetVertexPositionInputs(input.positionOS.xyz);
                output.vertex = vertexInput.positionCS;
                output.uv = input.uv;

                return output;
            }

            // Procedural fallback: linear stretch along x, away from the screen center.
            float2 distortUV(float2 uv)
            {
                float2 delta = uv;
                delta.x -= 0.5; // center the offset around the middle of the screen
                delta.y = 0;    // debug: x distortion only

                return uv - delta * _DistortionStrength;
            }

            // Texture-driven distortion: red encodes x, green encodes y.
            float2 distortUV(float2 uv, float2 distortion)
            {
                float2 delta = distortion;
                delta.x -= 0.5; // remap the 0..1 sample to -0.5..0.5
                delta.y = 0;    // debug: x distortion only

                return uv - delta * _DistortionStrength;
            }

            half4 frag (Varyings input) : SV_Target
            {
                #ifdef DISTORTION_FROM_TEXTURE
                float4 distortionCol = SAMPLE_TEXTURE2D(_DistortionTex, sampler_DistortionTex, input.uv);
                float2 distortedUV = distortUV(input.uv, distortionCol.rg);
                #else
                float2 distortedUV = distortUV(input.uv);
                #endif

                #ifdef DEBUG_UV
                    half4 col = half4(distortedUV.r, 0, 0, 1);
                    col = col * 1.5; // boost contrast to make the quantization steps visible
                #else
                    half4 col = SAMPLE_TEXTURE2D(_TestTex, sampler_TestTex, distortedUV);
                #endif

                return col;
            }

            ENDHLSL
        }
    }
    FallBack "Diffuse"
}

What format is the texture?

I made sure that I do not have any compression on the distortion map.
I tried uncompressed versions in the form of a BMP (24-bit RGB) and a TIF (32-bit RGBA).
In the Unity import settings I selected None for compression.
Same result with both formats.

Both of those formats are 8 bits per channel, and will result in an 8 bit per channel uncompressed format once imported into Unity. 8 bits can hold an integer range between 0 and 255. That’s not enough precision to be used directly as texture UVs, unless you’re only sampling a texture that’s 256x256 pixels.
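
To make that concrete, here is a rough sketch of what the 8 bit round trip does to a stored value (storedUV is just an illustrative name, and the 1920 pixel target width is my example, not a number from this thread):

// 8 bits quantize the stored value to multiples of 1/255.
// Used directly as a UV on a 1920 px wide target, each step is
// 1920 / 255 ≈ 7.5 pixels, so with Point filtering the distorted
// UV jumps in ~7.5 px increments instead of moving smoothly.
// (Bilinear filtering interpolates between the steps, which is
// why the artifacts mostly disappear in that mode.)
float2 quantizedUV = floor(storedUV * 255.0 + 0.5) / 255.0;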

You need at least a 16 bit per channel image for this to work the way you want. A better alternative, though, is to use the texture to record the offset from the original UV rather than the actual UV itself, as sketched below. That way, as long as your distortion isn’t doing more than +/- 126 pixels of distortion, there’s enough precision in an 8 bit per channel image.
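
A minimal sketch of that offset-based decode, assuming red/green store a signed per-pixel offset remapped into the 0–1 range (_TargetTexelSize is a placeholder for whatever texel size your render target provides; it is not part of the shader above):

// Hypothetical decode for an offset texture: 0.5 means no offset,
// so 8 bits give roughly +/- 127 one-pixel steps.
float2 encoded = SAMPLE_TEXTURE2D(_DistortionTex, sampler_DistortionTex, uv).rg;
float2 offsetPixels = (encoded - 0.5) * 255.0;                 // back to signed pixels
float2 distortedUV  = uv + offsetPixels * _TargetTexelSize.xy; // pixels -> UV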

If you still want to store the actual UVs in an image, or need more than 126 pixels of distortion, the only file formats you can use are EXR or HDR. Be warned though: Unity applies gamma correction to these formats if your project is not using linear color space. You do not want that gamma correction for this use case, and it cannot be disabled. So you’d either have to import the image manually via script and decode the file yourself, or generate the UV texture in Unity using an RGHalf or RGFloat format and save it as an asset (or regenerate it every time you start the program).

However, really, doing the math in the shader is probably faster than using a dependent texture read (a texture sample that relies on the results of another texture sample).
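
For comparison, a fully analytic version in the spirit of the distortUV(uv) overload already in the shader, extended to both axes (a sketch, not a drop-in replacement):

// Analytic radial zoom: push every UV toward the screen center.
// No texture fetch involved, so the screen sample never depends
// on the result of an earlier sample.
float2 zoomUV(float2 uv, float strength)
{
    float2 delta = uv - 0.5;      // vector from the screen center
    return uv - delta * strength; // strength > 0 zooms in
}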
