Sampling a (Render)Texture with offset UVs produces artifacts

Hi,

I recently upgraded a project from the built-in render pipeline to URP and noticed artifacts from one shader after manually upgrading it. The shader performs a Bayer-pattern conversion (R, G, and B are all encoded into the red channel according to a specific spatial distribution) by blitting from RenderTexture.active to another RenderTexture.
The artifacts I’m seeing are dark pixels, but only on the diagonal running from the lower-left to the upper-right corner of the destination texture. Through much trial and error, I’ve found that the problem comes from a combination of three things:

  • The RenderTexture used as input (when I apply the material to a quad and use a normal imported texture, I never get the artifacts)
  • The fact that the input texture is sampled with single-pixel offsets like so:
// _ScreenParams.z/.w contain 1.0 + 1.0 / <screen width/height>
                half dx = _ScreenParams.z - 1.0;
                half dy = _ScreenParams.w - 1.0;

                half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv);

                col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(dx, 0)).rgb;
                col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(-dx, 0)).rgb;
                col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(0, dy)).rgb;
                col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(0, -dy)).rgb;
                col = col / 5.0;

Specifically, if I only do col.rgb = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(dx, 0)); I don’t see artifacts; if I do col.rgb = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(-dx, 0)); (minus dx), the artifacts appear.

  • Either col.r, col.g, or col.b is assigned to the red channel of the result color based on the UV position; the artifacts only appear when the green or blue channel of col is used, never with the red channel.
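(For reference, the same one-texel offsets can also be derived from the input texture’s own texel size rather than _ScreenParams. Here’s a minimal sketch, assuming Unity’s standard <TextureName>_TexelSize convention and a hypothetical SampleCross helper; I haven’t verified whether it makes any difference for these artifacts:)

// Declared next to the texture; Unity fills <TextureName>_TexelSize
// with (1/width, 1/height, width, height).
float4 _BaseMap_TexelSize;

half4 SampleCross(float2 uv)
{
    float dx = _BaseMap_TexelSize.x; // one texel horizontally
    float dy = _BaseMap_TexelSize.y; // one texel vertically

    half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv);
    col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv + float2( dx, 0)).rgb;
    col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv + float2(-dx, 0)).rgb;
    col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv + float2(0,  dy)).rgb;
    col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, uv + float2(0, -dy)).rgb;
    return col / 5.0;
}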

Here’s a stripped-down version of the shader that still shows the artifacts, followed by the script I use to blit with it:

Shader "Custom/TestShader"
{
    Properties
    {
        _BaseMap("Texture", 2D) = "black" {}
    }

    SubShader
    {
        Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline" }
        LOD 100
        ZWrite Off ZTest Always Blend Off Cull Off

        Pass
        {
            HLSLPROGRAM
            #pragma vertex vert
            #pragma fragment frag
           
            #include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            v2f vert(appdata v)
            {
                v2f o;
                o.vertex = TransformObjectToHClip(v.vertex.xyz);
                o.uv = v.uv;
                return o;
            }
           
            TEXTURE2D(_BaseMap);
            SAMPLER(sampler_BaseMap);

            half4 frag(v2f i) : SV_Target
            {
                half dx = _ScreenParams.z - 1.0;

                half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(-dx, 0));

                half4 ret = {0.0, 0.0, 0.0, 1.0};
                ret.r = col.g;
                return ret;
            }
            ENDHLSL
        }
    }
}
using UnityEngine;
using UnityEngine.Rendering;

public class TestBlitScript : MonoBehaviour
{
    private Camera _camera;
    private Material _mat;
    public Shader shader;

    public void Start()
    {
        _camera = GetComponent<Camera>();
        _mat = new Material(shader);
        RenderPipelineManager.endCameraRendering += TestBlit;
    }

    private void TestBlit(ScriptableRenderContext context, Camera camera) {
        if (camera != _camera) return; // endCameraRendering is called for all cameras, only proceed for the one on this game object
        _mat.SetTexture(Shader.PropertyToID("_BaseMap"), RenderTexture.active);
        Graphics.Blit(RenderTexture.active, camera.targetTexture, _mat);
    }

    public void OnDestroy()
    {
        RenderPipelineManager.endCameraRendering -= TestBlit;
    }
}

And finally, here’s what I get with an unlit white plane in front of the camera and the shader and script from above, along with the settings for the destination RenderTexture (the one set as camera.targetTexture):


Any ideas on how to get rid of that dark diagonal line?
Thanks!

Hi!
Try this:

float dx = _ScreenParams.z - 1.0;
half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + float2(-dx, 0));

Thanks for the suggestion!
However, the artifacts are still there. I also tried changing all the colors in the fragment shader to float4s; no luck either.

You should keep the UVs as floats anyway :slight_smile:
Which graphics API are you running on?

Good point of course, thanks.
OpenGL Core; the target platform is Linux.

In this case half and float are the same: on desktop platforms the shader compiler treats half as a full-precision float, so the precision of the offsets shouldn’t be the issue.

Are you sure the data in the texture is fine?

:smile: I’m not sure of most things anymore.
However I tested a few other things, thinking maybe the auto-update to URP might have messed something up. So I created a new scene and a new RenderTexture and hooked them up the same way which gets me… the same result. Interestingly, I also had the camera render to the RenderTexture without any blitting and used the texture in a material with the problematic shader on a quad and don’t get the artifacts then. So the problem seems to only occur when blitting.

Odd.
It sounds worth a bug report to me :slight_smile:

I made a bug report that day (May 20); in the meantime, I also tried a few other Unity versions I have installed, plus the latest 2021 LTS. I get the same artifacts in 2019.4.24f1, 2020.3.31f1, 2021.3.1f1, and 2021.3.4f1.