Hi,
I recently upgraded a project from the built-in render pipeline to URP and noticed artifacts from one shader after manually upgrading it. The shader performs a Bayer pattern conversion (R, G and B are all encoded into the red channel according to a specific spatial distribution) by blitting from RenderTexture.active to another RenderTexture.
The artifacts I’m seeing are dark pixels only along the diagonal from the lower-left to the upper-right corner of the destination texture. Through much trial and error, I’ve found that the problem comes from a combination of three things:
- The RenderTexture used as input (when I apply the material to a quad and use a normal image texture instead, I never get the artifacts)
- The fact that the input texture is sampled with single-pixel offsets like so:
// _ScreenParams.z/.w contain 1.0 + 1.0 / <screen width/height>,
// so dx/dy are exactly one pixel in UV space (1/width, 1/height)
half dx = _ScreenParams.z - 1.0;
half dy = _ScreenParams.w - 1.0;
half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv);
col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(dx, 0)).rgb;
col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(-dx, 0)).rgb;
col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(0, dy)).rgb;
col.rgb += SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(0, -dy)).rgb;
col = col / 5.0;
Specifically, if I only do col.rgb = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(dx, 0)).rgb;
I don’t see artifacts; if I do col.rgb = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(-dx, 0)).rgb;
(minus dx), the artifacts appear.
- Either col.r, col.g or col.b is assigned to the red channel of the result color, depending on the uv position; the artifacts only appear when the blue or green channel of col is used, never with the red channel.
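To illustrate that last point, the per-pixel channel selection in my full shader works roughly like this (a simplified sketch assuming an RGGB-style layout; the helper function and the exact pattern here are only for illustration, not the actual code):
half4 bayerSelect(half4 col, float2 uv)
{
    // integer pixel coordinate of the destination texel
    float2 pixel = floor(uv * _ScreenParams.xy);
    bool evenX = fmod(pixel.x, 2.0) < 1.0;
    bool evenY = fmod(pixel.y, 2.0) < 1.0;

    half4 ret = half4(0.0, 0.0, 0.0, 1.0);
    if (evenX && evenY)
        ret.r = col.r;   // R site
    else if (!evenX && !evenY)
        ret.r = col.b;   // B site
    else
        ret.r = col.g;   // G sites
    return ret;
}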
Here’s a stripped-down version of the shader that still shows the artifacts, followed by the script I use to blit with it:
Shader "Custom/TestShader"
{
Properties
{
_BaseMap("Texture", 2D) = "black" {}
}
SubShader
{
Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline" }
LOD 100
ZWrite Off ZTest Always Blend Off Cull Off
Pass
{
HLSLPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
float4 vertex : SV_POSITION;
};
v2f vert(appdata v)
{
v2f o;
o.vertex = TransformObjectToHClip(v.vertex.xyz);
o.uv = v.uv;
return o;
}
TEXTURE2D(_BaseMap);
SAMPLER(sampler_BaseMap);
half4 frag(v2f i) : SV_Target
{
half dx = _ScreenParams.z - 1.0;
half4 col = SAMPLE_TEXTURE2D(_BaseMap, sampler_BaseMap, i.uv + half2(-dx, 0));
half4 ret = {0.0, 0.0, 0.0, 1.0};
ret.r = col.g;
return ret;
}
ENDHLSL
}
}
}
using UnityEngine;
using UnityEngine.Rendering;

public class TestBlitScript : MonoBehaviour
{
    private Camera _camera;
    private Material _mat;
    public Shader shader;

    public void Start()
    {
        _camera = GetComponent<Camera>();
        _mat = new Material(shader);
        RenderPipelineManager.endCameraRendering += TestBlit;
    }

    private void TestBlit(ScriptableRenderContext context, Camera camera)
    {
        if (camera != _camera) return; // endCameraRendering is called for all cameras, only proceed for the one on this game object
        _mat.SetTexture(Shader.PropertyToID("_BaseMap"), RenderTexture.active);
        Graphics.Blit(RenderTexture.active, camera.targetTexture, _mat);
    }

    public void OnDestroy()
    {
        RenderPipelineManager.endCameraRendering -= TestBlit;
    }
}
And finally, here’s what I get with an unlit white plane in front of the camera using the shader and script above, along with the settings of the destination RenderTexture (the one set as camera.targetTexture):
Any ideas on how to get rid of that dark diagonal line?
Thanks!