Why does my ScriptableRenderPass require Post-processing?

Hey, I’m fairly new to URP and have migrated some old CommandBuffer code to a ScriptableRenderPass and ScriptableRendererFeature (because I assume this is what I need to do in the new system).

public class OutlineRendererFeature : ScriptableRendererFeature
{
	[System.Serializable]
	public class CustomRenderObjectsSettings
	{
		// Note: Post-processing has to be enabled for this to work, for reasons I don’t understand.
		public float blurRadius = 1f;
		public Color outlineColor = new (1f, 0.62f, 0.25f, 0.09f);
	}

	public CustomRenderObjectsSettings settings = new();
	private CustomRenderObjectsPass renderObjectsPass;

	public override void Create()
	{
		Material material = new Material(Shader.Find("Xarbrough/UnityOutline"));
		renderObjectsPass = new CustomRenderObjectsPass(
			RenderPassEvent.AfterRendering, 
			material, 
			settings.blurRadius, 
			settings.outlineColor);
	}

	public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
	{
		renderer.EnqueuePass(renderObjectsPass);
	}
}

The renderer feature basically replaces this part from the Built-in Render Pipeline:

this.camera.AddCommandBuffer(cameraEvent, commandBuffer);

Previously I was able to add all render instructions directly to my camera (the scene view camera in this case).
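
For context, this is roughly what the old Built-in RP setup looked like (a simplified sketch; the exact CameraEvent isn’t important for the question):

CommandBuffer commandBuffer = new CommandBuffer { name = "Outline" };
// ...the same GetTemporaryRT/DrawRenderer/Blit calls that now live in Execute below...
CameraEvent cameraEvent = CameraEvent.AfterImageEffects; // for example; any late camera event
this.camera.AddCommandBuffer(cameraEvent, commandBuffer);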

The new render pass contains all of my previous CommandBuffer logic like so:

public class CustomRenderObjectsPass : ScriptableRenderPass
{
	private Material outlineMaterial;
	private float blurRadius;
	private Color outlineColor;

	private int blurredID;
	private int temporaryID;
	private int depthID;
	private int idID;

	public CustomRenderObjectsPass(
		RenderPassEvent renderPassEvent, Material outlineMaterial, float blurRadius, Color outlineColor)
	{
		this.renderPassEvent = renderPassEvent;
		this.outlineMaterial = outlineMaterial;
		this.blurRadius = blurRadius;
		this.outlineColor = outlineColor;

		depthID = Shader.PropertyToID("_DepthRT");
		blurredID = Shader.PropertyToID("_BlurredRT");
		temporaryID = Shader.PropertyToID("_TemporaryRT");
		idID = Shader.PropertyToID("_idRT");
	}

	public override void Execute(ScriptableRenderContext context, ref RenderingData renderingData)
	{
		CommandBuffer cmd = CommandBufferPool.Get("Render Objects with Custom Shader");

		var camera = renderingData.cameraData.camera;
		int rtWidth = camera.pixelWidth;
		int rtHeight = camera.pixelHeight;

		cmd.GetTemporaryRT(depthID, rtWidth, rtHeight, 0, FilterMode.Bilinear, RenderTextureFormat.ARGB32);
		cmd.SetRenderTarget(depthID, BuiltinRenderTextureType.CurrentActive);
		cmd.ClearRenderTarget(false, true, Color.clear);

		// Render selected objects into a mask buffer
		RenderTargetsToMaskBuffer(cmd);

		// Prepass on object ID to discover edges between roots
		cmd.GetTemporaryRT(idID, rtWidth, rtHeight, 0, FilterMode.Bilinear, RenderTextureFormat.ARGB32);
		cmd.Blit(depthID, idID, outlineMaterial, 3);

		// Blur mask in two separable passes
		cmd.GetTemporaryRT(temporaryID, rtWidth, rtHeight, 0, FilterMode.Bilinear, RenderTextureFormat.ARGB32);
		cmd.GetTemporaryRT(blurredID, rtWidth, rtHeight, 0, FilterMode.Bilinear, RenderTextureFormat.ARGB32);
		cmd.Blit(idID, blurredID);

		cmd.SetGlobalVector("_BlurDirection", new Vector2(blurRadius, 0));
		cmd.Blit(blurredID, temporaryID, outlineMaterial, 2);
		cmd.SetGlobalVector("_BlurDirection", new Vector2(0, blurRadius));
		cmd.Blit(temporaryID, blurredID, outlineMaterial, 2);

		// Blend outline over existing scene image
		cmd.SetGlobalColor("_OutlineColor", outlineColor);
		cmd.Blit(blurredID, BuiltinRenderTextureType.CameraTarget, outlineMaterial, 4);

		context.ExecuteCommandBuffer(cmd);
		CommandBufferPool.Release(cmd);
	}

	private void RenderTargetsToMaskBuffer(CommandBuffer cmd)
	{
		foreach (var renderer in Object.FindObjectsOfType<Renderer>())
		{
			cmd.DrawRenderer(renderer, outlineMaterial, 0, 1);
			cmd.DrawRenderer(renderer, outlineMaterial, 0, 0);
		}
	}
}

And it works, but only if Post-processing is enabled in the Renderer asset and on the Camera.

I don’t understand why, and I’d like to get rid of this dependency if possible: the outline shader is part of a package that shouldn’t interfere with user settings, and I don’t want to force users to enable post-processing on their render pipeline asset.

Here’s the frame debugger when post-processing is enabled:

And here when the setting is disabled:

It seems the “FinalBlit” pass completely overwrites my own rendering. However, I’m already using RenderPassEvent.AfterRendering, so I’m unsure why anything runs later than that.

Thanks for any pointers! :slight_smile:

Try setting ScriptableRenderPass.requiresIntermediateTexture = true on your custom pass.
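
Something along these lines in the pass constructor (assuming your URP version actually exposes that property; newer versions do, older ones may not):

public CustomRenderObjectsPass(
	RenderPassEvent renderPassEvent, Material outlineMaterial, float blurRadius, Color outlineColor)
{
	this.renderPassEvent = renderPassEvent;
	// Request an intermediate color texture so the pass doesn't write straight to the backbuffer:
	requiresIntermediateTexture = true;
	// ...rest of the constructor as before...
}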

Thanks for the suggestion. It didn’t immediately work, but it prompted me to try something else:

// Blend outline over existing scene image
cmd.SetGlobalColor("_OutlineColor", outlineColor);
var colorTarget = renderingData.cameraData.renderer.cameraColorTarget;
cmd.Blit(blurredID, colorTarget, outlineMaterial, 4);

Instead of blitting to BuiltinRenderTextureType.CameraTarget, I tried rendering directly into the cameraColorTarget, because that’s what showed up in the Frame Debugger. This seems to work fine; I’m just not sure whether there are any drawbacks (or cases where this would break).
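
The same target can also be grabbed in OnCameraSetup and cached on the pass (assuming a URP version that has OnCameraSetup); as far as I understand, cameraColorTarget is only valid while the camera is actually rendering, i.e. inside OnCameraSetup/Execute, not in AddRenderPasses. A rough sketch of that pattern:

private RenderTargetIdentifier cameraColorTarget;

public override void OnCameraSetup(CommandBuffer cmd, ref RenderingData renderingData)
{
	// Cache the camera's color target for use in Execute.
	cameraColorTarget = renderingData.cameraData.renderer.cameraColorTarget;
}

// ...then in Execute:
cmd.Blit(blurredID, cameraColorTarget, outlineMaterial, 4);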

Another strange way to avoid my issue:

renderObjectsPass = new CustomRenderObjectsPass(
	RenderPassEvent.AfterRendering + 500,
	material,
	settings.blurRadius,
	settings.outlineColor);

Apparently, RenderPassEvent values aren’t limited to the enum members; you can just add an offset to make the event occur later (in this case after the FinalBlit). But I realize this is a total hack and probably only works by chance.
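
In case anyone wonders why adding a number to an enum even compiles: RenderPassEvent is an int-backed enum, so the offset just shifts the value URP uses to sort passes (a tiny illustration; the concrete numbers are version-dependent, so don’t rely on them):

// RenderPassEvent is int-backed, so arithmetic is legal and URP orders passes
// by the resulting number. The concrete values can change between URP versions,
// which is why an offset like +500 is fragile.
RenderPassEvent offsetEvent = RenderPassEvent.AfterRendering + 500;
Debug.Log((int)RenderPassEvent.AfterRendering); // base ordering value
Debug.Log((int)offsetEvent);                    // 500 later in the queue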