EDIT: Issue semi-resolved and culprit mostly determined, but I still don’t understand why. I’d appreciate it if anybody who knows could explain why it wasn’t working. It’s all in the reply I just posted.
I’ve used Shader Graph a bit, but I’m completely new to writing shaders in code, so please bear with me if this is an obvious fix or if I’m fundamentally misunderstanding something.
I’ve gotten as far as getting my effect to render to the screen properly (I already have a working setup for render passes and everything); my problem is just getting the screen-space UVs to show up in the shader. This is what it looks like in the frame debugger:
The problem is, it shouldn’t be just blue. Here’s the shader code:
Shader "Hidden/TestSimpleShader"
{
HLSLINCLUDE
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Core.hlsl"
#include "Packages/com.unity.render-pipelines.core/Runtime/Utilities/Blit.hlsl"
float4 Test (Varyings input) : SV_Target
{
return float4(input.texcoord.x, input.texcoord.y, 1, 1);
}
ENDHLSL
SubShader
{
Tags { "RenderType"="Opaque" "RenderPipeline" = "UniversalPipeline"}
LOD 100
Cull Off ZWrite Off ZTest Always
Pass
{
Name "Simple Test"
HLSLPROGRAM
#pragma vertex Vert
#pragma fragment Test
ENDHLSL
}
}
}
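For context, here’s roughly what the Vert function I’m borrowing from Blit.hlsl does. I’m paraphrasing the package source and trimming the stereo/instancing macros, so treat it as a sketch rather than a verbatim copy:

// Trimmed sketch of Blit.hlsl's Vert (paraphrased, not verbatim).
struct Attributes
{
    uint vertexID : SV_VertexID;
};

struct Varyings
{
    float4 positionCS : SV_POSITION;
    float2 texcoord   : TEXCOORD0;
};

Varyings Vert(Attributes input)
{
    Varyings output;
    // One triangle big enough to cover the whole screen, generated purely
    // from the vertex ID (no vertex buffer needed), which is why
    // DrawProcedural(..., MeshTopology.Triangles, 3, 1) works.
    output.positionCS = GetFullScreenTriangleVertexPosition(input.vertexID);
    float2 uv = GetFullScreenTriangleTexCoord(input.vertexID);
    // The UVs get scaled/offset by the _BlitScaleBias material property.
    // As far as I can tell, an unset shader property defaults to zero,
    // which would zero out texcoord exactly like I'm seeing.
    output.texcoord = uv * _BlitScaleBias.xy + _BlitScaleBias.zw;
    return output;
}

The Blitter helper functions normally fill in _BlitScaleBias right before drawing, so I’m guessing that property is involved somehow, though I don’t fully understand how yet.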
As you can see, I’d expect the red and green channels to be the screen UVs, but no matter what brightness range I set in the frame debugger, both channels are completely zeroed.
I’ve recreated this shader in the shader graph to make sure I wasn’t going insane:
And if I plug the shader graph into the shader slot on my render feature, it works exactly as I would expect:
So I really don’t understand what I’m doing wrong in the shader, and since my Google searches and attempts to find documentation kept turning up nothing, I’m a little lost. The shader code is mostly a simplified version of the shader code here.
And to cover all my bases, here’s the render feature/pass code. This isn’t compatibility mode; I’m using the render graph.
using UnityEngine;
using UnityEngine.Rendering.RenderGraphModule;
using UnityEngine.Rendering;
using UnityEngine.Rendering.Universal;

[System.Serializable]
public class TestSimpleShaderRenderFeature : ScriptableRendererFeature
{
    public Shader shader;

    private TestSimpleShaderPass pass;

    public override void Create()
    {
        pass = new(name);
    }

    protected override void Dispose(bool disposing)
    {
        pass.Dispose();
    }

    public override void AddRenderPasses(ScriptableRenderer renderer, ref RenderingData renderingData)
    {
        // Skip preview/reflection cameras and offscreen depth-only rendering.
        if (renderingData.cameraData.cameraType == CameraType.Preview
            || renderingData.cameraData.cameraType == CameraType.Reflection
            || UniversalRenderer.IsOffscreenDepthTexture(ref renderingData.cameraData))
            return;

        if (shader == null) return;

        pass.renderPassEvent = RenderPassEvent.AfterRenderingTransparents;
        pass.SetupMembers(shader);
        pass.requiresIntermediateTexture = false;
        renderer.EnqueuePass(pass);
    }

    internal class TestSimpleShaderPass : ScriptableRenderPass
    {
        private Material mat;

        public TestSimpleShaderPass(string passName)
        {
            profilingSampler = new ProfilingSampler(passName);
        }

        public void SetupMembers(Shader shader)
        {
            // AddRenderPasses runs every frame, so only (re)create the material
            // when needed; unconditionally creating one here would leak materials.
            if (mat == null || mat.shader != shader)
            {
                CoreUtils.Destroy(mat);
                mat = CoreUtils.CreateEngineMaterial(shader);
            }
        }

        public void Dispose()
        {
            CoreUtils.Destroy(mat);
        }

        public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
        {
            UniversalResourceData resourcesData = frameData.Get<UniversalResourceData>();
            using (var builder = renderGraph.AddRasterRenderPass<PassData>("Test Shader Pass", out var passData, profilingSampler))
            {
                passData.material = mat;
                builder.SetRenderAttachment(resourcesData.activeColorTexture, 0, AccessFlags.Write);
                builder.SetRenderAttachmentDepth(resourcesData.activeDepthTexture, AccessFlags.Write);
                // Draw a single fullscreen triangle (3 vertices, no vertex buffer);
                // Blit.hlsl's Vert positions it from SV_VertexID.
                builder.SetRenderFunc((PassData data, RasterGraphContext rgContext) =>
                    rgContext.cmd.DrawProcedural(Matrix4x4.identity, data.material, 0, MeshTopology.Triangles, 3, 1));
            }
        }

        private class PassData
        {
            internal Material material;
        }
    }
}
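In case anyone wants to experiment, here’s the variant of RecordRenderGraph I plan to try next: drawing through the CoreRP Blitter instead of a raw DrawProcedural, since the Blitter sets _BlitScaleBias via a property block before it draws. I’m writing the BlitTexture overload from memory, so double-check the exact signature against the Blitter docs:

// Hypothetical, untested variant of the pass above. Blitter.BlitTexture
// draws the same fullscreen triangle, but sets _BlitScaleBias first so
// Blit.hlsl's Vert should output sensible UVs.
public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
{
    UniversalResourceData resourcesData = frameData.Get<UniversalResourceData>();
    using (var builder = renderGraph.AddRasterRenderPass<PassData>("Test Shader Pass", out var passData, profilingSampler))
    {
        passData.material = mat;
        builder.SetRenderAttachment(resourcesData.activeColorTexture, 0, AccessFlags.Write);
        builder.SetRenderAttachmentDepth(resourcesData.activeDepthTexture, AccessFlags.Write);
        builder.SetRenderFunc((PassData data, RasterGraphContext rgContext) =>
            // (1, 1, 0, 0) = full scale, zero offset, i.e. plain 0..1 screen UVs.
            Blitter.BlitTexture(rgContext.cmd, new Vector4(1f, 1f, 0f, 0f), data.material, 0));
    }
}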
And if it’s at all relevant, this is Unity 6000.0.7f1 with URP 17.0.3.
Any and all help is appreciated!