MSAA error ("resolve bindMS") with texture formats in a render graph ScriptableRenderPass

Out of the gate: this is Unity 6000.0.7f1 with URP 17.0.3, and it relates to a ScriptableRendererFeature I wrote utilising the render graph (no obsolete/legacy stuff).

In my project, there are two layers whose rendering I need more control over, so I handle their rendering myself through a renderer feature. My issue stems from the depth normals part of this process.
These are the two layers:

  1. Character
  2. Overlay

Essentially, I want to render the depth normals/depth for the character pass into the cameraNormalsTexture and activeDepthTexture via my own scripting. I have gotten this to work without issues.
For the overlay, however, I want to have a copy of just the depth for later use, but otherwise also render it into the cameraNormalsTexture and activeDepthTexture just like with the character layer.
Previously, I was just rendering it all twice, and then using that. This is completely unnecessary though, since in theory, I can just make a copy when I first render them out.

So here’s how the new system I wrote works. For reference, the square is the character layer, and the sphere is the overlay layer.

My implementation of this currently “works”, but has a few issues.

  1. MSAA is not enabled on these textures, meaning the edges are very jagged and unappealing. This is pretty noticeable with the models in my game.
  2. It has to create two screen-size buffers, one of which I don’t even use, which feels extremely wasteful. I haven’t tested this on any specific hardware to have concrete proof of a cost, but it’s also just a more convoluted way of doing things, so I’d rather streamline the process either way.

This is the way I would prefer to implement the system.

Now in theory, this should also work, since the other version was working before. We’ll come back to MSAA in a moment; here’s what happens as I try to implement this new version.
Error encountered while implementing alternative approach

So the intermediary depth texture (green star) needs to match the cameraNormalsTexture (blue star) in setup, which in this case happens to mean MSAA. No problem; I needed MSAA anyway. But in practice:

That error floods the console every time I set the MSAA sample count on these textures. I don’t understand why, and googling turned up nothing; this is clearly a moment where my inexperience is holding me back. It doesn’t help that the Render Graph is very new and a lot of material on how to use it properly is still missing online. I think I might’ve jumped into learning this stuff at a bit of an awkward time.

Here’s the code for the previous version that was “working” up until I added the MSAA samples part. This is just the pass responsible for the overlay layer, not the character layer; it’s scheduled at BeforeRenderingPrePasses.
It’s probably atrocious and commits numerous sins, so those of you who actually know what you’re doing (unlike me), please offer feedback on things I’m doing wrong or small optimisations I might be able to make (part of the mess is because it isn’t done yet, though). A lot of this is hodge-podge trial and error, as I just took what I could find and threw it together. I’m also sorry about the scarcity of comments…

internal class BufferDepth : ScriptableRenderPass
{
    public LayerMask layers;
    private Material copyDepthMat;
    private Material transferDepthNormalsMat;
    public BufferDepth(string passName)
    {
        profilingSampler = new ProfilingSampler(passName);
    }
    public void SetupMembers(Shader copyDepthShader, Shader transferDepthShader)
    {
        copyDepthMat = CoreUtils.CreateEngineMaterial(copyDepthShader);
        transferDepthNormalsMat = CoreUtils.CreateEngineMaterial(transferDepthShader);
    }
    private void InitRendererLists(ContextContainer frameData, ref RenderNormalsPassData passData, RenderGraph renderGraph)
    {
        UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
        UniversalRenderingData renderingData = frameData.Get<UniversalRenderingData>();

        passData.rendererListHandle = renderGraph.CreateRendererList(
            new RendererListParams(
                renderingData.cullResults,
                RenderingUtils.CreateDrawingSettings(
                    new ShaderTagId("DepthNormals"),
                    renderingData,
                    cameraData,
                    frameData.Get<UniversalLightData>(),
                    cameraData.defaultOpaqueSortFlags),
                new FilteringSettings(RenderQueueRange.opaque,
                layers)));
    }
    public override void RecordRenderGraph(RenderGraph renderGraph, ContextContainer frameData)
    {
        UniversalCameraData cameraData = frameData.Get<UniversalCameraData>();
        UniversalResourceData resourcesData = frameData.Get<UniversalResourceData>();

        RenderTextureDescriptor normalsDesc = new(cameraData.cameraTargetDescriptor.width, cameraData.cameraTargetDescriptor.height, RenderTextureFormat.ARGBHalf); //Use ARGBHalf because I couldn't find a RenderTextureFormat equivalent for R8G8B8A8_SNorm, and we need the ability to go into the negatives
        RenderTextureDescriptor depthDesc = new(cameraData.cameraTargetDescriptor.width, cameraData.cameraTargetDescriptor.height, RenderTextureFormat.Depth, cameraData.cameraTargetDescriptor.depthBufferBits);

        //This is the part that causes that bindMS error
        normalsDesc.msaaSamples = cameraData.cameraTargetDescriptor.msaaSamples;
        depthDesc.msaaSamples = cameraData.cameraTargetDescriptor.msaaSamples;

        TextureHandle normalsTex = UniversalRenderer.CreateRenderGraphTexture(renderGraph, normalsDesc, "Overlay Depth Normals Buffer", false);
        TextureHandle depthTex = UniversalRenderer.CreateRenderGraphTexture(renderGraph, depthDesc, "Overlay Depth Buffer", false);

        //Render the overlay layer 
        using (var builder = renderGraph.AddRasterRenderPass<RenderNormalsPassData>("Render Overlay Depth Normals", out var passData, profilingSampler))
        {
            InitRendererLists(frameData, ref passData, renderGraph);

            builder.UseRendererList(passData.rendererListHandle);

            builder.SetRenderAttachment(normalsTex, 0);
            builder.SetRenderAttachmentDepth(depthTex);

            builder.SetRenderFunc((RenderNormalsPassData data, RasterGraphContext rgContext) => ExecutePass(data, rgContext));
        }

        #region This part just copies the depth texture into a non-depth-texture so other passes that use shader graph stuff can use it. I'm probably going to remove this part entirely in the future, though.
        OverlayDepthBuffer buffer = frameData.GetOrCreate<OverlayDepthBuffer>();
        RenderTextureDescriptor outputDesc = new(cameraData.cameraTargetDescriptor.width, cameraData.cameraTargetDescriptor.height, RenderTextureFormat.RFloat, 0, cameraData.cameraTargetDescriptor.mipCount, RenderTextureReadWrite.Default);
        buffer.buffer = UniversalRenderer.CreateRenderGraphTexture(renderGraph, outputDesc, "Overlay Depth", false);

        using (var builder = renderGraph.AddRasterRenderPass<CopyPassData>("Copy Overlay Depth", out var passData, profilingSampler))
        {
            passData.mat = copyDepthMat;
            passData.depth = depthTex;

            builder.UseTexture(depthTex);
            builder.SetRenderAttachment(buffer.buffer, 0);

            builder.SetRenderFunc((CopyPassData data, RasterGraphContext rgContext) => ExecutePass(data, rgContext));
        }
        #endregion

        //Finally, this part uses a shader with zwrite on, ztest lequal, and that has a fragment that looks like this:
        //float4 frag(const v2f i, out float depth : SV_Depth) : SV_Target
        //{
        //    depth = tex2D(_SecondaryDepth, i.uv).r;
        //    return tex2D(_SecondaryDepthNormals, i.uv);
        //}
        //It uses that to copy the depth and normals back into the proper camera textures.
        using (var builder = renderGraph.AddRasterRenderPass<TransferNormalsPassData>("Transfer Depth Normals", out var passData, profilingSampler))
        {
            passData.mat = transferDepthNormalsMat;
            passData.normals = normalsTex;
            passData.depth = depthTex;

            builder.UseTexture(normalsTex);
            builder.UseTexture(depthTex);

            builder.SetRenderAttachment(resourcesData.cameraNormalsTexture, 0);
            builder.SetRenderAttachmentDepth(resourcesData.activeDepthTexture);

            builder.SetRenderFunc((TransferNormalsPassData data, RasterGraphContext rgContext) => ExecutePass(data, rgContext));
        }
    }
    static void ExecutePass(RenderNormalsPassData data, RasterGraphContext context)
    {
        context.cmd.DrawRendererList(data.rendererListHandle);
    }
    static void ExecutePass(CopyPassData data, RasterGraphContext context)
    {
        data.mat.SetTexture("_Depth", data.depth);
        context.cmd.DrawProcedural(Matrix4x4.identity, data.mat, 0, MeshTopology.Triangles, 3, 1);
    }
    static void ExecutePass(TransferNormalsPassData data, RasterGraphContext context)
    {
        data.mat.SetTexture("_SecondaryDepthNormals", data.normals);
        data.mat.SetTexture("_SecondaryDepth", data.depth);
        context.cmd.DrawProcedural(Matrix4x4.identity, data.mat, 0, MeshTopology.Triangles, 3, 1);
    }
    public void Dispose()
    {
        CoreUtils.Destroy(copyDepthMat);
        CoreUtils.Destroy(transferDepthNormalsMat);
    }
    protected class RenderNormalsPassData
    {
        internal RendererListHandle rendererListHandle;
    }
    private class CopyPassData
    {
        internal Material mat;
        internal TextureHandle depth;
    }
    private class TransferNormalsPassData
    {
        internal Material mat;
        internal TextureHandle normals;
        internal TextureHandle depth;
    }
}

So, here are the main issues:

  1. Why can’t I enable MSAA for these textures?
  2. Why can’t I create a texture in the R8G8B8A8_SNorm format like the depth normals map uses? I can’t find an equivalent within RenderTextureFormat, and don’t know how to make textures using GraphicsFormat.
  3. To be honest, I would’ve preferred not needing an intermediate depth texture at all: just rendering directly into the depth texture first, copying it from there, and then letting the rest of the depth normals prepass occur as normal. However, RenderPassEvent does not give you that level of granularity when it comes to sequencing. Are these restrictions there for good-practice/parallelism/GPU reasons, or are they just arbitrary?

Thank you very much for reading all this, any and all help is appreciated.

I won’t be able to look into the specifics, but here are some general remarks. A render texture with MSAA has a separate resolve resource under the hood (unless you set bindMS to true). When you bind an MSAA RT as a texture, you sample the resolved resource as a non-MSAA resource.
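To sketch what that means at creation time (a hedged example, not verified against this exact URP version; the `TextureDesc` field names follow the RenderGraph API, and availability may vary between releases):

```csharp
// Sketch only: create a render graph texture that is explicitly bound as
// multisampled, so shaders sample the raw MSAA surface (Texture2DMS) rather
// than the auto-resolved copy. With bindTextureMS = false (the default),
// sampling it as a texture goes through the resolve resource instead.
TextureDesc msDesc = new TextureDesc(
    cameraData.cameraTargetDescriptor.width,
    cameraData.cameraTargetDescriptor.height)
{
    colorFormat = GraphicsFormat.R8G8B8A8_SNorm,
    msaaSamples = MSAASamples.Msaa4,
    bindTextureMS = true, // true => sample the MS surface itself
    name = "Overlay Normals (MS)",
};
TextureHandle msNormals = renderGraph.CreateTexture(msDesc);
```

The "resolve bindMS" error in the title is essentially the graph complaining about how such a texture is bound versus how it was declared.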

You should definitely learn to use GraphicsFormat, it’s the future and the future is now.
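As a concrete, hedged example of what that might look like for the SNorm normals texture from the code above (`RenderTextureDescriptor` exposes `graphicsFormat` and `depthStencilFormat` directly, identifiers otherwise as in the original pass):

```csharp
// Hedged sketch: request R8G8B8A8_SNorm via GraphicsFormat instead of hunting
// for a RenderTextureFormat equivalent. Colour-only target, so no depth/stencil.
RenderTextureDescriptor normalsDesc = new RenderTextureDescriptor(
    cameraData.cameraTargetDescriptor.width,
    cameraData.cameraTargetDescriptor.height)
{
    graphicsFormat = GraphicsFormat.R8G8B8A8_SNorm,
    depthStencilFormat = GraphicsFormat.None,
    msaaSamples = cameraData.cameraTargetDescriptor.msaaSamples,
};
```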

We landed a large number of bug fixes to the URP render graph since that patch release; it’s best to stay on the latest (22f1) to avoid running into these issues. Many are MSAA- or depth-format-related.


Came here just to say I figured out that RenderTextureDescriptor has both colorFormat and graphicsFormat, which is a fun discovery. Unfortunately, that didn’t stop the textures from falling back to R8G8B8A8_SRGB every time I tried to enable MSAA.
I didn’t know this was down to the older version. I’ll try updating the project to 21f1 now, since that’s the one I have installed on my machine, but I’ll try 22f1 if that doesn’t resolve the issue.
Thanks for the reply!

Indeed. You should avoid using colorFormat. The RT descriptor stores the graphicsFormat, and the colorFormat property is a legacy API that sets the graphicsFormat (colour), depthStencilFormat, and shadowSamplingMode at the same time. It’s more expensive, though, and a bit magic.
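A small, illustrative sketch of the difference being described (values chosen arbitrarily for the example):

```csharp
var desc = new RenderTextureDescriptor(1920, 1080);

// Legacy path: one assignment that quietly derives and writes graphicsFormat,
// depthStencilFormat and shadowSamplingMode together.
desc.colorFormat = RenderTextureFormat.Depth;

// Explicit path: state each field yourself, one at a time.
desc.graphicsFormat = GraphicsFormat.None;           // no colour attachment
desc.depthStencilFormat = GraphicsFormat.D32_SFloat; // depth only
```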


Updating to 21f1 somehow made it worse. It just started complaining that there was a mismatch in fragment dimensions, which wasn’t the case before. I’m downloading 22f1, but I’m not convinced that’ll fix the issue.

Yup. Stopped working in both 21f1 and 22f1. No clue what could’ve changed between versions such that the dimensions of the textures are literally different, because I didn’t alter the project whatsoever…

I don’t know how to profile this kind of error. “Mismatch in fragment dimensions” is super vague and gives me no information about where the issue lies, and the frame debugger is empty; I don’t see how this kind of issue is meant to be addressed without pure trial and error.

Edit: editing so the thread doesn’t get bumped, but I figured out the issue.
UniversalResourceData.activeDepthTexture is not the same as cameraDepthTexture. This wasn’t clear to me from the documentation, but that may be because I’m still learning how the rendering pipeline works in Unity.
My code works again when I change activeDepthTexture to cameraDepthTexture in the final “Transfer Depth Normals” pass.
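In case it helps anyone else, the fix amounts to a one-line change in that pass (same identifiers as the code above):

```csharp
// Before: bound the renderer's currently active depth target.
// builder.SetRenderAttachmentDepth(resourcesData.activeDepthTexture);

// After: bind the camera depth texture resource instead.
builder.SetRenderAttachmentDepth(resourcesData.cameraDepthTexture);
```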

I’m not marking this post as solved though, because my original topic is still yet to be resolved.

The latest patches add more error checking, so it might just be that you are now being warned about a pre-existing issue.

Hi,

MSAA does not work with the render graph on DX11 and DX12; please fix it ASAP.

It works in Vulkan, and it works directly in Compatibility Mode, so it is a render graph bug.

This is an extremely major issue and should be the top priority.

Did you file a bug report?

Not yet, as I see it in my main project and need to clean that up first.

Hi, I added the bug report.

New Incident created: IN-105312 - MSAA in RenderGraph makes some effects not appear in Unity 6000.0.25

Thanks

thanks!

Out of curiosity, why do you continue to use 6000.0.25? I know you validated the issue on 6.1, so it’s likely also on 6000.0.50; that’s not my question. 25f1 is very old, and many, many bugs have been fixed in the 6000.0 stream since.


There are two reasons. One is that I am still not sure what users on the store will get if I upload with 6000.0.50 and they are on a lower version like 6000.0.26, e.g. whether they will get the 6000.0.50 package or the older 2022.3 one.

The other is to avoid breaking some of my massively big projects before I know I have time to upgrade them all properly.