AfterOpaqueDepthAndNormal: writing into the normal and depth buffers

Hello,
I’ve created a Custom Pass Volume with a fullscreen pass to write to the normal & depth buffers. I use the default fragment function:
float4 FullScreenPass(Varyings varyings) : SV_Target
but it doesn’t write anything (if I switch to a later injection point, it does write to screen).
I can’t find a sample on how to write normal & depth (only a sample that extracts them). I have tried SV_Target1, SV_Target2, etc., but that doesn’t work.
Please help.

OK, SV_Depth did the trick for writing depth, but how do I write normals in another pass?
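
For reference, a minimal sketch of the change that made the depth write work: adding an SV_Depth output next to the SV_Target one (the depth value below is just a placeholder):

float4 FullScreenPass(Varyings varyings, out float outDepth : SV_Depth) : SV_Target
{
    // Placeholder: write whatever non-linear device depth you need here.
    outDepth = 0.5;
    // The return value still goes to the bound color target.
    return float4(0, 0, 0, 0);
}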

So I have created a custom fullscreen pass:

using UnityEngine;
using UnityEngine.Rendering.HighDefinition;
using UnityEngine.Rendering;
using UnityEngine.Experimental.Rendering;
using UnityEngine.Profiling;
using System.Collections.Generic;
using System.Collections;

class FinalPassGBuffer : CustomPass
{
    public Material Mat;
    public string MatPass;

    private Mesh _Quad;

    protected override void Setup(ScriptableRenderContext renderContext, CommandBuffer cmd)
    {
        // 2x2 quad centered on the origin; it is positioned in front of the camera in Execute().
        _Quad = new Mesh();
        _Quad.SetVertices(new List<Vector3>{
            new Vector3(-1, -1, 0),
            new Vector3( 1, -1, 0),
            new Vector3(-1,  1, 0),
            new Vector3( 1,  1, 0),
        });
        _Quad.SetTriangles(new List<int>{
            0, 3, 1, 0, 2, 3
        }, 0);
        _Quad.RecalculateBounds();
        _Quad.UploadMeshData(false);
    }

    protected override void Execute(ScriptableRenderContext renderContext, CommandBuffer cmd, HDCamera hdCamera, CullingResults cullingResult)
    {
        int pass = Mat.FindPass(MatPass);
        if (pass == -1)
            return;

        // Offset the quad slightly past the near clip plane so it is not clipped.
        float ForwardDistance = hdCamera.camera.nearClipPlane + 0.0001f;
        var trs = Matrix4x4.TRS(
            hdCamera.camera.transform.position + hdCamera.camera.transform.forward * ForwardDistance,
            hdCamera.camera.transform.rotation,
            Vector3.one);
 
        renderContext.SetupCameraProperties(hdCamera.camera);
        cmd.SetRenderTarget( HELP );
        cmd.DrawMesh(_Quad, trs, Mat, 0, pass);
    }

    protected override void Cleanup()
    {
        CoreUtils.Destroy(_Quad);
    }
}

But I don’t know where the normals/roughness buffer is. Where can I grab/bind it when using AfterOpaqueDepthAndNormal?

I’m using Unity 2020.1.6f1.
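
To frame the question, here is a rough sketch of the binding I was guessing at for the HELP placeholder. It assumes the CustomPass base class exposes GetCameraBuffers() and GetNormalBuffer(); GetNormalBuffer() exists in more recent HDRP versions, but I haven’t verified it for the HDRP that ships with 2020.1, so treat it as an assumption:

// Hedged sketch for the HELP placeholder above; GetNormalBuffer() may not exist in HDRP 8.x.
GetCameraBuffers(out var cameraColor, out var cameraDepth);
RTHandle normalBuffer = GetNormalBuffer();

// Bind the normal buffer as the color target and the camera depth as the depth
// target, then draw the quad with a material that outputs HDRP's packed normal encoding.
CoreUtils.SetRenderTarget(cmd, normalBuffer, cameraDepth, ClearFlag.None);
cmd.DrawMesh(_Quad, trs, Mat, 0, pass);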

Hello,

You can take a look at this example on how to render an object in the normal buffer:

This example shows how to render objects with HDRP’s built-in depth prepass. If you want to do something else, like change the roughness or read the normals in the buffer, you’ll have to write a custom shader.

You can also take a look at this shader:

It does an edge-detect effect using the normal buffer.
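
For the shader side, here is a hedged sketch of reading the normal buffer from a fullscreen custom pass fragment. NormalData and DecodeFromNormalBuffer come from HDRP’s NormalBuffer.hlsl; double-check the include path and the available overloads in your HDRP version:

// Hedged sketch: reading HDRP's packed normal buffer in a fullscreen custom pass.
#include "Packages/com.unity.render-pipelines.high-definition/Runtime/Material/NormalBuffer.hlsl"

float4 FullScreenPass(Varyings varyings) : SV_Target
{
    uint2 positionSS = (uint2)varyings.positionCS.xy; // pixel coordinates

    NormalData normalData; // holds normalWS and perceptualRoughness
    DecodeFromNormalBuffer(positionSS, normalData);

    // Visualize the world-space normal; an edge-detect effect would instead
    // compare the normals of neighbouring pixels.
    return float4(normalData.normalWS * 0.5 + 0.5, 1);
}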

Hello,

I’ve achieved what I wanted another way (I compute the reflection & lighting myself).

Yes, I’ve already seen this sample. What it does is create a fake mesh which is then “Z-distorted” using a Shader Graph; that way it has access to the full pipeline computation.
I just wish there were a friendlier approach that would let me write into the GBuffer, without Shader Graph tricks.

Regards.

By the way, it’s a similar effect to the one I’m doing, but using GPU particles… Here is a screenshot; I can’t send a video for now, it will be released next weekend. I render at “Before Rendering” as opaque; this way I get ambient occlusion and the particles are reflected in the scene.

I will post a video tomorrow. I have a big issue with the particles’ timestep: the simulation is frame-rate dependent, so it runs faster on high-end computers. There is already a post on the forum somewhere about it. Please add a mode where it’s not FPS dependent. I could try a hack for the final demo version, reading the current FPS and compensating accordingly, but…

Same particles from beginning to end, no cheating, but it runs at a different speed depending on the FPS…

Hi, sorry, but could you open a separate forum thread for this issue? It will simply get lost in this topic. Thanks.

Hi, OK, I created one here: https://forum.unity.com/threads/vfx-particles-timestep-physics-simulation.983923/
Thanks.