Is there a way to sum all the pixels of a texture together in a shader?

I’m doing a post-process effect that needs to sum pixel values together. For example, I have a pass that writes depth values to a RenderTexture. If I want to sum those values and find the average depth, how could I go about doing that?

I tried making a separate (1x1) RenderTexture and adding the values from the DepthTexture to it, but no luck. I also tried downsampling with Graphics.Blit(), but from what I understand it uses bilinear interpolation to get a weighted average, which is not what I need.

Any help would be appreciated!

I ran into a similar issue in a game jam project, where I wanted a spotlight to take on the average color of a TV screen. While my solution isn’t exactly what you’re asking for (it runs on the CPU), it does the job in a timely manner (albeit still fairly expensive: data goes from the CPU to the GPU very fast, but the other way around is another matter entirely…).

using System.Collections.Generic;
using Unity.Collections;
using UnityEngine;
using UnityEngine.Rendering;

public class TVSpotlight : MonoBehaviour
{
    public RenderTexture renderTexture = null;

    public List<Light> lights = null;

    // Pending readbacks, capped so requests don't pile up faster than the GPU answers them.
    private Queue<AsyncGPUReadbackRequest> requests = new Queue<AsyncGPUReadbackRequest>();

    private Color32[] color32Buffer;

    private void Awake()
    {
        color32Buffer = new Color32[renderTexture.width * renderTexture.height];
    }

    private void LateUpdate()
    {
        if (requests.Count < 8)
        {
            requests.Enqueue(AsyncGPUReadback.Request(renderTexture, 0, (AsyncGPUReadbackRequest req) =>
            {
                if (req.hasError)
                {
                    Debug.Log("GPU readback error detected.");
                    requests.Dequeue();
                    return;
                }
                else if (req.done)
                {
                    // Copy the pixels read back from the GPU into the reusable buffer.
                    req.GetData<Color32>().CopyTo(color32Buffer);

                    int averageR = 0;
                    int averageG = 0;
                    int averageB = 0;
                    int count = 0;

                    // Sum each channel over every pixel, then divide by the pixel count.
                    for (int i = 0; i < color32Buffer.Length; ++i)
                    {
                        averageR += color32Buffer[i].r;
                        averageG += color32Buffer[i].g;
                        averageB += color32Buffer[i].b;
                        ++count;
                    }

                    averageR /= count;
                    averageG /= count;
                    averageB /= count;

                    lights.ForEach(x => x.color = new Color32((byte)averageR, (byte)averageG, (byte)averageB, 255));
                }

                requests.Dequeue();
            }));
        }
    }
}

I believe it can be adapted to your use case (writing into a texture instead of setting the light color).
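For example, a minimal sketch of that adaptation (averageTexture is a hypothetical 1x1 Texture2D you would create yourself, e.g. in Awake()): the end of the callback would become

// Assumes: averageTexture = new Texture2D(1, 1, TextureFormat.RGBA32, false);
averageTexture.SetPixel(0, 0, new Color32((byte)averageR, (byte)averageG, (byte)averageB, 255));
averageTexture.Apply(); // uploads the single averaged pixel back to the GPU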
This is the kind of render I was able to get using this:
[screenshot: the spotlight taking on the TV's average color]
It is also worth noting that AsyncGPUReadback introduces a very slight lag, but it is small enough to be nearly invisible.
I hope this helps!

I ended up blitting to a lower-resolution texture and reading those pixels back on the CPU. The AsyncGPUReadback solution Fluffy proposed has some hiccups, but I think it’s something I can work with as well.
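For reference, the idea looks roughly like this (a minimal sketch rather than my exact code; TextureAverager and ReadAverage are illustrative names, and repeated 2x halving assumes dimensions that divide cleanly, since each bilinear tap then averages exactly four texels):

using UnityEngine;

public static class TextureAverager
{
    public static Color ReadAverage(RenderTexture source, int targetSize = 8)
    {
        RenderTexture current = source;

        // Halve repeatedly with bilinear Blits until the texture is small.
        while (current.width > targetSize && current.height > targetSize)
        {
            RenderTexture half = RenderTexture.GetTemporary(current.width / 2, current.height / 2, 0, source.format);
            Graphics.Blit(current, half);
            if (current != source)
                RenderTexture.ReleaseTemporary(current);
            current = half;
        }

        // Read the few remaining pixels back on the CPU (this stalls the pipeline).
        Texture2D readback = new Texture2D(current.width, current.height, TextureFormat.RGBA32, false);
        RenderTexture previous = RenderTexture.active;
        RenderTexture.active = current;
        readback.ReadPixels(new Rect(0, 0, current.width, current.height), 0, 0);
        RenderTexture.active = previous;
        if (current != source)
            RenderTexture.ReleaseTemporary(current);

        // Average what is left on the CPU.
        Color sum = Color.clear;
        Color[] pixels = readback.GetPixels();
        for (int i = 0; i < pixels.Length; ++i)
            sum += pixels[i];
        Object.Destroy(readback);
        return sum / pixels.Length;
    }
}

Note that ReadPixels blocks until the GPU has finished rendering, which is the cost of the synchronous route; the async readback above avoids that stall if you can tolerate a frame or two of latency.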