ComputeShader + RenderTexture{dimension = TextureDimension.Tex2DArray} not working

I tested a simple case combining a ComputeShader with a RenderTexture{dimension = TextureDimension.Tex2DArray}. It works well in the Unity Editor, but it does not work on device.

using UnityEngine;
using UnityEngine.Experimental.Rendering;
using UnityEngine.Rendering;

public class MyRenderTextureArrayCompute : MonoBehaviour
{
    [SerializeField]
    private ComputeShader _MyRenderTextureArrayComputeShader;
    private CommandBuffer _cb;
    private RenderTexture _renderTexture;

    private readonly int Slices = 2;
    private readonly int Resolution = 256;

    // Start is called once before the first execution of Update after the MonoBehaviour is created
    void Start()
    {
        _cb ??= new()
        {
            name = "MyRenderTextureArrayComputeCommandBuffer",
        };
        _renderTexture = _CreateRenderTextureArray();

        _SetSampleTextureArray();
    }

    // Update is called once per frame
    void LateUpdate()
    {
        _Cmd_Draw();

#if UNITY_VISIONOS
        Unity.PolySpatial.PolySpatialObjectUtils.MarkDirty(_renderTexture);
#endif
    }

    void _Cmd_Draw()
    {
        _cb.Clear();
        _cb.SetComputeTextureParam(_MyRenderTextureArrayComputeShader, 0, "_RW_Target", _renderTexture);
        _cb.DispatchCompute(
            _MyRenderTextureArrayComputeShader,
            0,
            Resolution, // group counts assume the kernel is declared [numthreads(1, 1, 1)]
            Resolution,
            Slices);
        Graphics.ExecuteCommandBuffer(_cb);
    }

    private protected RenderTexture _CreateRenderTextureArray()
    {
        GraphicsFormat CompatibleTextureFormat =  GraphicsFormat.R16G16B16A16_SFloat;
        string _TextureName = "RenderTexture-MyRenderTextureArrayCompute";
        bool NeedToReadWriteTextureData = true;

        RenderTexture result = new(Resolution, Resolution, 0, CompatibleTextureFormat)
        {
            wrapMode = TextureWrapMode.Clamp,
            antiAliasing = 1,
            filterMode = FilterMode.Bilinear,
            anisoLevel = 0,
            useMipMap = false,
            name = _TextureName,
            dimension = TextureDimension.Tex2DArray,
            volumeDepth = Slices,
            enableRandomWrite = NeedToReadWriteTextureData,
        };
        result.Create();
        return result;
    }

    void _SetSampleTextureArray()
    {
#if UNITY_VISIONOS
        Unity.PolySpatial.PolySpatialShaderGlobals.SetTexture("_MyTexture2DArray", _renderTexture);
#else
        Shader.SetGlobalTexture("_MyTexture2DArray", _renderTexture);
#endif
    }
}
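
The compute shader itself isn't shown in the post; a minimal kernel matching this C# setup might look like the following (the kernel name, thread-group size, and the debug gradient are my assumptions, not taken from the shared sample project — only the `_RW_Target` binding name comes from the C# code above):

```hlsl
#pragma kernel CSMain

// Must match the name bound via SetComputeTextureParam in C#.
RWTexture2DArray<float4> _RW_Target;

// One thread per group, matching DispatchCompute(Resolution, Resolution, Slices).
[numthreads(1, 1, 1)]
void CSMain(uint3 id : SV_DispatchThreadID)
{
    // id.z selects the array slice; write a per-slice debug gradient.
    _RW_Target[id] = float4(id.x / 256.0, id.y / 256.0, id.z, 1.0);
}
```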


Am I using it wrong, or is this a bug?

For convenience, I've shared a simple sample project.

Unity: 6000.0.39f1
Polyspatial: 2.1.2
visionOS: 2.4

It’s a limitation of how we handle RenderTextures at the moment. We only support 2D RenderTextures, because we use RealityKit’s DrawableQueue API, which only supports 2D textures.

In our next release, we will be adding support for transferring textures via RealityKit’s LowLevelTexture API, which can be used to transfer 2D, 3D, cube map, and 2D array textures via GPU blit. LowLevelTexture is faster than a CPU transfer, but for 2D textures it is not as fast as DrawableQueue (we surmise because DrawableQueue uses something like a triple-buffered approach, trading higher memory use for greater speed), so we still use DrawableQueue for RenderTextures (and LowLevelTexture for everything else that can be transferred on the GPU). We still expect RenderTextures to be 2D, but that’s an oversight; I’ll change it for the subsequent release so that we transfer non-2D RenderTextures via LowLevelTexture.

So, in summary:

  • In the next release, you will be able to use Texture2DArray to efficiently transfer 2D array textures via GPU blit.
  • In the release after that (probably), you will be able to use non-2D RenderTextures (which will transfer through the same method as Texture2DArrays).
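
Based on the release plan above, one possible bridge until non-2D RenderTextures are supported would be to render into the RenderTexture array as usual, then blit it into a plain Texture2DArray each frame and expose *that* to shaders. This is a sketch of the idea only — the class name, field names, and format choice are illustrative, not from the sample project:

```csharp
using UnityEngine;
using UnityEngine.Experimental.Rendering;

public class RenderTextureArrayBridge : MonoBehaviour
{
    public RenderTexture source;          // Tex2DArray RenderTexture written by the compute shader
    private Texture2DArray _destination;  // plain array texture that can be transferred via GPU blit

    void Start()
    {
        _destination = new Texture2DArray(
            source.width, source.height, source.volumeDepth,
            GraphicsFormat.R16G16B16A16_SFloat, TextureCreationFlags.None);
        Shader.SetGlobalTexture("_MyTexture2DArray", _destination);
    }

    void LateUpdate()
    {
        // GPU-side copy, slice by slice (no CPU readback).
        for (int slice = 0; slice < source.volumeDepth; slice++)
            Graphics.CopyTexture(source, slice, _destination, slice);
    }
}
```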

Thank you for such a detailed reply, I will continue to wait for the next release to test.


As an update: the latest version, 2.2.4, should address this issue. You should be able to transfer a Texture2DArray via GPU blit either as a RenderTexture or a Texture2DArray.

Thanks for the reminder. After testing with version 2.2.4, it works well on device with a standalone build, but it does not work in the simulator with Play to Device (RenderTexture{dimension = TextureDimension.Tex2DArray}).

I can’t test Play to Device on Vision Pro, because TestFlight only has version 2.1.2, not 2.2.4.

I have updated the sample project to version 2.2.4.

Good to know; thanks! I’ll take a look at your sample project.

Ah, I see what you mean. We’ll look into that!

Hi GameFinder. There was a snafu with the Apple review for the latest TestFlight update to 2.2.4 to run on device. I’ve resolved Apple’s request for the TestFlight submission, and now we just need to wait for them to approve it. Then you should be able to test on device again.

OK, I see what you mean. Your example works in a standalone build for device or simulator, but it doesn’t work with Play to Device (either for simulator or on device). That’s basically because, for Play to Device, we extract the texture data for transfer, and that code still assumes that the RenderTexture is 2D. I will make a note to fix that in a future version.

Thanks for your help!


Hi, thanks! I can test on device with 2.2.4 now.
