RenderTexture - Mip levels not sent to RealityKit?

In our project, it looks like mip levels are not sent to RealityKit when the RenderTexture is marked dirty; only mip level 0 is transferred. Is that the expected behaviour, or am I doing something wrong?

We generate a 128x128 RenderTexture with 4 mip levels, updated manually every frame: level 0 is rendered with a camera, and levels 1-3 are each generated with Graphics.Blit by reading the mip level above and blurring it.
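The per-frame downsampling loop looks roughly like this. This is a sketch rather than our exact code: blurMaterial and its _SourceMip property are placeholders for the actual blur shader (which samples the source mip via tex2Dlod), and the read/write of different mips of the same texture in one Blit is assumed to be legal on the target API:

            using UnityEngine;
            using UnityEngine.Rendering;

            public class MipBlurChain : MonoBehaviour
            {
                public RenderTexture renderTex; // the 128x128, 4-mip RenderTexture
                public Material blurMaterial;   // hypothetical blur shader reading _SourceMip via tex2Dlod

                void LateUpdate()
                {
                    var cmd = new CommandBuffer { name = "MipBlurChain" };
                    for (int mip = 1; mip < renderTex.mipmapCount; mip++)
                    {
                        // Each pass reads the mip above and writes the blurred result one level down.
                        blurMaterial.SetFloat("_SourceMip", mip - 1);
                        cmd.Blit(renderTex, new RenderTargetIdentifier(renderTex, mipLevel: mip), blurMaterial);
                    }
                    Graphics.ExecuteCommandBuffer(cmd);
                    cmd.Release();

                    // Tell PolySpatial the texture contents changed so they get re-transferred.
                    Unity.PolySpatial.PolySpatialObjectUtils.MarkDirty(renderTex);
                }
            }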

I’m assuming you have “Enable Mip Maps”/useMipMap set to true and autoGenerateMips set to false. It looks like we currently initialize the DrawableQueue with .allocateAndGenerateAll for mip maps if useMipMap is true, when we should be using .allocateAll if autoGenerateMips is false (or just let the Unity side generate the mip maps). I’ll make that change.


Yes, this is the setup for creating the RenderTexture:

            int resolution = 128;
            int mipCount = 4;

            RenderTextureDescriptor desc = new RenderTextureDescriptor(resolution, resolution, RenderTextureFormat.ARGB32, 24, mipCount);
            desc.sRGB = true;
            desc.mipCount = mipCount;
            desc.useMipMap = mipCount > 1;
            desc.autoGenerateMips = false;

            renderTex = new RenderTexture(desc);

All mip levels > 0 show up as black in RealityKit, regardless of what I Blit into them. They look correct in the editor, but it’s hard to say whether they are even correct on the Metal side on visionOS.

Using autoGenerateMips = true works as a temporary fix. It’s slightly lower quality than the custom blur, but at least closer to OK.


This actually seems to work for me at present, though, even with .allocateAndGenerateAll (which doesn’t seem to have an effect: if I don’t set the mip levels in Unity explicitly or via autoGenerateMips, they end up black). Maybe it has something to do with how you’re rendering to the mip levels? Here’s the code I used to test:

using UnityEngine;

public class RenderTextureTest : MonoBehaviour
{
    // Clears each mip level to a distinct solid color so it's obvious which levels transfer.
    void Start()
    {
        int resolution = 128;
        int mipCount = 4;

        RenderTextureDescriptor desc = new(resolution, resolution, RenderTextureFormat.ARGB32, 24, mipCount);
        desc.sRGB = true;
        desc.mipCount = mipCount;
        desc.useMipMap = mipCount > 1;
        desc.autoGenerateMips = false;

        var renderTex = new RenderTexture(desc);

        var previousRenderTarget = RenderTexture.active;

        Graphics.SetRenderTarget(renderTex, 0);
        GL.Clear(true, true, Color.red);

        Graphics.SetRenderTarget(renderTex, 1);
        GL.Clear(true, true, Color.green);

        Graphics.SetRenderTarget(renderTex, 2);
        GL.Clear(true, true, Color.blue);

        Graphics.SetRenderTarget(renderTex, 3);
        GL.Clear(true, true, Color.yellow);

        RenderTexture.active = previousRenderTarget;

        GetComponent<MeshRenderer>().material.mainTexture = renderTex;
    }
}

Ok, I’ll check further tomorrow. Does it work correctly if you update the texture every frame with random colors?

Provided I call Unity.PolySpatial.PolySpatialObjectUtils.MarkDirty(renderTexture) every frame, yes.
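For reference, the per-frame random-color test was just a sketch along these lines, with renderTex being the RenderTexture created in Start in the snippet above:

        void Update()
        {
            // Overwrite mip 0 with a random color each frame.
            var previous = RenderTexture.active;
            Graphics.SetRenderTarget(renderTex, 0);
            GL.Clear(true, true, Random.ColorHSV());
            RenderTexture.active = previous;

            // Without this call, RealityKit keeps showing the stale contents.
            Unity.PolySpatial.PolySpatialObjectUtils.MarkDirty(renderTex);
        }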


My mistake: ZTest Always was not set on the shader used for blitting, so nothing was written to the mip levels on device. Sorry for the trouble!
