ExternalTexture not working in Mixed Reality mode

First, let me describe how I'm using it:
I use AVPlayer to generate a CVPixelBuffer every frame, create a native Metal texture from it with the CVMetalTextureCacheCreateTextureFromImage API, then pass that to Unity's Texture2D.CreateExternalTexture and use the result to update the material. But I hit the error below:

[Diagnostics] EXCEPTION ArgumentException in PolySpatialCore:
ArgumentException: Texture2D.GetPixels32: not allowed on native textures. (Texture 'ExternalTexture-Metal-4385469328-4320-2160')
at UnityEngine.Texture2D.GetPixels32 (System.Int32 miplevel) [0x00000] in <00000000000000000000000000000000>:0
at UnityEngine.Texture2D.GetPixels32 () [0x00000] in <00000000000000000000000000000000>:0
at Unity.PolySpatial.Internals.ConversionHelpers.ToPolySpatialFallbackTextureData (UnityEngine.Texture2D tex2d, System.Action`2[T1,T2] postConversionCallback) [0x00000] in <00000000000000000000000000000000>:0
at Unity.PolySpatial.Internals.LocalAssetManager.SendTextureAssetChanged (Unity.PolySpatial.Internals.PolySpatialAssetID assetID, UnityEngine.Object unityTexture, System.Boolean allowNativeTextures) [0x00000] in <00000000000000000000000000000000>:0
at Unity.PolySpatial.Internals.LocalAssetManager.ProcessChangedAsset (Unity.PolySpatial.Internals.AssetRepresentation representation) [0x00000] in <00000000000000000000000000000000>:0
at Unity.PolySpatial.Internals.LocalAssetManager.ProcessChanges () [0x00000] in <00000000000000000000000000000000>:0
at Unity.PolySpatial.Internals.PolySpatialUnitySimulation.Update () [0x00000] in <00000000000000000000000000000000>:0
at Unity.PolySpatial.Internals.PolySpatialCore.PolySpatialAfterLateUpdate () [0x00000] in <00000000000000000000000000000000>:0
at UnityEngine.LowLevel.PlayerLoopSystem+UpdateFunction.Invoke () [0x00000] in <00000000000000000000000000000000>:0

The Objective-C code looks like this:

- (id)getTextureForCurrentFrame {
    CMTime itemTime = [_videoOutput itemTimeForHostTime:CACurrentMediaTime()];
    if ([_videoOutput hasNewPixelBufferForItemTime:itemTime]) {
        CVPixelBufferRef pixelBuffer = [_videoOutput copyPixelBufferForItemTime:itemTime itemTimeForDisplay:nil];
        if (pixelBuffer) {
            _width = (int)CVPixelBufferGetWidth(pixelBuffer);
            _height = (int)CVPixelBufferGetHeight(pixelBuffer);

            // OSType pixelFormatType = CVPixelBufferGetPixelFormatType(pixelBuffer);

            CVMetalTextureRef metalTextureRef = NULL;
            CVReturn status = CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, _textureCache, pixelBuffer, NULL, MTLPixelFormatRGBA8Unorm, _width, _height, 0, &metalTextureRef);
            CVPixelBufferRelease(pixelBuffer);

            if (status == kCVReturnSuccess) {
                id<MTLTexture> metalTexture = CVMetalTextureGetTexture(metalTextureRef);
                // Note: ideally the CVMetalTextureRef should be kept alive until the
                // GPU has finished reading the texture; releasing it immediately can
                // let the underlying memory be reused mid-frame.
                CFRelease(metalTextureRef);
                externalTexUpdateCallback(_width, _height, (__bridge void *)(metalTexture));
                return metalTexture;
            }
        }
    }
    return nil;
}

The Unity code looks like this:

_instance._externalTexture = Texture2D.CreateExternalTexture(width, height, TextureFormat.RGBA32, false, false, metalTexture);
_instance._externalTexture.name = $"ExternalTexture-Metal-{metalTexture}-{width}-{height}";
// Graphics.CopyTexture(_instance._externalTexture, _instance._unityTexture);

Debug.Log($"Unity: CreateExternalTexture and Set tex: {_instance._externalTexture}");
if (_instance._matPlayer)
{
    Debug.Log("Unity: material set texture");
    _instance._matPlayer.SetTexture("_mainTexture", _instance._externalTexture);
}

Is external texture supported?

It’s not supported directly, the way you’re trying to use it. We are limited by the restrictions of RealityKit’s TextureResource API, which doesn’t allow using MTLTexture objects directly. For RenderTextures, however, we use the DrawableQueue API to obtain an MTLTexture from RealityKit, to which we copy the contents of the RenderTexture (using a GPU blit). What you may be able to do is to copy the contents of the texture created with CreateExternalTexture to a RenderTexture (using a GPU operation like Graphics.CopyTexture or Graphics.Blit), then use that RenderTexture in a material. You will have to explicitly mark the RenderTexture as dirty every time you update it using Unity.PolySpatial.PolySpatialObjectUtils.MarkDirty(renderTexture).


OK, thanks, I tried it, and RenderTexture works fine. As I understand it, can I also use new Texture2D()? Does it make a difference in performance?

If you mean passing the IntPtr nativeTex to the Texture2D constructor, then no, there shouldn’t be a difference.

Sorry, I mean CreateExternalTexture from the MTLTexture pointer from Objective-C, then CopyTexture to a Texture2D. Is that different from CopyTexture to a RenderTexture? Thanks.

I’d be surprised if that worked at all, since we need the data in Texture2Ds to be CPU-readable (versus RenderTextures, where we do a GPU blit). Even if it does work, a RenderTexture will definitely perform better.


Got it, thank you very much!

Hello, I'm thinking about this problem again. Can I do it this way? Since I already have the MTLTexture, I build my scene with Reality Composer Pro, use a ShaderGraphMaterial, and use the DrawableQueue API to update the ShaderGraphMaterial the same way Unity does. If this is OK, could you please provide more details about how to do this with RealityKit? I notice that ShaderGraphMaterial just has a setParameter API, and I can't find examples of this.

You mean how to do it outside of Unity? You first have to create a TextureResource of any kind (the easiest way is probably to use TextureResource.generate with a one-pixel image), then replace it with a DrawableQueue using TextureResource.replace. You can set that texture in a ShaderGraphMaterial using setParameter with a textureResource parameter. To update the texture, you use DrawableQueue.nextDrawable to get a Drawable you can write to, which has a texture property with the actual texture you need to modify (e.g., copy to). To do that modification, you might for instance (as we do) create an MTLCommandBuffer and, from that, create an MTLBlitCommandEncoder. Add a copy command to that to copy the texture contents from source to destination, then endEncoding, commit, waitUntilCompleted, and present.
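Here's a rough Swift sketch of those setup steps (the function name and return shape are mine, not RealityKit API; it assumes a BGRA format and that you'll bind the resource to your material afterward):

```swift
import CoreGraphics
import RealityKit

// Sketch: create a 1x1 placeholder TextureResource, then back it with a
// DrawableQueue so its contents can be updated per frame.
func makeDrawableBackedTexture(width: Int, height: Int) throws
        -> (TextureResource, TextureResource.DrawableQueue) {
    // A one-pixel CGImage to seed TextureResource.generate.
    let context = CGContext(data: nil, width: 1, height: 1,
                            bitsPerComponent: 8, bytesPerRow: 4,
                            space: CGColorSpaceCreateDeviceRGB(),
                            bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)!
    let resource = try TextureResource.generate(from: context.makeImage()!,
                                                options: .init(semantic: .color))

    // Replace the placeholder's contents with a drawable queue.
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm, width: width, height: height,
        usage: [.shaderRead, .renderTarget], mipmapsMode: .none)
    let queue = try TextureResource.DrawableQueue(descriptor)
    resource.replace(withDrawables: queue)
    return (resource, queue)
}
```

You would then bind the resource with something like `try material.setParameter(name: "VideoTexture", value: .textureResource(resource))` (the parameter name depends on your shader graph), and per frame call `queue.nextDrawable()`, blit into `drawable.texture`, and `present()`.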


Thank you very much, I will try it!


Hello, I'm back again. I followed your method, but when I call DrawableQueue.nextDrawable, it throws an error:

RealityKit.TextureResource.DrawableQueue.(unknown context at $1c5e01124).NextDrawableError.timeoutReached

The code looks like this:

public lazy var mtlDevice: MTLDevice = {
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError()
    }
    return device
}()

var drawableQueue: TextureResource.DrawableQueue?

private lazy var commandQueue: MTLCommandQueue? = {
    return mtlDevice.makeCommandQueue()
}()

func createDrawableQueue(width: Int, height: Int) -> TextureResource.DrawableQueue {
    if let drawableQueue {
        return drawableQueue
    }
    let descriptor = TextureResource.DrawableQueue.Descriptor(
        pixelFormat: .bgra8Unorm,
        width: width,
        height: height,
        usage: [.shaderRead, .shaderWrite, .renderTarget],
        mipmapsMode: .none
    )

    do {
        let queue = try TextureResource.DrawableQueue(descriptor)
        queue.allowsNextDrawableTimeout = false

        // Replace the TextureResource's contents with the drawable queue.
        textureResource.replace(withDrawables: queue)

        try? self.shaderGraphMaterial.setParameter(name: textureName, value: MaterialParameters.Value.textureResource(textureResource))

        drawableQueue = queue
        return queue
    } catch {
        fatalError("Could not create DrawableQueue: \(error)")
    }
}


func update(with texture: MTLTexture) {
    guard let drawableQueue = drawableQueue,
          let commandBuffer = commandQueue?.makeCommandBuffer() else {
        return
    }

    do {
        let drawable = try drawableQueue.nextDrawable()
        guard let blitCommandEncoder = commandBuffer.makeBlitCommandEncoder() else {
            fatalError("Could not create a blit command encoder")
        }

        blitCommandEncoder.copy(from: texture, to: drawable.texture)
        blitCommandEncoder.endEncoding()
        commandBuffer.commit()
        commandBuffer.waitUntilCompleted()
        drawable.present()
    } catch {
        print("Error while getting the drawable: \(error.localizedDescription)")
    }
}
What is the problem? Thank you, I hope for your reply.

You may simply be calling DrawableQueue.nextDrawable more often than it can handle. It only has a limited number of buffers, so if you call it too frequently, it will run out and either block until one is available or throw this timeoutReached exception. We currently handle this by just catching the exception and, if it was thrown, trying again on the next update.
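In code, that amounts to treating a throw from nextDrawable as "no drawable this frame." A minimal generic sketch of the pattern (the helper name is mine, not RealityKit API):

```swift
// Generic "skip this frame" wrapper: returns nil if the producer throws
// (e.g. DrawableQueue.nextDrawable hitting timeoutReached), so the caller
// can simply try again on the next update tick.
func tryNextDrawable<T>(_ produce: () throws -> T) -> T? {
    do {
        return try produce()
    } catch {
        return nil // queue exhausted; drop this frame
    }
}
```

In the update(with:) method above, the do/catch then becomes `guard let drawable = tryNextDrawable({ try drawableQueue.nextDrawable() }) else { return }`, and the skipped frame is simply retried on the next tick.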

Sorry to revive this thread, but I'm wondering if you might be open to sharing a larger example of how you're implementing the DrawableQueue? I'm attempting something similar (in which CVPixelBuffer frames fed from an in-progress video are applied to a TextureResource for rendering on a material), and your code has been valuable in understanding how to create the DrawableQueue. However, I'm struggling to figure out how to wire the DrawableQueue into a TextureResource in the RealityKit/SwiftUI world. More specifically, I'm curious how the code you've shown here is actually hooked up to RealityKit. Thanks!

I’m not the person you’re replying to, but the call to TextureResource.replace(withDrawables:) is how you make a TextureResource use a DrawableQueue. The initial TextureResource can be anything, AFAIK; we just use a 1x1 pixel image created with TextureResource.generate.

Thanks, @kapolka! I was indeed trying to respond to @bYsdTd, who shared a great example of this code, but I just haven’t figured out how to wire it up in full yet. Your detail here is super helpful, but I think my struggle is that I’m not clear how to call func update() and actually get the UI to update on every frame. Conceptually, I think I get how to feed the TextureResource.DrawableQueue into the TextureResource’s .replace method, but I don’t get what causes anything to be called on every frame to update.

You may be looking for CADisplayLink, then. That’s how Unity (and other applications) receive a callback per-frame that drives the update loop.
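If it helps, a minimal sketch of such a per-frame driver (the class and callback names are mine, not any engine API; assumes QuartzCore is available, as on iOS/visionOS):

```swift
import QuartzCore

// Hypothetical per-frame driver: CADisplayLink invokes `step` at the display
// refresh rate; each tick is a chance to pull the newest video frame and
// copy it into the DrawableQueue's current drawable.
final class FrameDriver {
    private var displayLink: CADisplayLink?
    var onFrame: (() -> Void)?

    func start() {
        let link = CADisplayLink(target: self, selector: #selector(step))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    @objc private func step(_ link: CADisplayLink) {
        onFrame?() // e.g. copyPixelBufferForItemTime + blit into the drawable
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}
```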

Thanks, @kapolka! For my setup, that's effectively what I have (an AVAsset loaded up → AVPlayer → CADisplayLink → AVPlayerItemVideoOutput, which gives me the CVPixelBuffer of each frame). Presumably, I'll then need to convert the CVPixelBuffer to an MTLTexture, then feed that into the update(with:) method demonstrated here, which should wire this up. I'll give it a try and report back. Thanks!
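For the CVPixelBuffer-to-MTLTexture step, a Swift sketch along the lines of the Objective-C snippet earlier in the thread (assumes a BGRA pixel buffer and an existing cache created with CVMetalTextureCacheCreate):

```swift
import CoreVideo
import Metal

// Sketch: wrap a CVPixelBuffer in an MTLTexture via a CVMetalTextureCache.
// In production you should keep the CVMetalTexture alive until the GPU has
// finished reading it; returning only the MTLTexture, as here, is a
// simplification.
func makeTexture(from pixelBuffer: CVPixelBuffer,
                 cache: CVMetalTextureCache) -> MTLTexture? {
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    var cvTexture: CVMetalTexture?
    let status = CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil,
        .bgra8Unorm, width, height, 0, &cvTexture)
    guard status == kCVReturnSuccess, let cvTexture else { return nil }
    return CVMetalTextureGetTexture(cvTexture)
}
```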


Following up that I was able to follow your guidance and wire all of this up. Regretfully, I’m now hitting the same error that @bYsdTd hit, related to Error while getting the drawable: Timeout reached while waiting for next drawable (which seems to occur at every refresh, and never seems to render anything). More to investigate for me, but if you have any thoughts further, I’d be intrigued! Thanks again!

Right; we get this as well, but we just catch it and try again on the next frame. Eventually the queue clears up and it provides another drawable.

I pushed my demo code to GitHub; you can reference it, and it may help.
Unity Demo: GitHub - bYsdTd/UnityVisionOSPlayer
Outside Unity with RealityKit and RealityComposerPro: GitHub - bYsdTd/VRPlayerRealityKit
