How does Texture2D.CreateExternalTexture work with Direct3D11?

This is on Unity 2017.1.1 and 2017.1.2.

Texture2D.CreateExternalTexture works well in OpenGL mode, but it crashes when I use Direct3D11 mode. Here is the crash stack:

Unity.exe!TexturesD3D11Base::TextureFromShaderResourceView(struct ID3D11ShaderResourceView *,struct ID3D11Texture2D * *) Unknown
Unity.exe!TexturesD3D11Base::RegisterNativeTexture(struct ID3D11ShaderResourceView *) Unknown
Unity.exe!GfxDeviceD3D11Base::RegisterNativeTexture(struct TextureID,__int64,enum TextureDimension) Unknown
Unity.exe!GfxDeviceWorker::RunCommand(class ThreadedStreamBuffer &) Unknown
Unity.exe!GfxDeviceWorker::Run(void) Unknown
Unity.exe!GfxDeviceWorker::RunGfxDeviceWorker(void *) Unknown
Unity.exe!Thread::RunThreadWrapper(void *) Unknown
kernel32.dll!00000000777e59cd() Unknown
ntdll.dll!0000000077a1a561() Unknown

I would like to know how to make it work in Direct3D11 mode, and whether there are any examples I could follow.

The code below is how I try to make CreateExternalTexture work with Direct3D11.

In my native plugin, I create an ID3D11Texture2D like this:

D3D11_TEXTURE2D_DESC textureDesc;
ZeroMemory(&textureDesc, sizeof(textureDesc));

textureDesc.Width = _width;
textureDesc.Height = _height;
textureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
textureDesc.MipLevels = textureDesc.ArraySize = 1;
textureDesc.Format = DXGI_FORMAT_R8G8B8A8_TYPELESS;
textureDesc.SampleDesc.Count = 1;
textureDesc.Usage = D3D11_USAGE_DEFAULT;
textureDesc.CPUAccessFlags = 0;
textureDesc.MiscFlags = 0;

auto hr = _d3d11Device->CreateTexture2D(&textureDesc, NULL, &_outputTexture);
assert(hr == S_OK);

return _outputTexture;

_outputTexture is cast to void* and passed to Unity as an IntPtr.
In debug mode, the IntPtr I receive in Unity looks valid (neither null nor garbage).

Then, in Unity, I take the IntPtr and create the texture:

// I get my _outputTexture
nativeTex = NativeD3D11PluginWrapper.GetOutputTexture((uint)Screen.width, (uint)Screen.height);
texture = Texture2D.CreateExternalTexture(Screen.width, Screen.height, format, false, false, nativeTex); // <-- crashes here (see the crash stack above)

If I do the same thing in OpenGL, it works without crashing. Of course, in OpenGL mode the IntPtr I pass is a texture “name” (a GLuint), not a pointer…

Thanks for any answers,


The documentation says: “Native texture object on Direct3D-like devices is a pointer to the base type, from which a texture can be created (IDirect3DBaseTexture9 on D3D9, ID3D11ShaderResourceView on D3D11). On OpenGL/OpenGL ES it is GLuint. On Metal it is id&lt;MTLTexture&gt;.”

So, literally, an ID3D11ShaderResourceView* needs to be passed, not the texture itself.

That explains the crash: the function at the top of the stack, TexturesD3D11Base::TextureFromShaderResourceView, was interpreting my ID3D11Texture2D* as a shader resource view.
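Based on that reading, a minimal sketch of the fix on the native side (untested here; the function name GetOutputTextureSRV and the way the device and texture are passed in are my own, so adapt them to the plugin’s actual members) is to create a typed SRV over the texture and return that pointer instead:

```cpp
#include <d3d11.h>
#include <cassert>

// Kept alive alongside _outputTexture; released with the rest of the plugin state.
static ID3D11ShaderResourceView* _outputSRV = nullptr;

void* GetOutputTextureSRV(ID3D11Device* d3d11Device, ID3D11Texture2D* outputTexture)
{
    // The texture was created as DXGI_FORMAT_R8G8B8A8_TYPELESS, so the view
    // must pick a concrete format; UNORM matches Unity's RGBA32.
    D3D11_SHADER_RESOURCE_VIEW_DESC srvDesc = {};
    srvDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    srvDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    srvDesc.Texture2D.MostDetailedMip = 0;
    srvDesc.Texture2D.MipLevels = 1;

    auto hr = d3d11Device->CreateShaderResourceView(outputTexture, &srvDesc, &_outputSRV);
    assert(hr == S_OK);

    // This pointer -- not the ID3D11Texture2D* -- is what
    // Texture2D.CreateExternalTexture expects on D3D11 in these Unity versions.
    return _outputSRV;
}
```

On the C# side, the IntPtr is passed to Texture2D.CreateExternalTexture exactly as before; only what the native function returns changes. Note that later Unity versions changed the expected D3D11 native texture type, so check the Texture2D.CreateExternalTexture documentation for the version you are on.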