I create a texture in D3D11 on the rendering thread with the following code. (I've verified that creation succeeds: the function returns S_OK and the resulting texture pointer is non-null.)
D3D11_TEXTURE2D_DESC TextureDesc = {};
TextureDesc.Width = Target->Width;
TextureDesc.Height = Target->Height;
TextureDesc.MipLevels = 1;
TextureDesc.ArraySize = 1;
TextureDesc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
TextureDesc.SampleDesc.Count = 1;
TextureDesc.SampleDesc.Quality = 0;
TextureDesc.Usage = D3D11_USAGE_DEFAULT;
TextureDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
TextureDesc.CPUAccessFlags = 0;
TextureDesc.MiscFlags = D3D11_RESOURCE_MISC_GDI_COMPATIBLE;
ID3D11Device* Device = UnityRenderDevice();
if (!Device)
{
    Log("Can't create texture: Device not available.");
    return;
}
if (FAILED(Device->CreateTexture2D(&TextureDesc, nullptr, &Target->Texture)))
{
    Log("Can't create texture: CreateTexture2D error.");
    return;
}
if (FAILED(Target->Texture->QueryInterface<IDXGISurface1>(&Target->Surface)))
{
    Log("Can't create texture: QueryInterface<IDXGISurface1> error.");
    Target->Texture->Release();
    Target->Texture = nullptr;
    return;
}
Once I have synchronized with the main thread in C++ and passed the texture pointer back into Unity as an IntPtr (verified it is not IntPtr.Zero), I try to do this:
CurrTexture = Texture2D.CreateExternalTexture(Width, Height, TextureFormat.BGRA32, false, false, TextureObject);
This crashes hard: not to the Unity crash screen, but straight to the OS with no error indication at all, so I have no way to get any kind of error code out of it. I have verified with debug statements that every value involved is non-null. If I create a normal dummy Unity texture instead of calling CreateExternalTexture, the rest of my code runs without crashing (but obviously doesn't use the texture I actually need). I have also verified that the supplied width/height are correct: hard-coding 128x128 instead of my values still crashes.
How can I debug this? What could cause CreateExternalTexture to crash without Unity even being able to catch the exception?