Hi there!
I am wondering if anyone knows how to debug an invalid access Unity crash resulting from CreateExternalTexture. I created a D3D11 Texture2D in a native plugin and passed that pointer to Unity, but it results in a crash. The desc for creating the texture on the native side is:
D3D11_TEXTURE2D_DESC desc;
desc.Width = 100;
desc.Height = 100;
desc.MipLevels = 1;                       // single mip level
desc.ArraySize = 1;
desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
desc.SampleDesc.Count = 1;                // no multisampling
desc.SampleDesc.Quality = 0;
desc.Usage = D3D11_USAGE_DEFAULT;
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
desc.CPUAccessFlags = 0;
desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED; // shareable across D3D devices
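(For completeness, the texture itself is created with the usual CreateTexture2D call; a minimal sketch, where device stands for whatever ID3D11Device my plugin uses and error handling is omitted:)

ID3D11Texture2D* tex = nullptr;
HRESULT hr = device->CreateTexture2D(&desc, nullptr, &tex);
// On success, 'tex' is the raw pointer handed to C# as a System.IntPtr.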
And the call on the Unity script side is:
Texture2D tex = Texture2D.CreateExternalTexture(
    100, 100, TextureFormat.RGBA32,
    true,    // mipChain
    false,   // linear
    ptr);    // native texture pointer from the plugin
I am not sure how to debug this. From reading the docs, it looks like it should work…
Thanks in advance for your help.
Sorry, it is an access violation, not invalid access. I am using Unity 5.3.3f1.
Sounds like a segmentation fault. Can we see the stack trace?
Hah, I forgot there is a .dmp file. Yes, the stack trace suggests that it failed in
CContext::FineGrainedSRVOutputHazardCheck
but I can't step into the code to see what is going on.
I was suspecting that some of the parameters I used to create the texture were wrong and caused the crash. So I created a 2D texture in a C# script and passed the raw pointer to my plugin. Here is what I get:
- Unity texture created with: new Texture2D(100, 100, TextureFormat.RGBA32, true, false)
- desc_unity {Width=100 Height=100 MipLevels=7 …} D3D11_TEXTURE2D_DESC
Width 100 unsigned int
Height 100 unsigned int
MipLevels 7 unsigned int
ArraySize 1 unsigned int
Format DXGI_FORMAT_R8G8B8A8_UNORM (28) DXGI_FORMAT
- SampleDesc {Count=1 Quality=0 } DXGI_SAMPLE_DESC
Usage D3D11_USAGE_DEFAULT (0) D3D11_USAGE
BindFlags 8 unsigned int
CPUAccessFlags 0 unsigned int
MiscFlags 0 unsigned int
- Unity texture created with: new Texture2D(100, 100, TextureFormat.RGBA32, false, false)
- desc_unity {Width=100 Height=100 MipLevels=1 …} D3D11_TEXTURE2D_DESC
Width 100 unsigned int
Height 100 unsigned int
MipLevels 1 unsigned int
ArraySize 1 unsigned int
Format DXGI_FORMAT_R8G8B8A8_UNORM (28) DXGI_FORMAT
- SampleDesc {Count=1 Quality=0 } DXGI_SAMPLE_DESC
Usage D3D11_USAGE_DEFAULT (0) D3D11_USAGE
BindFlags 8 unsigned int
CPUAccessFlags 0 unsigned int
MiscFlags 0 unsigned int
I tried to create my texture following the same texture desc, but I still get the same Unity crash. Has anyone seen this before? Is it a bug, or am I doing something wrong?
FWIW, it looks like I need to use the same D3D device and context in my plugin to interact with Unity. I was calling D3D11CreateDevice in my plugin, and a texture created from that separate device and context doesn't work in Unity.
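For anyone hitting the same thing, this is roughly how a plugin can grab Unity's own device instead of creating one. A minimal sketch based on the Unity native plugin interface headers (IUnityGraphicsD3D11 is available in 5.2+); Unity calls UnityPluginLoad automatically when the plugin loads:

#include "IUnityInterface.h"
#include "IUnityGraphicsD3D11.h"
#include <d3d11.h>

static ID3D11Device* s_UnityDevice = nullptr;

// Called by Unity when the plugin is loaded.
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
    // The ID3D11Device Unity itself renders with; textures created on this
    // device can be passed to CreateExternalTexture.
    s_UnityDevice = unityInterfaces->Get<IUnityGraphicsD3D11>()->GetDevice();
}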
The problem might be:
desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED;
See if this fixes the crash, although I don’t know how it will affect your plugin:
desc.MiscFlags = 0;
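If the plugin needs the texture to stay shareable (which is usually why D3D11_RESOURCE_MISC_SHARED gets set in the first place), another option might be to keep the flag, export the shared handle, and open the resource on Unity's device, so that the pointer given to CreateExternalTexture belongs to the device Unity renders with. A rough sketch, assuming tex was created with MISC_SHARED on the plugin's own device and unityDevice is Unity's ID3D11Device:

IDXGIResource* dxgiRes = nullptr;
tex->QueryInterface(__uuidof(IDXGIResource), (void**)&dxgiRes);

HANDLE sharedHandle = nullptr;
dxgiRes->GetSharedHandle(&sharedHandle); // do not CloseHandle this handle
dxgiRes->Release();

// Open the same texture on Unity's device; pass this pointer to
// Texture2D.CreateExternalTexture instead of the original one.
ID3D11Texture2D* texForUnity = nullptr;
unityDevice->OpenSharedResource(sharedHandle,
                                __uuidof(ID3D11Texture2D),
                                (void**)&texForUnity);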