[Solved] Texture pointer with DirectX

Hello !

I’m here to ask for some help because I’m a bit lost at the moment :slight_smile:

Here is my situation:
I have developed a native plugin in C/C++ that decodes video using FFmpeg and SDL.
For each frame, I have an AVFrame (the struct that holds the data of one decoded frame of the video). Using OpenGL, I bind the data of that frame to the texture whose pointer I obtain from Unity with the function GetNativeTexturePtr.
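
For context, the texture pointer reaches the plugin through an exported setter. Below is only a rough sketch in the spirit of Unity’s low-level rendering plugin example; the names SetTextureFromUnity and g_TexturePointer are placeholders (my real setter is called from C# through a setTexturePtr-style function):

    //Sketch: how the texture pointer from Unity reaches the plugin.
    static void* g_TexturePointer = NULL;

    extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API SetTextureFromUnity(void* texturePtr)
    {
        //Unity passes Texture.GetNativeTexturePtr() here: a GLuint (cast to a
        //pointer) under OpenGL, an ID3D11Texture2D* under Direct3D 11.
        g_TexturePointer = texturePtr;
    }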

That is my current setup, and it is working really well!

In this case I have to select the OpenGL graphics API in the Player Settings in Unity.

I need to change my code in order to support DirectX 11, so that I will be able to integrate the Oculus SDK and won’t have obsolete OpenGL code in my native plugin.

To be clear: my function that binds the texture pointer from Unity to the data of each frame I get from FFmpeg needs to be changed so that it also works when Unity uses a DirectX graphics API.

How do I set the contents of a texture, given its native pointer, using DirectX code?

Could you please guide me through this modification?

I’m using Unity 5.2.4 (32-bit) and Visual Studio.

Thank you for your help !

Here is the code I use for the rendering function
(in my C/C++ code, using OpenGL functions):

    void DoRendering() {

            // Opengl case
            if (s_DeviceType == kUnityGfxRendererOpenGL) {

                //init
                //g_TexturePointer is a void* received from Unity: the native pointer of the texture
                texture = (GLuint)(size_t)(g_TexturePointer);
                glBindTexture(GL_TEXTURE_2D, texture);

                //parameters
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
                glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);


                if (global_video_state && global_video_state->playerState != STOPPED) {
                    if (savedFrame && savedFrame->data[0] && isPictAvailable) {
                        GLsizei texWidth = videoW;
                        GLsizei texHeight = videoH;

                        if (isFirstUseOfGLText) {
                            //savedFrame->data[0] is a uint8_t* (the packed pixel buffer)
                            glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, texWidth, texHeight, 0, GL_RGB, GL_UNSIGNED_BYTE, savedFrame->data[0]);
                            isFirstUseOfGLText = false;
                        }
                        else {
                            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texWidth, texHeight, GL_RGB, GL_UNSIGNED_BYTE, savedFrame->data[0]);
                        }

                        isPictAvailable = false;
                    }
                }
                else {
                //display black image of 2x2 pixels
                    float pixels[] = {
                        0.0f,0.0f,0.0f, 0.0f,0.0f,0.0f,
                        0.0f,0.0f,0.0f, 0.0f,0.0f,0.0f,
                    };
                    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 2, 2, 0, GL_RGB, GL_FLOAT, pixels);
                }
                glFlush();
                glFinish();
            }


            // D3D11 case
            if (s_DeviceType == kUnityGfxRendererD3D11)
            {
                // update native texture from code
                if (g_TexturePointer)
                {
                    ID3D11Texture2D* d3dtex = (ID3D11Texture2D*)g_TexturePointer;
                    D3D11_TEXTURE2D_DESC desc;
                    d3dtex->GetDesc(&desc);

                
                    //here i'm lost
                }
            }
    }

As you can see at the “//here i'm lost” comment in the D3D11 case… that’s where I need help :slight_smile: !

Thank you !

Hello again,

I also have the feeling that this function is never called, so it’s impossible for me to know which graphics context Unity is using at the time my plugin runs:

In my C/C++ code:

void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces)
//exact code from the example...
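
For reference, the body of that function in the example looks roughly like this (a sketch of Unity’s low-level rendering plugin example; s_UnityInterfaces, s_Graphics and OnGraphicsDeviceEvent are the names that example uses):

    static IUnityInterfaces* s_UnityInterfaces = NULL;
    static IUnityGraphics* s_Graphics = NULL;

    static void UNITY_INTERFACE_API OnGraphicsDeviceEvent(UnityGfxDeviceEventType eventType);

    extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces)
    {
        s_UnityInterfaces = unityInterfaces;
        s_Graphics = s_UnityInterfaces->Get<IUnityGraphics>();
        s_Graphics->RegisterDeviceEventCallback(OnGraphicsDeviceEvent);

        //the initialize event has already happened by the time the plugin loads,
        //so the example runs the callback manually once
        OnGraphicsDeviceEvent(kUnityGfxDeviceEventInitialize);
    }

    static void UNITY_INTERFACE_API OnGraphicsDeviceEvent(UnityGfxDeviceEventType eventType)
    {
        if (eventType == kUnityGfxDeviceEventInitialize)
            s_DeviceType = s_Graphics->GetRenderer();   //e.g. kUnityGfxRendererD3D11
        else if (eventType == kUnityGfxDeviceEventShutdown)
            s_DeviceType = kUnityGfxRendererNull;
    }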

This code comes from the example project available here:

In my Unity C# code, an example of a function I use:

#if UNITY_5 && !UNITY_5_0 && !UNITY_5_1
    [DllImport("FFMPEGPlayerWIN")]
    public static extern void SetTimeFromUnity(float value);

...
//example
SetTimeFromUnity(1);

#endif
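
For reference, the native side of such an import is just a plain exported C function, roughly like this (a sketch; SetTimeFromUnity and g_Time follow the names in Unity’s example project):

    static float g_Time = 0.0f;

    //native counterpart of the [DllImport] declaration above; the exported
    //name must match what the C# side imports
    extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API SetTimeFromUnity(float t)
    {
        g_Time = t;
    }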

Which leaves me with only this:

UnityGfxRenderer s_DeviceType = kUnityGfxRendererNull;

from IUnityGraphics.h.
And this variable stays at kUnityGfxRendererNull the whole time, even if I change the graphics API in the Player Settings to DirectX or OpenGL.

Thx for reading

EDIT: I found the solution to this problem.

I used DLL Export Viewer (DLL Export Viewer - view exported functions list in Windows DLL) in order to see every exported function of my library, and I found out that there was an ‘_’ prepended to my function names, caused by the “__stdcall” calling convention added in front of them.
My bad then; I had to remove it and keep only this:

void UNITY_INTERFACE_EXPORT UnityPluginLoad(IUnityInterfaces* unityInterfaces)

So now I am able to switch context between OpenGL and DirectX normally, and my variable s_DeviceType is correct now (its value is 0 for Desktop OpenGL and 2 for Direct3D 11).
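
For anyone hitting the same issue: on 32-bit builds, __stdcall exports get decorated (something like _UnityPluginLoad@4), which presumably is why Unity could not find the export under its plain name. An alternative I did not test would be to keep UNITY_INTERFACE_API and strip the decoration with a module definition (.def) file instead, roughly like this:

    //untested sketch: keep the __stdcall convention from UNITY_INTERFACE_API
    //and export the undecorated names through a .def file added to the project:
    //
    //  ; FFMPEGPlayerWIN.def
    //  EXPORTS
    //      UnityPluginLoad
    //      UnityPluginUnload
    extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API UnityPluginLoad(IUnityInterfaces* unityInterfaces);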

Now I can continue my research on the DirectX 11 texture pointer question (see the first message).

Hello guys !

Now that my UnityPluginLoad works, I have a context,
which allows me to get an “IUnityGraphicsD3D11” and an “ID3D11Device” :slight_smile:

So I have found a solution to my problem.

I will explain my work here; may this post help someone in the future!
And thanks to the community for your help… oh wait?

Let’s begin:

1 - You need a UnityPluginLoad that works properly, so you can get the D3D11 context:

    IUnityGraphicsD3D11* d3d11 = s_UnityInterfaces->Get<IUnityGraphicsD3D11>();
    g_D3D11Device = d3d11->GetDevice();

2 - I use the format “RAW_BGRA32” for my texture in Unity and, of course, “AV_PIX_FMT_BGRA” in my FFmpeg code for the AVFrame.
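
For context, the FFmpeg-side conversion to BGRA looks roughly like this (a sketch; sws_ctx, codecCtx and decodedFrame are placeholder names, my real code differs in the details):

    //Sketch: convert each decoded frame to BGRA so that data[0] can be
    //uploaded directly into the BGRA32 Unity texture.
    struct SwsContext* sws_ctx = sws_getContext(
        videoW, videoH, codecCtx->pix_fmt,   //source size/format (e.g. YUV420P)
        videoW, videoH, AV_PIX_FMT_BGRA,     //destination size/format
        SWS_BILINEAR, NULL, NULL, NULL);

    AVFrame* savedFrame = av_frame_alloc();
    av_image_alloc(savedFrame->data, savedFrame->linesize,
                   videoW, videoH, AV_PIX_FMT_BGRA, 1);

    //after each decode:
    sws_scale(sws_ctx,
              decodedFrame->data, decodedFrame->linesize, 0, videoH,
              savedFrame->data, savedFrame->linesize);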

3 - I now have updated code that takes D3D11 into account.

void DoRendering() {

#if SUPPORT_D3D11
            // D3D11 case
            if (s_DeviceType == kUnityGfxRendererD3D11)
            {
                //get the ID3D11DeviceContext
                ID3D11DeviceContext* ctx = NULL;
                //g_D3D11Device is of type ID3D11Device*, obtained in UnityPluginLoad (step 1)
                g_D3D11Device->GetImmediateContext(&ctx);

                // update native texture from code
                if (g_TexturePointer && savedFrame && savedFrame->data[0])
                {
                    //g_TexturePointer is received from Unity via setTexturePtr (void *);
                    //cast it to an ID3D11Texture2D*
                    ID3D11Texture2D* d3dtex = (ID3D11Texture2D*)g_TexturePointer;

                    //D3D11_TEXTURE2D_DESC desc;
                    //d3dtex->GetDesc(&desc);

                    //use ID3D11DeviceContext::UpdateSubresource to fill the texture
                    //with the BGRA frame data; linesize[0] is the row pitch in bytes
                    ctx->UpdateSubresource(d3dtex, 0, NULL, savedFrame->data[0], savedFrame->linesize[0], 0);
                }

                //GetImmediateContext adds a reference, so release it every call to avoid a leak
                ctx->Release();

            }//end of dx11 case
#endif

}

But with those changes, my code for OpenGL is not working anymore; I would have to change some color format. For the moment I don’t care, because I only need DirectX 11!
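
For anyone who does need the OpenGL path as well: since the frames are now BGRA instead of RGB, the upload calls in the OpenGL branch would need the matching format, along these lines (an untested sketch):

    //untested sketch: upload the BGRA frame in the OpenGL branch;
    //GL_BGRA matches the byte order of AV_PIX_FMT_BGRA
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texWidth, texHeight, 0,
                 GL_BGRA, GL_UNSIGNED_BYTE, savedFrame->data[0]);
    //and for the per-frame updates:
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, texWidth, texHeight,
                    GL_BGRA, GL_UNSIGNED_BYTE, savedFrame->data[0]);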

I’ll test a bit more; if it’s all OK on my side, I will mark the title of this topic as solved.
EDIT: Set as solved!

Thx for reading anyway, too bad no one responded :frowning:


Three and a half years later, this thread saved my buns. Thank you so much.

I am trying to render the output of a Unity camera into a native window using DirectX. To achieve this, in Unity I first copy the camera’s render texture to a new texture in the OnRenderImage event:

            RenderTexture.active = source as RenderTexture;
            _wrappedTexture.ReadPixels (new Rect (0, 0, source.width, source.height), 0, 0);
            _wrappedTexture.Apply ();
            RenderTexture.active = null;

Then I send _wrappedTexture.GetNativeTexturePtr to the plugin to render the frame in the native window. I created a new DirectX device and context for the native window. I am new to DirectX and don’t have much knowledge there.

Starting from the next frame, I use Graphics.CopyTexture to make the operation faster. Please see the code:

            Graphics.CopyTexture(source, _wrappedTexture);

I would like to achieve the best possible performance where I don’t have to copy the render texture. Sending RenderTexture.GetNativeTexturePtr directly to the plugin doesn’t work.