Hey!
I’m attempting to build a C++ DLL that can be called to fill a texture with the main monitor’s contents, using GDI (yes, the program will run on Windows only).
The idea is to pass the native code a pointer to a Unity Texture2D and write the “captured” main monitor view into its underlying pixel memory, so that Texture2D can then be shown on a TV inside the app, displaying my / the user’s main monitor.
Here’s the native code, built into a DLL and dragged into the Assets folder for integration.
#include <Windows.h>
#include <cstdlib>
#include <cstring>

// One 24-bit GDI pixel. The fields must be single bytes (and the struct
// packed) so that sizeof(PIXEL_DATA) == 3. Note that GDI stores 24-bit
// pixels in BGR order, not RGB.
#pragma pack(push, 1)
struct PIXEL_DATA
{
    unsigned char b;
    unsigned char g;
    unsigned char r;
};
#pragma pack(pop)

extern "C" __declspec(dllexport) int DesktopCaptureToTexture(void* textureMem, int width, int height)
{
    if (textureMem == NULL) return 1;

    // Device context covering the whole screen.
    HDC displayDeviceContext = GetDC(NULL);

    BITMAPINFO bitmapCreateInfo = {};
    bitmapCreateInfo.bmiHeader.biSize = sizeof(BITMAPINFOHEADER);
    bitmapCreateInfo.bmiHeader.biWidth = width;
    bitmapCreateInfo.bmiHeader.biHeight = height; // positive height = bottom-up rows
    bitmapCreateInfo.bmiHeader.biPlanes = 1;
    bitmapCreateInfo.bmiHeader.biBitCount = 24;
    bitmapCreateInfo.bmiHeader.biCompression = BI_RGB;

    // GDI pads each 24-bit row to a 4-byte boundary, so the buffer needs
    // stride * height bytes, not just width * height * 3.
    int stride = (width * 3 + 3) & ~3;
    int bufferSize = stride * height;
    PIXEL_DATA* pixelBits = (PIXEL_DATA*)malloc(bufferSize);

    // Blit the screen into a compatible bitmap.
    HDC displayDeviceContextMemory = CreateCompatibleDC(displayDeviceContext);
    HBITMAP ddb = CreateCompatibleBitmap(displayDeviceContext, width, height);
    HGDIOBJ oldBitmap = SelectObject(displayDeviceContextMemory, ddb);
    BitBlt(displayDeviceContextMemory, 0, 0, width, height, displayDeviceContext, 0, 0, SRCCOPY | CAPTUREBLT);

    // GetDIBits requires the bitmap not to be selected into a device context,
    // and it takes the buffer itself (not the address of the pointer).
    SelectObject(displayDeviceContextMemory, oldBitmap);
    GetDIBits(displayDeviceContext, ddb, 0, height, pixelBits, &bitmapCreateInfo, DIB_RGB_COLORS);

    // Now that the data is loaded, we can fill in the texture memory.
    memcpy_s(textureMem, bufferSize, pixelBits, bufferSize);

    // DeleteDC for the memory DC we created; ReleaseDC only for the DC
    // obtained via GetDC.
    DeleteObject(ddb);
    DeleteDC(displayDeviceContextMemory);
    ReleaseDC(NULL, displayDeviceContext);
    free(pixelBits);
    return 0;
}
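(Side note on the buffer layout, in case it matters: GDI rows are padded and stored bottom-up in BGR order, so if whatever ends up consuming the buffer expects tightly packed top-down RGB, I figure a conversion pass roughly like this would be needed. Untested sketch; UnpadAndSwizzle is just an illustrative name.)

// Sketch: copy padded bottom-up BGR rows into a tight top-down RGB buffer.
// `stride` is the padded GDI row size, computed as in the capture function.
void UnpadAndSwizzle(const unsigned char* src, unsigned char* dst,
                     int width, int height, int stride)
{
    for (int y = 0; y < height; ++y)
    {
        // GDI rows are bottom-up when biHeight is positive.
        const unsigned char* row = src + (height - 1 - y) * stride;
        for (int x = 0; x < width; ++x)
        {
            dst[(y * width + x) * 3 + 0] = row[x * 3 + 2]; // R
            dst[(y * width + x) * 3 + 1] = row[x * 3 + 1]; // G
            dst[(y * width + x) * 3 + 2] = row[x * 3 + 0]; // B
        }
    }
}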
The managed code side isn’t worth posting here: it just determines the width and height, calls GetNativeTexturePtr() on the target Texture2D to get a pointer, and passes it all to the native function.
Here’s the issue: I get a crash that I’m 99% sure happens on the memcpy_s line. My thinking is that I mistakenly assumed the texture pointer led straight to the pixel data (the texture is set to 24-bit RGB format). However, I’ve come to realize that’s likely not the case, since the texture should live in video memory.
My question then is: what does the pointer actually point to? An OpenGL texture handle? I can’t seem to figure it out, so any help would be appreciated!
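For what it’s worth, if the pointer turns out to be a Direct3D 11 texture (I believe Unity defaults to D3D11 on Windows, though I haven’t verified what GetNativeTexturePtr() actually returns in my setup), I imagine the upload would have to look something like this untested sketch rather than a raw memcpy:

#include <d3d11.h>

// Untested guess: IF the pointer is an ID3D11Texture2D*, the copy would go
// through the device context. The row pitch of width * 4 assumes a 32-bit
// RGBA texture format, which is also an assumption on my part.
void UploadGuessD3D11(void* texturePtr, const void* pixels, int width, int height)
{
    ID3D11Texture2D* texture = (ID3D11Texture2D*)texturePtr;
    ID3D11Device* device = NULL;
    ID3D11DeviceContext* context = NULL;
    texture->GetDevice(&device);
    device->GetImmediateContext(&context);
    context->UpdateSubresource(texture, 0, NULL, pixels, width * 4, 0);
    context->Release();
    device->Release();
}

Even then, I’d expect threading restrictions (Unity presumably renders on its own thread), which is exactly the kind of thing I was hoping the plugin example would clarify.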
Thanks in advance.
PS: I’m aware that the native code’s design isn’t the best, and that I should probably keep the pixel buffer holding the newest “screenshot” of the main monitor alive, freeing it only when the DLL gets unloaded, among other things… Just trying to get this to work for now.
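For the record, the shape I have in mind for that cleanup is roughly this (not what I’m running yet; g_pixelBits is just a stand-in for the persistent buffer):

#include <Windows.h>
#include <cstdlib>

// Illustration only: one persistent screenshot buffer, freed on DLL unload.
static void* g_pixelBits = NULL;

BOOL WINAPI DllMain(HINSTANCE hinstDLL, DWORD fdwReason, LPVOID lpvReserved)
{
    if (fdwReason == DLL_PROCESS_DETACH)
    {
        free(g_pixelBits); // free(NULL) is a no-op, so this is always safe
        g_pixelBits = NULL;
    }
    return TRUE;
}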
Also, I DID attempt to look at the example in the documentation here: http://39.104.106.62/Manual/NativePluginInterface.html but the link to the demo is broken.