Texture bytes per pixel

Hi,
Is there a way to read the bytes per pixel from a texture?
I’m trying to send a texture to a C API, and it crashes the entire application:

    image.m_nBytesPerPixel = ???;
    image.m_pImageData = MyImage.texture.GetNativeTexturePtr();
    image.m_nWidth = MyImage.texture.width;
    image.m_nHeight = MyImage.texture.height;

The native struct I’m filling in:

typedef struct NotificationBitmap_t
{
    void * m_pImageData;
    int32_t m_nWidth;
    int32_t m_nHeight;
    int32_t m_nBytesPerPixel;
} NotificationBitmap_t;
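
For reference, the managed side needs a blittable mirror of that struct. A minimal sketch, assuming the layout above and default sequential packing (verify against your actual native build):

    using System;
    using System.Runtime.InteropServices;

    [StructLayout(LayoutKind.Sequential)]
    struct NotificationBitmap_t
    {
        public IntPtr m_pImageData;    // void *  (pointer to raw pixel bytes)
        public int m_nWidth;           // int32_t
        public int m_nHeight;          // int32_t
        public int m_nBytesPerPixel;   // int32_t
    }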

Both “Texture2D” and “RenderTexture” have a “format” property. The base class “Texture” does not.

Here is a list of all TextureFormats, as well as how many bits they use: Unity - Scripting API: TextureFormat

I’m not sure if it will work, but I recommend trying this:

switch (((Texture2D)MyImage.texture).format)
{
    case TextureFormat.Alpha8:
        image.m_nBytesPerPixel = 1;
        break;

    case TextureFormat.ARGB4444:
        image.m_nBytesPerPixel = 2;
        break;

    // ... implement other formats
}
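
If your Unity version has it (GraphicsFormat landed around 2019.1), GraphicsFormatUtility can compute this generically instead of hand-maintaining the switch. A sketch, assuming an uncompressed texture:

    using System;
    using UnityEngine;
    using UnityEngine.Experimental.Rendering;

    static int GetBytesPerPixel(Texture2D tex)
    {
        GraphicsFormat gf = GraphicsFormatUtility.GetGraphicsFormat(tex.format, false);

        // Block-compressed formats (DXT, ETC, ASTC, ...) encode whole blocks
        // of pixels, so a per-pixel byte count is not meaningful for them.
        if (GraphicsFormatUtility.IsCompressedFormat(gf))
            throw new ArgumentException("compressed format has no per-pixel size");

        // For uncompressed formats the block is a single pixel, so the
        // block size in bytes is exactly the bytes per pixel.
        return (int)GraphicsFormatUtility.GetBlockSize(gf);
    }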

Will have a look, thanks. I didn’t get it to work with any bytes-per-pixel value, though. Also tried getting the byte buffer, getting the pointer to the array, and using that; that didn’t work either.

Pretty sure the native library wants an RGBA bitmap, though, since the structure is named NotificationBitmap_t.
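
One thing worth checking: GetNativeTexturePtr() returns a pointer to the GPU-side texture object (e.g. an ID3D11Texture2D* on Direct3D 11), not to CPU-readable pixels, so handing it to an API that expects a raw bitmap in memory could crash exactly like this. A sketch of passing CPU-side RGBA bytes instead, assuming MyImage.texture is a readable RGBA32 Texture2D (needs using System.Runtime.InteropServices;):

    Texture2D tex = (Texture2D)MyImage.texture;
    byte[] raw = tex.GetRawTextureData();   // CPU copy of the pixel data

    // Pin the array so the GC cannot move it while native code reads it.
    GCHandle handle = GCHandle.Alloc(raw, GCHandleType.Pinned);
    try
    {
        image.m_pImageData = handle.AddrOfPinnedObject();
        image.m_nWidth = tex.width;
        image.m_nHeight = tex.height;
        image.m_nBytesPerPixel = 4;   // RGBA32
        // ... call the C API here, while the buffer is still pinned
    }
    finally
    {
        handle.Free();
    }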

I wonder if there is a problem with Mono and pointers; I have gotten it to work in a vanilla .NET project.

One difference: the pointers print as hex in vanilla .NET and as decimal in Mono. Don’t know if it’s relevant.

Mono:
[Screenshot: pointer value shown under Mono]

.NET:
[Screenshot: pointer value shown under .NET]
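
The hex-versus-decimal difference is almost certainly just display formatting, not a different value; an IntPtr prints as decimal by default, and debuggers often show hex. You can print the same pointer both ways to confirm (handle here being whatever GCHandle you pinned the buffer with):

    IntPtr p = handle.AddrOfPinnedObject();
    Debug.Log(p.ToString());            // decimal by default
    Debug.Log("0x" + p.ToString("X"));  // same value, printed as hex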

I do this exact operation in my KurtMaster2D game on Android, iOS, MacOSX, and Windows.

Here is my code to allocate and pin the pixel buffer and pass the array of bytes into native code.

Native code then renders into this buffer, and when it comes back I marshal it back into a Texture2D for display.

    // Backing buffer reused across calls; reallocated only when the size changes.
    static Color32[] pixels;
    static int pixelSpan;

    public static void PinTextureAndCallNativeRenderColor( Texture2D t2d, int span, bool clean = false)
    {
        if (clean || (pixels == null) || (span != pixelSpan))
        {
            pixelSpan = span;
            pixels = new Color32[ pixelSpan * pixelSpan];
        }

        // Pin the managed array so the GC cannot move it while native code writes into it.
        GCHandle handle = GCHandle.Alloc(pixels, GCHandleType.Pinned);
        try
        {
            dispatcher1_rendercolor( handle.AddrOfPinnedObject(), span, 0);
        }
        finally
        {
            if (handle.IsAllocated)
            {
                handle.Free();
            }
        }

        // Copy the filled buffer into the texture and upload it to the GPU.
        t2d.SetPixels32( pixels);
        t2d.Apply();
    }
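
For illustration only (sizes and names are mine, not from the game), a call site could look like:

    // RGBA32 matches the 32-bit RGBA pixels the native side writes.
    Texture2D screen = new Texture2D(256, 256, TextureFormat.RGBA32, false);

    void Update()
    {
        // Ask the native side for a fresh 256x256 frame every frame.
        PinTextureAndCallNativeRenderColor(screen, 256);
    }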

Here is the interop signature for that native method:

    [DllImport ("__Internal")]
    private static extern void dispatcher1_rendercolor(
        System.IntPtr pixels, int span, int flags);
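
Note that “__Internal” only resolves when the plugin is statically linked into the binary (iOS, WebGL); desktop and Android builds load a shared library by name, so a cross-platform build typically switches the name at compile time. A sketch (the library name is hypothetical):

    #if UNITY_IOS && !UNITY_EDITOR
        private const string NativeLib = "__Internal";   // statically linked on iOS
    #else
        private const string NativeLib = "kurtmaster";   // hypothetical shared-library name
    #endif

    [DllImport(NativeLib)]
    private static extern void dispatcher1_rendercolor(
        System.IntPtr pixels, int span, int flags);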

And finally, here’s the entire native C function:

DECORATE_FOR_DLL    void    dispatcher1_rendercolor( void *rgbpixels, int span, int flags)
{
    if (wbi.workbuf_p)
    {
        /* Normal path: expand the 8-bit palettized work buffer into 32-bit RGBA. */
        int w = wbi.maxx;
        int h = wbi.maxy;
        unsigned char *ucptr = (unsigned char *)wbi.workbuf_p;
        unsigned int *uptr = (unsigned int *)rgbpixels;
        int    i, j;
        for (j = 0; j < h; j++)
        {
            for (i = 0; i < w; i++)
            {
                *uptr++ = kpios_last_palette_256_rgba[ *ucptr++];
            }
            uptr += (span - w);    /* span is the row stride in pixels; skip the unused tail */
        }
    }
    else
    {  /* no work buffer yet: simulate noise from a blank TV channel */
        int w = span;
        int h = span;
        int    i, j;
        unsigned char *cptr = (unsigned char*)rgbpixels;
        for (j = 0; j < h; j++)
        {
            for (i = 0; i < w; i++)
            {
                *cptr++ = rand();               /* R */
                *cptr++ = rand();               /* G */
                *cptr++ = (rand() & 0xff) / 4;  /* B, kept dim */
                *cptr++ = 255;                  /* A, fully opaque */
            }
            cptr += 4 * (span - w);    /* stride in bytes; zero here since w == span */
        }
    }
}

NOTE: the “engines” inside this game collection are mostly 8-bit color, hence the palette expansion you see.

NOTE: this same code runs on all targets: iOS 32-bit and 64-bit, Android 32-bit and 64-bit, MacOSX 64-bit, and Windows 32-bit.

And you can see it all in action here in the final product:

App stores:

Apple iTunes: KurtMaster2D on the App Store

Google Play: https://play.google.com/store/apps/details?id=com.plbm.plbm1

Thanks. It seems to be a problem with the x64 DLL, so Valve needs to fix that. It works with the x86 DLL, but our game is x64, so we need to use the x64 DLL.