I’m currently rendering custom geometry into a Unity Texture2D on an iOS device, via a hand-coded plugin that attaches an OpenGL frame buffer to the Texture2D’s native texture ID. The updated texture renders as intended when running in the Unity app.
What I now want to do is save the updated Texture2D to a file via EncodeToPNG(). What happens is EncodeToPNG() returns a tiny byte array (~5 KB) that I assume holds only the basic PNG header, width and height data, because when I pull the PNG from the device and view it, it opens in Photoshop with the correct width and height but with no pixel data, i.e. it is empty!
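For reference, my save path is essentially the following sketch (the file name and logging are illustrative, not my exact code; `tex` is the Texture2D whose native ID the plugin renders into):

```csharp
// Encode the Texture2D and write it out so it can be pulled from the device.
byte[] png = tex.EncodeToPNG();
Debug.Log("PNG size: " + png.Length); // logs ~5 KB, far too small for the texture

System.IO.File.WriteAllBytes(
    System.IO.Path.Combine(Application.persistentDataPath, "capture.png"),
    png);
```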
Another point of interest is that if I call Apply() on the texture within Unity, it then renders as an empty texture in the app as well.
These points obviously make me think that rendering to the texture in my plugin is not actually updating any pixel data that the Unity Texture2D itself may own, but here I am at a loss as to how to update the Texture2D.
My first thought was to call Texture2D.ReadPixels(…), but as this is an iOS app, the call I use to notify the Unity app of the texture update from native code, UnitySendMessage, has a one-frame delay, and so the render context is lost by the time my script runs.
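What I tried looks roughly like this (the handler name and GameObject wiring are mine and only illustrative):

```csharp
// Invoked from native code via
// UnitySendMessage("TextureSaver", "OnPluginRendered", "")
// after the plugin finishes rendering into the texture's frame buffer.
void OnPluginRendered(string unused)
{
    // By the time this runs we are a frame later and the plugin's
    // frame buffer is no longer the active render target, so this
    // reads from whatever Unity currently has bound, not my FBO.
    tex.ReadPixels(new Rect(0, 0, tex.width, tex.height), 0, 0);
    tex.Apply();
}
```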
I’ve also done a bit of investigation into pulling the frame buffer data into a byte array using glReadPixels and updating the texture with that, but I have a feeling I’m just duplicating the frame buffer’s work in a roundabout way, as that fails too…
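The Unity-side half of that experiment is roughly the following (assuming the plugin hands back raw RGBA32 bytes; `GetPluginPixels` is a hypothetical wrapper around my plugin’s glReadPixels call, not a real API):

```csharp
// GetPluginPixels() would return width * height * 4 bytes of RGBA data
// read back from the plugin's frame buffer with glReadPixels.
byte[] raw = GetPluginPixels();

tex.LoadRawTextureData(raw); // texture format must match the raw data (RGBA32)
tex.Apply();                 // uploads the CPU-side copy back to the GPU
byte[] png = tex.EncodeToPNG();
```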
I would be very grateful for any suggestions, especially an explanation of the relationship between the pixel data a Unity Texture2D owns and the data manipulated through OpenGL commands via the native texture ID.