Capturing the output of an app and rendering it to a texture in another app?

Our product uses two Unity apps: one is a viewer (OSX) and the other a controller (iOS). These apps talk to each other via RPC. I am looking for a solution that would allow us to display the output from the viewer app on a texture in the iOS app over WiFi — kind of like remote desktop to a texture. This isn't a strict requirement; we are just experimenting right now with various UX ideas, but I am interested in finding out if anyone has done this sort of thing, or if it is even possible. I think capturing the frames and displaying them would be the easy part, but I have no clue how to send the output of a Unity app over WiFi frame by frame. Any ideas?
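For what it's worth, the "send frames over WiFi" part usually comes down to encoding each captured frame (e.g. as JPEG) and writing it to a TCP socket with a length prefix, so the receiver knows where one frame ends and the next begins. Here is a minimal sketch of that wire protocol in Python just to show the framing idea — in a real setup the sender would be C# inside the OSX viewer (e.g. reading pixels from a RenderTexture) and the receiver would decode the bytes into a Texture2D on iOS; the function names here are my own, not from any plugin:

```python
import socket
import struct

# 4-byte big-endian length prefix before each frame's payload
FRAME_HEADER = struct.Struct("!I")

def send_frame(sock, frame_bytes):
    """Send one encoded frame (e.g. JPEG bytes of a screen capture)."""
    sock.sendall(FRAME_HEADER.pack(len(frame_bytes)) + frame_bytes)

def recv_exact(sock, n):
    """Read exactly n bytes, or raise if the connection closes mid-frame."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    """Receive one length-prefixed frame; the caller decodes it into a texture."""
    (length,) = FRAME_HEADER.unpack(recv_exact(sock, FRAME_HEADER.size))
    return recv_exact(sock, length)
```

TCP keeps it simple and lossless; if latency matters more than every frame arriving, UDP with frame dropping (or an actual video codec like H.264) would be the next step up.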

I did find this handy plugin for dumping frames out of Unity…

http://eccentric-orbits.com/eoe/site/ividcappro-unity-plugin/

Very cool plugin indeed. If this works, it seems like in-app screen sharing for multiplayer games is possible.