Vive has a neat feature where you can display a non-stereo view of what the player is seeing in stereo, on an external display. I’d like to do something similar on iOS.
I know iOS has a nifty AirPlay streaming feature, but this shows exactly what’s on the screen, i.e., a stereo view with lens distortion.
Is there any clever trick for streaming (or recording — that’d also be acceptable for some purposes) a flat, non-stereo view of what’s going on in the game?
No ideas on this one? I still haven’t thought of any good solution.
I guess I could basically turn my game into a network game: make a desktop version of it, network the two together, and have the desktop update all its game objects (including the camera) from those in the mobile app. But man, that seems like a lot of work.
I’d rather just set up a second (non-AR) camera in the iOS app, that somehow streams its view to a file. Doesn’t satisfy the “watch somebody play on the living room TV” need, but it does satisfy the “make demo videos” need.
But how do I do that? Is there any approach better than using a RenderTexture and saving the result to a series of images, as fast as I can?
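For what it's worth, here's a minimal sketch of the RenderTexture approach, under some assumptions: `spectatorCamera` is a hypothetical second (non-AR) camera you add to the scene yourself, and frame pacing / disk I/O cost aren't addressed at all. Not a polished implementation, just the shape of the idea:

```csharp
using System.IO;
using UnityEngine;

// Sketch: capture a secondary (non-AR) camera to numbered image files
// via a RenderTexture. Assumes "spectatorCamera" is a camera you set up
// yourself, separate from the stereo rig.
public class SpectatorCapture : MonoBehaviour
{
    public Camera spectatorCamera;   // assumption: your second, flat-view camera
    public int width = 1280;
    public int height = 720;

    RenderTexture rt;
    Texture2D readback;
    int frameIndex;

    void Start()
    {
        rt = new RenderTexture(width, height, 24);
        readback = new Texture2D(width, height, TextureFormat.RGB24, false);
        spectatorCamera.targetTexture = rt;
    }

    void LateUpdate()
    {
        // Render the spectator view and copy it back to the CPU.
        spectatorCamera.Render();
        RenderTexture.active = rt;
        readback.ReadPixels(new Rect(0, 0, width, height), 0, 0);
        RenderTexture.active = null;

        // JPG is much cheaper to encode than PNG on mobile.
        byte[] jpg = readback.EncodeToJPG(75);
        string path = Path.Combine(Application.persistentDataPath,
                                   $"frame_{frameIndex++:D5}.jpg");
        File.WriteAllBytes(path, jpg);
    }
}
```

The synchronous ReadPixels + file write every frame will hurt your frame rate on device; it's only meant to show the pipeline, not to be shipping code.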
…or is it possible to connect a (real or virtual) external display to an iPhone, and just throw content up on that using Unity’s standard multi-display support?
Is the source available for Unity Remote? You might look at it, or a previous version, if you can find it.
I bought this asset a while back to see what it did.
I can’t quite remember how it got the image (probably render to texture), but it just serialized and transmitted the video as a bunch of individual pictures (e.g., JPEGs). I was (still am) looking for something that follows an established streaming protocol, so that my game would act as a server that multiple clients could connect to and watch gameplay.
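One established protocol that fits the "bunch of individual JPEGs" model almost exactly is MJPEG over HTTP (multipart/x-mixed-replace), which VLC and most browsers can play directly. Here's a hedged sketch of that server idea; `GetLatestJpeg` is a hypothetical callback you'd wire up to whatever produces your encoded frames, and the port and frame rate are placeholder assumptions:

```csharp
using System.IO;
using System.Net;
using System.Text;
using System.Threading;

// Sketch: serve the latest JPEG frame to any number of clients as an
// MJPEG-over-HTTP stream. GetLatestJpeg() is a hypothetical frame source.
public class MjpegServer
{
    readonly HttpListener listener = new HttpListener();
    readonly System.Func<byte[]> GetLatestJpeg;

    public MjpegServer(System.Func<byte[]> frameSource, int port = 8080)
    {
        GetLatestJpeg = frameSource;
        listener.Prefixes.Add($"http://*:{port}/");
    }

    public void Start()
    {
        listener.Start();
        new Thread(() =>
        {
            while (listener.IsListening)
            {
                var ctx = listener.GetContext();           // blocking accept
                new Thread(() => ServeClient(ctx)).Start(); // one thread per client
            }
        }) { IsBackground = true }.Start();
    }

    void ServeClient(HttpListenerContext ctx)
    {
        ctx.Response.ContentType = "multipart/x-mixed-replace; boundary=frame";
        var output = ctx.Response.OutputStream;
        try
        {
            while (true)
            {
                byte[] jpg = GetLatestJpeg();
                byte[] header = Encoding.ASCII.GetBytes(
                    $"--frame\r\nContent-Type: image/jpeg\r\n" +
                    $"Content-Length: {jpg.Length}\r\n\r\n");
                output.Write(header, 0, header.Length);
                output.Write(jpg, 0, jpg.Length);
                output.Write(Encoding.ASCII.GetBytes("\r\n"), 0, 2);
                Thread.Sleep(66);                           // ~15 fps, adjust to taste
            }
        }
        catch (IOException) { /* client disconnected */ }
    }
}
```

MJPEG wastes bandwidth compared to a real video codec, but it needs no encoder on the phone beyond JPEG, which is why it's a common first step before graduating to something like RTSP/H.264.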
Good luck and update this thread if you find something. ffmpeg or OBS studio might have something you could use.
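If you do end up with a folder of numbered frames (as in the RenderTexture approach above), ffmpeg can stitch them into a video in one command. Paths and frame rate here are assumptions; adjust them to match your capture:

```shell
# Stitch numbered JPEG frames into an H.264 MP4.
# frame_%05d.jpg matches frame_00000.jpg, frame_00001.jpg, ...
ffmpeg -framerate 30 -i frame_%05d.jpg -c:v libx264 -pix_fmt yuv420p demo.mp4
```

`-pix_fmt yuv420p` is there because some players (notably QuickTime) won't play H.264 in other pixel formats.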