Hello Community,
I am developing a digital vision system for military vehicles using an Oculus Rift. We have several 1080p cameras which provide a 360° surround view. The video streams shall be displayed on rectangles acting as virtual LCDs. When these are positioned and scaled correctly, it should feel as if the vehicle were topless ;). But we have two requirements:
- Up to ten full-HD streams must be rendered as textures.
- The delay must be as low as possible to avoid motion sickness in VR.
Of course it is easy to display a recorded video as a texture on the virtual LCDs via drag & drop, but that is not a real-time solution.
I tried to update the texture by repeatedly loading a single, constantly changing JPG file (HD), but with that I only reached 0.3 FPS instead of the 40 FPS I get with normal video playback.
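Here is roughly what that test looked like (simplified; the file path is just a placeholder):

```csharp
using System.IO;
using UnityEngine;

// Simplified version of my JPG test: re-read a JPG from disk every frame
// and decode it into the material's texture. Path and size are placeholders.
public class JpgFrameLoader : MonoBehaviour
{
    public string framePath = @"C:\frames\current.jpg"; // placeholder path

    private Texture2D tex;

    void Start()
    {
        tex = new Texture2D(1920, 1080, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = tex;
    }

    void Update()
    {
        // Disk read + CPU JPEG decode every frame, all on the main thread
        byte[] jpg = File.ReadAllBytes(framePath);
        tex.LoadImage(jpg); // decodes the JPEG and updates the texture
    }
}
```

My guess is that the per-frame disk read and CPU JPEG decode are the bottleneck rather than the texture upload itself, but I have not verified that.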
Generally, the cameras provide a GigE Vision interface, and it would be nice if I could integrate the capturing of the streams directly as a Unity plugin; any additional software or video server would introduce further delay. The camera supports several pixel formats:
YUV 4:2:2, RGB 24-bit and Mono 14 (raw data), all at a resolution of 1920x1080 @ 25 FPS (see the raw-frame upload sketch after the datasheet link).
(German datasheet)
http://www.cm-tech.at/upload/3502079-KappaZelos02150GV.pdf
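Since the camera can deliver RGB 24-bit anyway, I assume the pure texture-upload cost could be measured with dummy raw frames first, independent of the camera. A minimal sketch of what I have in mind (the buffer below is dummy data, not real camera frames):

```csharp
using UnityEngine;

// Measures pure texture-upload cost: push a raw 1920x1080 RGB24 buffer
// into a texture every frame. In the real system the buffer would be
// filled from the GigE Vision stream instead of being dummy data.
public class RawFrameUploader : MonoBehaviour
{
    private const int Width = 1920;
    private const int Height = 1080;

    private Texture2D tex;
    private byte[] frame; // Width * Height * 3 bytes (RGB24)

    void Start()
    {
        tex = new Texture2D(Width, Height, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = tex;
        frame = new byte[Width * Height * 3];
    }

    void Update()
    {
        frame[0] = (byte)(Time.frameCount & 0xFF); // touch the data so it changes

        tex.LoadRawTextureData(frame); // copy into the texture's CPU-side buffer
        tex.Apply(false);              // upload to the GPU, no mipmaps
    }
}
```

If ten of these still run above 25 FPS, the remaining problem would mainly be getting the frames from the cameras into memory fast enough.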
I tried to use the manufacturer's .NET DLL, but it seems that some of its dependencies are not supported by Unity/Mono (e.g. System.Drawing), so I would have to write my own wrapper around the plain C libraries. But even then I am not sure whether Unity is capable of handling the given task.
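For the wrapper I was thinking of a P/Invoke layer roughly like the sketch below. All the native function names (ZelosOpen, ZelosGrabFrame, ZelosClose) and the DLL name are purely hypothetical placeholders; the real entry points would have to be taken from the SDK's C headers:

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Hypothetical P/Invoke wrapper around the camera's plain C library.
// All entry points below are invented placeholders, not the real SDK API.
public class GigEVisionCamera : MonoBehaviour
{
    [DllImport("ZelosCaptureC")] // hypothetical native DLL name
    private static extern IntPtr ZelosOpen(int cameraIndex);

    [DllImport("ZelosCaptureC")]
    private static extern int ZelosGrabFrame(IntPtr handle, byte[] buffer, int bufferSize);

    [DllImport("ZelosCaptureC")]
    private static extern void ZelosClose(IntPtr handle);

    private IntPtr cam;
    private Texture2D tex;
    private byte[] frame;

    void Start()
    {
        cam = ZelosOpen(0);
        tex = new Texture2D(1920, 1080, TextureFormat.RGB24, false);
        GetComponent<Renderer>().material.mainTexture = tex;
        frame = new byte[1920 * 1080 * 3];
    }

    void Update()
    {
        // Blocking grab on the main thread just for the sketch; a real version
        // would grab on a worker thread and only do the upload here.
        if (ZelosGrabFrame(cam, frame, frame.Length) == 0)
        {
            tex.LoadRawTextureData(frame);
            tex.Apply(false);
        }
    }

    void OnDestroy()
    {
        if (cam != IntPtr.Zero) ZelosClose(cam);
    }
}
```

Whether such a wrapper can keep up with ten streams is exactly what I am unsure about.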
Does anybody know a more elegant way to solve this problem, or is my goal simply not achievable with Unity? My concern is that uploading so many large textures and rendering them at more than 25 FPS is a lot of work per frame.
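For a rough feel of the numbers: ten RGB24 full-HD streams at 25 FPS are about 10 × 1920 × 1080 × 3 bytes × 25 ≈ 1.5 GB/s of raw texture data, which is why I am unsure whether the upload path alone will keep up.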
This is my first contact with Unity3D, but I am experienced in C, C++, C# and Visual Studio 2005-2013.