Hi,
I have the following scenario:
- Input: a sequence of large still images (~3000x1000 px) arriving at 30 Hz from a camera connected via FireWire. This camera does not appear as a webcam to the operating system (at least not out of the box; I suppose one could code a fake webcam). The format is e.g. JPEG, but I could do any format conversion in a separate process.
- Wanted: whenever a new still image is grabbed from this camera, use it as a texture in Unity (the texture is mapped to a mesh) with as little delay as possible.
- Platform: the camera and the Unity render client are both on the same Windows 7 x64 machine.
I’m looking for suggestions on how to approach this problem for this particular situation (e.g. I don’t give a rat’s ass how performance would be on Mac, Android, etc.). So far, I understand that Unity supports:
- importing textures from a webserver; a rough sketch of what I mean is below this list (I can’t imagine this performing well, since I’d expect the HTTP requests to take longer than the ~33 ms available per frame)
- having my separate process encode the still image sequence as a movie and using a MovieTexture (unsure about the performance, but I’d like to rule it out because of the high latency)
- writing a program that fakes a webcam device and using Unity’s WebCamTexture feature (is that a good idea? the Unity side of this is also sketched below the list)
- your suggestions!
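For the webserver route, here’s roughly what I have in mind on the Unity side; the localhost URL is just a placeholder for whatever my grabber process would serve, and the endless polling loop is only for illustration:

using System.Collections;
using UnityEngine;

public class HttpFrameFeed : MonoBehaviour
{
    // Hypothetical endpoint served by my grabber process on the same machine.
    public string url = "http://localhost:8080/latest.jpg";

    private Texture2D tex;

    IEnumerator Start()
    {
        tex = new Texture2D(2, 2); // dimensions get replaced on load
        GetComponent<Renderer>().material.mainTexture = tex;

        while (true)
        {
            WWW www = new WWW(url);            // request the newest frame
            yield return www;                  // wait for the download
            if (string.IsNullOrEmpty(www.error))
                www.LoadImageIntoTexture(tex); // decode the JPEG into the texture
        }
    }
}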
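And for the fake-webcam route, the Unity side would presumably be as simple as this (assuming the fake device registers as a normal capture device that WebCamTexture can find):

using UnityEngine;

public class WebcamFrameFeed : MonoBehaviour
{
    void Start()
    {
        // Picks the default capture device; my FireWire camera would have to
        // show up as a regular webcam device for this to find it.
        WebCamTexture cam = new WebCamTexture();
        GetComponent<Renderer>().material.mainTexture = cam;
        cam.Play();
    }
}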
I hope you guys can help me out :).
Cheers!