Capturing the real-time output of multiple cameras in a scene

Suppose there are 20 cameras in a scene and I need to capture the output of each camera, at different resolutions, at a rate of at least 15 fps. What would be the best solution?

In other words, if I have a powerful enough machine, can Unity render multiple camera outputs at the same time? What are the constraints?

Preferably I don’t want to split the screen, show the camera outputs next to each other, and capture them from there, because in that case I am limiting myself to the screen size.

The other way would be capturing one camera’s output per frame: for example, if the scene renders at 150 fps, I can capture 10 cameras at 15 fps each, one camera per frame.

But neither of these solutions works for me. I don’t even need to display the outputs; as long as I can store them in memory or in a file, my problem is solved.

Prepare:

  1. Assign a RenderTexture to each camera (Camera.targetTexture)
  2. Set the desired resolution on each RenderTexture (see the sketch below)
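For example, the preparation might look like this (a minimal sketch; MultiCameraCapture, captureCameras, and resolutions are made-up names, and the class is declared partial only so the per-frame loop shown further down can live in the same component):

```csharp
using UnityEngine;

public partial class MultiCameraCapture : MonoBehaviour
{
    // Hypothetical fields: assign your cameras and the resolution
    // you want for each one in the inspector.
    public Camera[] captureCameras;
    public Vector2Int[] resolutions;

    private RenderTexture[] renderTextures;

    void Start()
    {
        renderTextures = new RenderTexture[captureCameras.Length];
        for (int i = 0; i < captureCameras.Length; i++)
        {
            // An off-screen render texture at this camera's resolution
            // (24-bit depth buffer, default color format).
            renderTextures[i] = new RenderTexture(resolutions[i].x, resolutions[i].y, 24);
            renderTextures[i].Create();

            // The camera now renders into the texture instead of the screen.
            captureCameras[i].targetTexture = renderTextures[i];
        }
    }
}
```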

Then, in each frame, loop through all cameras and perform the following (a sketch follows the list):

  1. Set the camera’s render texture as the active one (RenderTexture.active)
  2. Create a new Texture2D the same size as the render texture. ReadPixels in the next step reads from whichever render texture is currently active, so this is how you pick which camera to read.
  3. Call Texture2D.ReadPixels. This step copies the pixels from the active render texture into the Texture2D.
  4. Optional: Texture2D.EncodeToPNG and then write the bytes to disk, but this will slow things down a lot.
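Continuing the same illustrative component, here is a sketch of that loop. writeToDisk and outputDirectory are made-up names, and the readback runs after WaitForEndOfFrame so every camera has finished rendering before pixels are read:

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

public partial class MultiCameraCapture : MonoBehaviour
{
    // Hypothetical toggle and output folder for the optional PNG step.
    public bool writeToDisk = false;
    public string outputDirectory = "Captures";

    private int frameIndex;

    void OnEnable()
    {
        StartCoroutine(CaptureLoop());
    }

    IEnumerator CaptureLoop()
    {
        while (true)
        {
            // Wait until all cameras have rendered this frame before reading back.
            yield return new WaitForEndOfFrame();
            CaptureAll();
            frameIndex++;
        }
    }

    void CaptureAll()
    {
        for (int i = 0; i < captureCameras.Length; i++)
        {
            RenderTexture rt = renderTextures[i];

            // 1. Make this camera's render texture the active one,
            //    so ReadPixels knows which camera's output to read.
            RenderTexture.active = rt;

            // 2. A texture the same size as the render texture
            //    (reusing one texture per camera would avoid per-frame allocations).
            Texture2D tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);

            // 3. Copy the pixels from the active render texture into the texture.
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();

            // 4. Optional (and slow): encode to PNG and write to disk.
            if (writeToDisk)
            {
                Directory.CreateDirectory(outputDirectory);
                File.WriteAllBytes(
                    Path.Combine(outputDirectory, $"cam{i}_frame{frameIndex}.png"),
                    tex.EncodeToPNG());
            }

            // Destroy the CPU-side texture if you are not keeping it in memory.
            Destroy(tex);
        }

        RenderTexture.active = null;
    }
}
```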

In general, capturing even one camera to disk is slow, so capturing many will be a problem. Capturing to memory might be better, but you will fill your memory quickly unless your resolutions are small.
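As a rough illustration: an uncompressed 1920×1080 RGBA frame is about 8.3 MB, so 20 cameras at that resolution captured at 15 fps produce roughly 2.5 GB of raw pixel data per second.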