I have been searching for days and I'm stuck. I really need help; this is important for my bachelor thesis.
So I created a panoramic camera out of multiple cameras, each with a 10-degree FOV, arranged to form a cylinder. I need to combine the output of all of them into one RenderTexture, but I can't figure it out.
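For reference, a rig like the one described above can be built in a script. This is only a sketch under my assumptions (36 cameras, each covering a 10-degree horizontal slice); the class and field names are placeholders. Note that Unity's `Camera.fieldOfView` is the *vertical* FOV, so a 10-degree *horizontal* slice has to be converted via the aspect ratio:

```csharp
using UnityEngine;

public class CylinderRig : MonoBehaviour
{
    public int cameraCount = 36; // 36 cameras * 10 degrees = full 360-degree circle

    void Start()
    {
        float step = 360f / cameraCount; // horizontal FOV per camera, in degrees
        for (int i = 0; i < cameraCount; i++)
        {
            var go = new GameObject("PanoCam" + i);
            go.transform.SetParent(transform, false);
            // Rotate each camera around the rig's Y axis to cover the cylinder.
            go.transform.localRotation = Quaternion.Euler(0f, i * step, 0f);

            var cam = go.AddComponent<Camera>();
            // Camera.fieldOfView is vertical; convert the desired horizontal
            // FOV (step) into the equivalent vertical FOV for this aspect ratio.
            float vFov = 2f * Mathf.Atan(
                Mathf.Tan(step * Mathf.Deg2Rad / 2f) / cam.aspect) * Mathf.Rad2Deg;
            cam.fieldOfView = vFov;
        }
    }
}
```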
So my cameras are looking just like in these examples:
I can do what this guy has done: create a separate plane for each camera, put the planes side by side, create a RenderTexture for each camera, and apply them to the planes accordingly.
HOWEVER! What I need is to have all the camera outputs rendered onto ONE texture, because I have a custom mesh that is loaded dynamically and I have to apply the RenderTexture to that.
So far I have tried creating a RenderTexture for every camera and then copying all of those textures onto one combined texture, but I couldn't manage to get it working.
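For what it's worth, here is one way the "many small textures into one big texture" idea could look, using `Graphics.CopyTexture` to copy each camera's RenderTexture into a horizontal strip of a combined one. The field names and sizes are assumptions, and `CopyTexture` requires both textures to use compatible formats:

```csharp
using UnityEngine;

public class CombineTextures : MonoBehaviour
{
    public RenderTexture[] camRTs;  // one RenderTexture per camera, all the same size
    public RenderTexture combined;  // width = camRTs.Length * camRTs[0].width

    void LateUpdate()
    {
        for (int i = 0; i < camRTs.Length; i++)
        {
            // Copy the full source texture into the combined texture at a
            // per-camera horizontal pixel offset (no shader pass involved).
            Graphics.CopyTexture(
                camRTs[i], 0, 0, 0, 0, camRTs[i].width, camRTs[i].height,
                combined, 0, 0, i * camRTs[i].width, 0);
        }
    }
}
```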
I also tried merging the cameras by using one shared RenderTexture as the target for every camera, but I couldn't get that to work either.
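If anyone wants to comment on this attempt: my understanding is that the shared-target approach fails when every camera draws over the whole texture. A possible fix, sketched here with placeholder names, is to give each camera a different normalized viewport `rect` so they render side by side into the same target instead of overwriting each other:

```csharp
using UnityEngine;

public class SharedTarget : MonoBehaviour
{
    public Camera[] cams;          // the cylinder cameras, in left-to-right order
    public RenderTexture target;   // the single combined output texture

    void Start()
    {
        float sliceWidth = 1f / cams.Length;
        for (int i = 0; i < cams.Length; i++)
        {
            cams[i].targetTexture = target;
            // Each camera draws only into its own vertical strip of the texture.
            cams[i].rect = new Rect(i * sliceWidth, 0f, sliceWidth, 1f);
        }
    }
}
```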
This is my custom mesh:
In the example I applied a test texture to it, which was taken by a 360-degree camera. I need to produce the same kind of image from within the game.
(Don't mind the JPEG compression artifacts; they're due to the file size limit here on Unity Answers.)
So PLEASE, could anyone at least give me some advice or ideas?
Even if it sounds stupid, please share it; at this point I have nothing to lose by trying.
Thanks in advance.