Is it possible to take two photos (same static scene, but the cameras offset just enough for parallax to be apparent) and generate, at runtime, a photo from any viewpoint between the two camera positions?
My ultimate goal is to bake an extremely high-quality Unity scene into a stereoscopic 360 skybox for VR. 3D 360 content already exists (in ample volumes, even); what's different about my proposed version is that the parallax effect would be preserved in the photo/video being viewed, which IMO is the biggest advantage that actually-rendered static scenes still exclusively have.
That goal sounds incredibly complex, but the essence of the process can be narrowed down to simply interpolating or "tweening" between two photos of the same scene. At my current level of familiarity with the game engine, however, I don't know whether even that can be done in real time, which is why I'd like to hear from all of you more experienced developers: can this be done?
If realtime frame interpolation can be done (and I do hope that it can), how might the effect be programmed? My sneaking suspicion is that shaders will have a big part to play; am I on the right track?
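For what it's worth, here's a minimal sketch of the core idea behind this kind of view interpolation, written in Python/NumPy for clarity rather than as Unity code. It assumes you already have a per-pixel horizontal disparity map (which you'd get from depth you render alongside the images, or from stereo matching); the function name and everything else here is hypothetical. A real implementation would warp both views, blend them, fill occlusion holes, and run on the GPU (e.g. in a shader), but the essence is just "shift each pixel sideways by t × its disparity":

```python
import numpy as np

def interpolate_view(left, disparity, t):
    """Return the left image forward-warped a fraction t (0..1) toward the
    right camera, using a per-pixel horizontal disparity map.

    Naive sketch: no hole filling, no blending with the right view.
    """
    h, w = left.shape[:2]
    out = np.zeros_like(left)
    xs = np.arange(w)
    for y in range(h):
        # Each pixel moves left by t * its disparity (simple forward warp).
        tx = np.clip(np.round(xs - t * disparity[y]).astype(int), 0, w - 1)
        out[y, tx] = left[y, xs]
    return out

# Toy demo: a single bright column at x=5 with uniform disparity 2
# shifts to x=4 at the halfway viewpoint (t = 0.5).
img = np.zeros((4, 10))
img[:, 5] = 1.0
disp = np.full((4, 10), 2.0)
half = interpolate_view(img, disp, 0.5)
```

A plain crossfade between the two photos would just ghost the two views together; it's the disparity-driven per-pixel shift above that actually reproduces parallax, and that per-pixel shift is exactly the kind of operation a fragment shader is good at.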