In our current game (High Frontier), we want to realistically render planets in the background, but sometimes close enough that they span almost 180° of view. This is problematic. Suppose our camera (and space colony) are 400 km away from the surface of the Earth, which is a sphere with a radius of 6000 km or so. The camera's far clipping plane can't be more than 10 km (and 1-2 km is better). So obviously, we can't just position a sphere at the Earth's coordinates and let the rendering engine take care of it for us; it's way outside the viewing frustum.
So our first thought was to just scale the planet down and move it proportionately closer. This works fine in less extreme cases (like when in high Earth orbit, 36000 km away): the Earth ends up a nice little ball positioned just inside the frustum, and life is good. But when it's much closer (as in the first case above), this doesn't work; moved inside the "yon" (far clip) distance, it's still so huge that it engulfs other important things in the scene (like the space colony that's supposed to be orbiting it).
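For concreteness, here's the scale-and-move trick as a minimal Python sketch. The names, and the choice to park the scaled planet at 90% of the far clip distance, are illustrative assumptions, not anything from our actual code:

```python
import numpy as np

def scale_planet_into_frustum(camera_pos, planet_center, planet_radius, far_clip):
    """Shrink the planet and move it proportionately closer, so it keeps
    the same angular size as seen from the camera."""
    offset = planet_center - camera_pos
    true_distance = np.linalg.norm(offset)
    render_distance = 0.9 * far_clip         # park it just inside the frustum
    k = render_distance / true_distance      # one shrink factor for both...
    render_center = camera_pos + offset * k  # ...position
    render_radius = planet_radius * k        # ...and radius
    return render_center, render_radius
```

Plugging in the numbers above shows exactly why this breaks up close: the camera is about 6400 km from Earth's center (400 km altitude plus a 6000 km radius), so with a 10 km far clip, k ≈ 9/6400 ≈ 0.0014, and the scaled Earth still has a radius of about 8.4 km. Its near surface ends up only ~560 m from the camera.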
If the Earth weren't changing, of course we could just ray-trace it into a skybox texture and have done with it. But it needs moving clouds, rotating continents, lighting that varies with your position relative to the Sun, etc. So, a static image won't do either.
So now we're pondering a more complex approach: squashing the planet down to a shallow, inverted dish shape. Basically we would imagine a sphere centered on the camera, with a radius just less than the far clip distance, and we'd project every vertex of the planet onto that. Since each vertex slides along its own line of sight from the camera, the planet's silhouette and apparent position are preserved exactly. If we take care to keep the original normals (so lighting still behaves as if the geometry were full-size), and do something about triangles that were facing away, this seems like it ought to work.
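Here's a minimal sketch of that projection, under the same hedged assumptions (world-space numpy arrays, hypothetical names):

```python
import numpy as np

def squash_onto_shell(camera_pos, vertices, normals, far_clip):
    """Project each world-space vertex radially onto a sphere centered on
    the camera, just inside the far clip plane.  Each vertex keeps its
    direction from the camera, so the planet's silhouette is unchanged;
    normals pass through untouched so lighting matches the full-size planet."""
    shell_radius = 0.95 * far_clip
    offsets = vertices - camera_pos                         # (N, 3) per-vertex offsets
    dists = np.linalg.norm(offsets, axis=1, keepdims=True)  # (N, 1) distances
    squashed = camera_pos + offsets * (shell_radius / dists)
    return squashed, normals
```

One note on the "triangles facing away" detail: a radial projection preserves each vertex's screen position, and therefore each triangle's screen-space winding, so ordinary hardware back-face culling should still discard the far hemisphere. Front and back surface points along the same camera ray do collapse onto the same shell point, though, so culling back-facing triangles explicitly before projecting may be the safer bet.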
But before I wander off into the weeds, I thought I’d ask here. Is there a standard solution to this problem? Something simple I haven’t considered involving two cameras or custom shaders or some such?
Thanks,
- Joe