This may seem like an odd problem, but I have to solve it anyway.
I have a sphere textured with a 360° video. The sphere is placed inside a 3D model of a building, and the player is inside the sphere. What I want to do is project what the player sees onto the model behind the sphere.
My first idea was to somehow shoot a ray from the player through the sphere onto the geometry behind it, and project each UV coordinate (or pixel color) onto that geometry.
Can anyone give me a hint on how to start with something like that?
Step 1: Delete the sphere. You don’t need it.
Step 2: Use a custom shader that uses equirectangular UVs to map the 360 video onto the surface.
Equirectangular UVs are the magic term you want to search for. You'll need a custom shader that takes the 360 texture, a 3D position (the center of the sphere you just deleted), and maybe a rotation offset for alignment. Then it's just a matter of calculating the equirectangular UVs from the direction vector from that center position to the mesh's surface position (`normalize(IN.worldPos - _CenterPosition)`), and you've got what you need.
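To make the UV calculation concrete, here's a minimal sketch of the equirectangular math in plain Python. The function name and the `rotation` parameter are just illustrative; the idea is that you'd port these two formulas into your Unity shader, where `world_pos` corresponds to `IN.worldPos` and `center` to `_CenterPosition`:

```python
import math

def equirect_uv(world_pos, center, rotation=0.0):
    """Map a world-space surface point to equirectangular UVs,
    as seen from `center` (the old sphere's position).
    `rotation` is a horizontal offset in [0, 1) for aligning the video."""
    # Direction from the projection center to the surface point,
    # i.e. normalize(worldPos - _CenterPosition) in the shader.
    dx, dy, dz = (w - c for w, c in zip(world_pos, center))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    dx, dy, dz = dx / length, dy / length, dz / length
    # Longitude (around the vertical axis) -> U in [0, 1]
    u = 0.5 + math.atan2(dx, dz) / (2.0 * math.pi)
    # Latitude (up/down angle) -> V in [0, 1]
    v = 0.5 + math.asin(dy) / math.pi
    return ((u + rotation) % 1.0, v)

# A point straight ahead (+Z) lands in the middle of the texture.
print(equirect_uv((0.0, 0.0, 1.0), (0.0, 0.0, 0.0)))  # -> (0.5, 0.5)
```

In HLSL the same thing is roughly `float2(0.5 + atan2(dir.x, dir.z) / (2 * UNITY_PI), 0.5 + asin(dir.y) / UNITY_PI)` after normalizing the direction; the exact axis conventions depend on how your 360 video was stitched, which is what the rotation offset is for.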
I guess there’s the minor Step 1.5 of learning how to write shaders for Unity, but there are plenty of resources for that around.