I have created a depth map in Maya 2016 from a certain scene. I want to extract the per-pixel distance values from that scene and use them in Unity to recreate the scene from this data.
I am not sure whether I should work with it only in Unity, or get help from Maya's renderer.
I would then use this gathered data to create a custom output MESH:
(Unity - Scripting API: Mesh)
Starting with a simple x/y-plane mesh, I would offset each vertex's z-coordinate by the depth-map value at its raycast. Rather than use the UV values of the mesh to reference a texture, I would simplify the process a bit by using a vertex shader to display the mesh. This allows us to colorize each vertex of the mesh with the color detected at the end of the raycast (Unity - Scripting API: Mesh.colors).
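A minimal sketch of that idea, assuming the depth map and the color image are available as readable Texture2D assets (Read/Write enabled in their import settings) — the field names, the `depthScale` factor, and reading depth from the red channel are all my assumptions, not anything Unity prescribes. It skips the actual raycasting and just samples the two textures at each vertex's normalized position, which is equivalent for a flat source image. Note that Mesh.colors only shows up visually if the assigned material's shader actually reads vertex colors:

```csharp
using UnityEngine;

// Hypothetical sketch: build a plane mesh displaced along Z by a depth map,
// with per-vertex colors taken from the original rendered image.
[RequireComponent(typeof(MeshFilter))]
public class DepthMeshBuilder : MonoBehaviour
{
    public Texture2D depthTex;    // grayscale depth map (assumed readable)
    public Texture2D colorTex;    // original scene render (assumed readable)
    public int resolution = 64;   // vertices per side; raise toward 1 per pixel
    public float depthScale = 5f; // assumed world-space depth range

    void Start()
    {
        var mesh = new Mesh();
        // 32-bit indices so high resolutions can exceed 65k vertices
        mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32;

        int n = resolution;
        var verts  = new Vector3[n * n];
        var colors = new Color[n * n];

        for (int y = 0; y < n; y++)
        for (int x = 0; x < n; x++)
        {
            int i = y * n + x;
            float u = x / (float)(n - 1);
            float v = y / (float)(n - 1);
            // depth assumed encoded in the red channel, 0..1
            float depth = depthTex.GetPixelBilinear(u, v).r;
            verts[i]  = new Vector3(u, v, depth * depthScale);
            colors[i] = colorTex.GetPixelBilinear(u, v);
        }

        // two triangles per grid cell
        var tris = new int[(n - 1) * (n - 1) * 6];
        int t = 0;
        for (int y = 0; y < n - 1; y++)
        for (int x = 0; x < n - 1; x++)
        {
            int i = y * n + x;
            tris[t++] = i;     tris[t++] = i + n; tris[t++] = i + 1;
            tris[t++] = i + 1; tris[t++] = i + n; tris[t++] = i + n + 1;
        }

        mesh.vertices  = verts;
        mesh.colors    = colors;
        mesh.triangles = tris;
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

The `resolution` field is exactly the knob mentioned below: at low values you get a coarse, faceted relief; pushing it toward one vertex per source pixel approaches the "1 raycast per pixel" case, at the cost of vertex count.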
Since the result is a single mesh, I would be interested to see the imperfect results as the resolution of the output mesh (and number of raycasts) changes. Best results would be, I suspect, 1 raycast per pixel of the original scene image.
The thing is, my input is pictures that I can populate a skybox with. That means I either bring the 6 pictures with their 6 depth maps into Unity and work from there somehow, or I have to generate depth maps from these 6 pictures directly in Unity, if there is a way.
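For the six-face case, one way to work from the imported maps is to pack them into a Cubemap asset and sample it by direction, as a raycast from the skybox center would. Unity's Cubemap.GetPixel only takes a face and integer coordinates, so the direction-to-face mapping has to be done manually; the sketch below uses the standard major-axis face selection. The class name is mine, and the exact u/v orientation per face may need flipping depending on how Unity lays out your particular cubemap faces:

```csharp
using UnityEngine;

// Hypothetical sketch: sample a (readable) cubemap by world-space direction,
// e.g. to read a depth or color value for a ray leaving the skybox center.
public static class CubeSampler
{
    public static Color Sample(Cubemap cube, Vector3 dir)
    {
        float ax = Mathf.Abs(dir.x), ay = Mathf.Abs(dir.y), az = Mathf.Abs(dir.z);
        CubemapFace face;
        float u, v, ma; // ma = magnitude of the dominant axis

        if (ax >= ay && ax >= az)
        {
            ma   = ax;
            face = dir.x > 0 ? CubemapFace.PositiveX : CubemapFace.NegativeX;
            u    = dir.x > 0 ? -dir.z : dir.z;
            v    = -dir.y;
        }
        else if (ay >= az)
        {
            ma   = ay;
            face = dir.y > 0 ? CubemapFace.PositiveY : CubemapFace.NegativeY;
            u    = dir.x;
            v    = dir.y > 0 ? dir.z : -dir.z;
        }
        else
        {
            ma   = az;
            face = dir.z > 0 ? CubemapFace.PositiveZ : CubemapFace.NegativeZ;
            u    = dir.z > 0 ? dir.x : -dir.x;
            v    = -dir.y;
        }

        // remap from [-1,1] on the dominant face to [0,1] texture coordinates
        u = (u / ma + 1f) * 0.5f;
        v = (v / ma + 1f) * 0.5f;

        int size = cube.width;
        return cube.GetPixel(face, (int)(u * (size - 1)), (int)(v * (size - 1)));
    }
}
```

With this in place, the single-plane approach above generalizes: instead of one plane, distribute vertices over a sphere (or six planes) around the center, and for each vertex direction read depth and color from the two cubemaps to displace and colorize it.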