Read back texture with projection from mesh

Hi!

In my scene, I've got some objects (planes and spheres, for example). I have also set up two projectors with a custom shader to project RGB textures onto the objects in the scene. All of this works well, both when moving the projectors and when moving the objects.

Now I would like to read back the end result that's visible on the objects as a texture, in realtime and taking into account the UV mapping of the objects. Would this be possible? Anyone?
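(For reference, the projection side of this setup boils down to the usual projective-texturing math: transform the fragment's world position into the projector's clip space, then perspective-divide and sample. A minimal HLSL/CG sketch, where _ProjectorVP and _ProjTex are placeholder names for a matrix and texture set from a script:)

    // Core of projective texturing, fragment-shader side.
    sampler2D _ProjTex;      // the RGB texture being projected
    float4x4 _ProjectorVP;   // projector camera's view * projection matrix

    fixed4 SampleProjector(float3 worldPos)
    {
        // Transform the surface point into the projector's clip space.
        float4 p = mul(_ProjectorVP, float4(worldPos, 1.0));
        // Perspective divide, then remap from clip space (-1..1) to UVs (0..1).
        float2 uv = (p.xy / p.w) * 0.5 + 0.5;
        return tex2D(_ProjTex, uv);
    }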

Not really
(Note: not a computer engineer but…)
The texture projection happens on the GPU (in shaders), after the data (mesh, etc.) has been sent over from the CPU. There's no way to get that result back afterwards (because it's now just pixels) for you to do anything with it via scripting.

This is what RenderTextures are for, but a RenderTexture only gives you the rendered pixels; it doesn't preserve things like the UV mapping of the objects.

Another way could be to duplicate each mesh into an offscreen section of the scene, then use a custom shader that:

  • Projects the texture according to the transform of the original mesh, transformed by the view-projection matrix of a "projector" camera, so that the texture appears projected onto the object
  • Unwraps the mesh by its UVs, laying it out as a flat object

When I render this offscreen section to a RenderTexture, I get the projected, unwrapped texture for that object, which I can also put back on the original object in the original scene to visualize the result.

Is this method sound? Would anyone like to help with some (shader) code? A rough sketch of what I have in mind follows.
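This is a minimal, untested sketch for Unity's built-in pipeline. _ProjTex, _ProjectorVP (the projector camera's view-projection matrix), and _OriginalObjectToWorld (the original mesh's localToWorld) are assumed names that you would set from a script:

    Shader "Custom/UnwrapAndProject"
    {
        Properties
        {
            _ProjTex ("Projected Texture", 2D) = "white" {}
        }
        SubShader
        {
            Cull Off ZWrite Off ZTest Always
            Pass
            {
                CGPROGRAM
                #pragma vertex vert
                #pragma fragment frag
                #include "UnityCG.cginc"

                sampler2D _ProjTex;
                float4x4 _ProjectorVP;            // projector camera's view-projection (set from script)
                float4x4 _OriginalObjectToWorld;  // original mesh's localToWorld (set from script)

                struct v2f
                {
                    float4 pos     : SV_POSITION;
                    float4 projPos : TEXCOORD0;
                };

                v2f vert (appdata_base v)
                {
                    v2f o;
                    // Unwrap: place each vertex at its UV coordinate,
                    // remapped from (0..1) to clip space (-1..1).
                    float2 uvClip = v.texcoord.xy * 2.0 - 1.0;
                    // Flip Y when the projection is flipped
                    // (e.g. when rendering into a RenderTexture on D3D).
                    uvClip.y *= _ProjectionParams.x;
                    o.pos = float4(uvClip, 0.0, 1.0);

                    // Project: find where this vertex sits in the projector's clip
                    // space, using the ORIGINAL object's transform, not the duplicate's.
                    float4 worldPos = mul(_OriginalObjectToWorld, v.vertex);
                    o.projPos = mul(_ProjectorVP, worldPos);
                    return o;
                }

                fixed4 frag (v2f i) : SV_Target
                {
                    // Perspective divide, then remap clip space (-1..1) to UVs (0..1).
                    float2 uv = (i.projPos.xy / i.projPos.w) * 0.5 + 0.5;
                    // Outside the projector frustum (or behind it): transparent black.
                    if (i.projPos.w <= 0.0 || any(uv < 0.0) || any(uv > 1.0))
                        return fixed4(0, 0, 0, 0);
                    return tex2D(_ProjTex, uv);
                }
                ENDCG
            }
        }
    }

Rendering the duplicated mesh with a camera that targets a RenderTexture should then give the projection baked into the object's UV layout. If that camera clears to transparent and only sees the duplicate (via its culling mask), the output contains nothing but the unwrapped result.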

I am facing a similar problem. I was able to get the result of a projection into a render texture, but I don't want the mesh in the result. I just want the UV-mapped output, without the mesh visible in the background of the result.