Texturing ARMeshManager Mesh

I’ve seen a few discussions on this with no solution as of yet and I’ve spent a week looking for as much info as I could find before coming here to ask for help.

I have built an app that allows an iPad Pro with LiDAR to scan a room and export the mesh as a .obj. Right now I follow this process:

  1. The user scans the room and I apply a blank white texture to the mesh.
  2. Once the initial scan is complete, the user has to rescan the room; during this second pass I convert the mesh vertices to screen-space coordinates and apply the pixel color from the camera image to each vertex color.
  3. The object is exported to .obj with vertex colors.
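For anyone curious what step 2 looks like in practice, here's a minimal sketch of the math in Python (in Unity you'd use `Camera.WorldToScreenPoint` instead, but the underlying projection is the same). The matrix convention (column vectors, OpenGL-style projection) and the top-left pixel origin are assumptions, not anything the app necessarily uses:

```python
import numpy as np

def project_to_pixel(vertex, view, proj, width, height):
    """Project a world-space vertex to pixel coordinates via view and
    projection matrices (column-vector convention, OpenGL-style NDC)."""
    v = np.append(vertex, 1.0)       # homogeneous coordinates
    clip = proj @ view @ v           # world -> clip space
    if clip[3] <= 0:                 # behind the camera
        return None
    ndc = clip[:3] / clip[3]         # normalized device coords in [-1, 1]
    x = int((ndc[0] * 0.5 + 0.5) * (width - 1))
    y = int((ndc[1] * 0.5 + 0.5) * (height - 1))
    if not (0 <= x < width and 0 <= y < height):
        return None                  # projects outside the frame
    return x, y

def sample_vertex_color(vertex, image, view, proj):
    """Return the camera-image pixel color for a vertex, or None if
    the vertex projects outside the frame."""
    h, w = image.shape[:2]
    pix = project_to_pixel(vertex, view, proj, w, h)
    return None if pix is None else image[pix[1], pix[0]]
```

The key limitation, as noted below, is that this gives you exactly one color sample per vertex, so the result can never be sharper than the mesh density.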

This obviously results in a very low-resolution image on the mesh, since color detail is limited to one sample per vertex. So I'm looking for a way to apply the camera image to the mesh directly, but I'm at a loss.

I’ve looked at using projectors and the Easy Decal asset from the store, but neither works for this scenario: both project straight through the mesh rather than stopping at the first surface they hit. I really thought the projector would work, since I can capture the camera position, rotation, FOV, clipping planes, etc., match the projector’s properties to them, and project the camera image from various poses to paint the mesh that way.
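The "goes through the mesh" behavior is exactly what you get when the projection has no depth test. The usual fix is shadow-mapping style: render a depth map from the projecting camera's pose, and only paint a point if its depth along the camera ray matches the first surface the camera sees at that pixel. A minimal sketch of just the comparison, assuming you already have a depth map and the vertex's pixel coordinates (all names here are hypothetical, not from any asset's API):

```python
def occluded(depth_map, px, py, vertex_cam_depth, bias=1e-3):
    """Shadow-map style occlusion test.

    depth_map[py, px] holds the distance to the first surface the camera
    sees through that pixel; a vertex farther away than that is behind
    another surface and must not receive the projected color. The small
    bias avoids self-occlusion artifacts ("shadow acne")."""
    return vertex_cam_depth > depth_map[py, px] + bias
```

In Unity terms this would mean rendering a depth texture from the captured camera pose and sampling it in whatever shader or script does the painting, so only first-hit surfaces get the image.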

I’m thinking there may be some custom shader work needed to make this happen, but I’m really not sure where to start or where to look.

Ideally I’d like to be able to export the mesh along with a texture mapped to it, with the texture assembled from multiple images taken from the camera, or something along those lines. I’m just not sure.
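One common approach to the "texture assembled from multiple camera images" idea is per-face view selection: for each triangle, pick the captured frame that saw it most head-on, then use that frame's projected coordinates as the triangle's UVs into an atlas of the frames. A tiny sketch of just the view-selection score, assuming you have the triangle's world-space normal and each frame's camera forward vector (this is one possible heuristic, not an established API):

```python
import numpy as np

def best_frame_for_triangle(normal, camera_forwards):
    """Pick the index of the frame whose view direction is most opposed
    to the triangle normal, i.e. the frame that sees the face most
    head-on. Both inputs are normalized before comparison."""
    normal = normal / np.linalg.norm(normal)
    scores = [-np.dot(normal, f / np.linalg.norm(f)) for f in camera_forwards]
    return int(np.argmax(scores))
```

From there, the selected frame's projection of the triangle's vertices (the same math as in step 2 above) becomes the UVs, and seams between frames are the hard remaining problem.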

Thanks for any help you can offer! :slight_smile:

Hi there, I’m new to Unity. I just wanted to know how you managed to save the rendered mesh data to an .obj file. I’m not that well versed in scripting as of yet.

I’d highly appreciate the help.
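Not speaking for the OP, but the .obj format itself is plain text, which is why it's a popular export target: one `v x y z` line per vertex (with a widely supported but unofficial `r g b` vertex-color extension appended) and one `f` line per triangle using 1-based indices. A minimal sketch in Python; in Unity you'd iterate `Mesh.vertices`, `Mesh.colors`, and `Mesh.triangles` and write the same lines with a `StreamWriter`:

```python
def export_obj(path, vertices, colors, triangles):
    """Write a minimal .obj file: vertex positions with per-vertex RGB
    appended (unofficial vertex-color extension), then 1-based triangle
    face indices."""
    with open(path, "w") as f:
        for (x, y, z), (r, g, b) in zip(vertices, colors):
            f.write(f"v {x} {y} {z} {r} {g} {b}\n")
        for a, b, c in triangles:
            # .obj face indices start at 1, not 0
            f.write(f"f {a + 1} {b + 1} {c + 1}\n")
```

Note that not every viewer honors the vertex-color extension, so it's worth checking your target tool (Blender and MeshLab both read it).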