Hello everyone,
I’m working on a 3D mapping project for surface-projected prototypes using ArUco markers. My setup includes a webcam, a projector, and multiple markers. The projector is mounted vertically alongside the camera, but the camera is tilted slightly so that it captures the entire game space while I mirror my laptop screen to the projector.
- I’m using two ArUco markers as fixed reference points and a third marker on top of a cube onto which the visuals are projected.
- I need to convert real-world coordinates into Unity’s coordinate system.
- I want to develop a system that tracks the ArUco markers, determines the position of the marker on top of the cube relative to the reference markers, and ensures that the visuals in Unity follow the cube as it moves.
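For reference, here is how I currently picture the relative-tracking step — a minimal sketch, assuming each marker's pose in the camera frame is available as a 4x4 homogeneous transform (e.g. built from the rvec/tvec that OpenCV's pose estimation returns; the poses below are placeholders):

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_cam_ref, T_cam_cube):
    """Pose of the cube marker expressed in the reference marker's frame."""
    return np.linalg.inv(T_cam_ref) @ T_cam_cube

# Hypothetical poses: both markers face the camera 0.5 m away,
# with the cube marker 0.2 m to the right of the reference marker.
T_cam_ref = make_pose(np.eye(3), [0.10, 0.0, 0.5])
T_cam_cube = make_pose(np.eye(3), [0.30, 0.0, 0.5])

T_ref_cube = relative_pose(T_cam_ref, T_cam_cube)
print(T_ref_cube[:3, 3])  # cube translation relative to the reference marker
```

My assumption is that this relative transform is what should drive the Unity object, since it is independent of where the camera happens to be.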
1. I have taken several measurements, but I am unsure where to apply them in my system. Could you clarify which measurements are necessary and where they should be incorporated? Additionally, why is it not feasible to determine the position using the two ArUco markers alone, without additional measurements?
2. Why is it necessary to measure the relative position (offset) between the camera and the projector? In what part of the system or calculations will this measurement be used?
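My current understanding, which I’d like confirmed, is that this offset is the extrinsic transform between the two devices: the camera gives me marker positions in the camera frame, but the projector needs them in its own frame before anything is rendered. A sketch under the simplifying assumption that the two devices have aligned axes, so the transform is a pure translation (the offset value is a placeholder, not my real measurement):

```python
import numpy as np

def cam_to_proj(p_cam, proj_offset_in_cam):
    """Re-express a 3D point from the camera frame in the projector frame.

    Assumes camera and projector axes are aligned (no relative rotation),
    so the change of frame is a pure translation by the measured offset.
    """
    return np.asarray(p_cam) - np.asarray(proj_offset_in_cam)

# Placeholder measurement: projector 0.10 m above the camera
# (negative y in a y-down camera convention).
proj_offset = [0.0, -0.10, 0.0]
p_cam = [0.0, 0.0, 1.0]          # marker 1 m in front of the camera
p_proj = cam_to_proj(p_cam, proj_offset)
print(p_proj)
```

Since my camera is actually tilted relative to the projector, I assume a relative rotation would also have to be applied before the translation — is that where the tilt measurement comes in?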
3. I’m encountering inconsistencies in Unity’s scaling system. When I input real-world measurements, the object positions do not always translate as expected. For example, setting an object at 0.15 m moves it only slightly from the origin, while another object at a different position moves the expected distance. Could you provide insight into why this might be happening and how to properly manage scaling in Unity?
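To illustrate what I mean: my working theory is that a non-unit scale on a parent transform rescales a child’s local position, so a real-world measurement would have to be divided by the parent’s scale before being assigned. A sketch of that conversion (written as Python pseudocode here; `parent_scale` stands in for the parent transform’s world scale, and the values are placeholders):

```python
def meters_to_local_position(world_offset_m, parent_scale):
    """Convert a real-world offset in metres to a local position under a
    scaled parent transform.

    Unity's convention is 1 unit = 1 metre, but a scaled parent shrinks
    or stretches the child's local coordinates by that scale factor,
    so the metric offset must be divided by the parent's scale per axis.
    """
    return [m / s for m, s in zip(world_offset_m, parent_scale)]

# Placeholder: parent scaled to 0.5 on every axis; a 0.15 m real-world
# move then needs a local position of 0.30 to cover the same distance.
local = meters_to_local_position([0.15, 0.0, 0.0], [0.5, 0.5, 0.5])
print(local)
```

Does that explanation match what Unity is actually doing, or is there another scaling factor I’m missing?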
4. How can I accurately position the camera in Unity to match its real-world placement? I tried setting the same field of view (FOV) as the physical camera, but the scale in Unity makes the FOV appear excessively wide. How should I approach this to maintain correct proportions?
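For question 4, here is how I tried to check my numbers: as far as I know, Unity’s camera FOV setting is the vertical field of view in degrees, so I computed it from the webcam’s pinhole intrinsics (the focal length and image height below are placeholders standing in for my calibration values):

```python
import math

def vertical_fov_deg(fy, image_height):
    """Vertical field of view implied by a pinhole camera's vertical focal
    length fy (pixels) and image height (pixels): fov = 2 * atan(h / (2 * fy))."""
    return math.degrees(2.0 * math.atan(image_height / (2.0 * fy)))

# Placeholder intrinsics from a 640x480 calibration.
fov = vertical_fov_deg(fy=600.0, image_height=480.0)
print(fov)  # ~43.6 degrees for these placeholder values
```

If this is the right formula, I don’t understand why the result still looks too wide in my scene — which makes me suspect the scale problem from question 3 and the FOV problem are related.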
I’d like to know whether real-world measurements are strictly necessary and what steps I should take to make this system work reliably.