Hello, I’m considering a project for my robotics degree (described below) and I’m a little lost as to where to start looking for articles to see what is possible.
In short, what I’m thinking of doing is remotely controlling a robot arm from another room, using a VR headset to see what the arm is doing.
The simplest way of doing this is having a camera next to or on the robot arm that I view through the VR headset, but that’s not much better than just looking at a screen.
What would be better is using lidar to add the depth element, to aid depth perception when picking up objects with the arm: somehow creating a 3D VR environment in real time from the feed of a lidar/depth camera. For this, I have an old Xbox Kinect sensor and a new iPhone.
From the iPhone perspective, I think this would be pretty difficult to implement, as I have no experience developing apps and I expect there is nothing out of the box I could easily use. My feeling is that far more university projects have used the Xbox Kinect, so there may be more examples of how this can be done with it.
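From what I’ve pieced together so far, grabbing a Kinect depth frame and back-projecting it into a point cloud might look roughly like the sketch below (untested, assuming the libfreenect Python bindings and typical Kinect v1 intrinsics, which I’d need to verify):

```python
import freenect
import numpy as np

FX, FY = 594.2, 591.0   # approximate Kinect v1 depth focal lengths (pixels)
CX, CY = 320.0, 240.0   # principal point of the 640x480 depth image

def grab_point_cloud():
    # sync_get_depth() returns (depth_map, timestamp); the DEPTH_REGISTERED
    # format gives depth in millimetres aligned to the RGB camera.
    depth_mm, _ = freenect.sync_get_depth(format=freenect.DEPTH_REGISTERED)
    z = depth_mm.astype(np.float32) / 1000.0   # convert to metres

    # Back-project every pixel through the pinhole camera model.
    v, u = np.indices(z.shape)
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY

    points = np.dstack((x, y, z)).reshape(-1, 3)
    return points[points[:, 2] > 0]   # drop pixels with no depth reading

if __name__ == "__main__":
    cloud = grab_point_cloud()
    print(f"{len(cloud)} valid points this frame")
```

At 640x480 that’s up to ~300k points per frame, so I assume I’d have to downsample heavily to render it in VR at a usable frame rate.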
Does anyone know whether this has been done before, how feasible it is, and of any similar examples?
To clarify, it’s a Bachelor’s degree. I have lots of CAD experience but no Unity experience as of yet. I’m not too concerned about how to control the robot; assume there would be some outputs from Unity I can use, or a separate system for that. It’s building a real-time 3D model for a VR view of the scene that I think will be the challenge.
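For getting the data into Unity, my rough plan would be to stream each frame over a socket and rebuild the cloud on the Unity side. Something like this is what I have in mind for the sender (a sketch only; the port and packet layout are my own guesses, not anything Unity prescribes):

```python
import socket
import struct
import numpy as np

UNITY_ADDR = ("127.0.0.1", 9000)   # hypothetical address/port of the Unity app
MAX_POINTS = 4096                  # downsample so one frame fits in a datagram

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_frame(points: np.ndarray) -> None:
    # Randomly subsample the cloud; 4096 float32 xyz points is ~48 KB.
    if len(points) > MAX_POINTS:
        idx = np.random.choice(len(points), MAX_POINTS, replace=False)
        points = points[idx]
    header = struct.pack("<I", len(points))    # little-endian point count
    payload = points.astype("<f4").tobytes()   # packed float32 x, y, z triples
    sock.sendto(header + payload, UNITY_ADDR)
```

The Unity side would then need a small C# script listening on that port and redrawing the points each frame, e.g. as a particle system or mesh, which is the part I’d have to learn from scratch.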