Object / Motion Tracking

Hello,
I’m looking for a solution to track the motion of a drone. The movement in physical space should be mirrored on a virtual camera in Unity in real time. I can’t use a Vive Tracker since it would be too heavy for the drone, and a 2D image marker would be too small to be visible since the drone will be 10 m away from the camera.
Does anyone have a hint for an “easy” solution?

An LED pattern or IR LED pattern on the drone plus OpenCV comes to mind. Basically the same way VR trackers do it.
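To make the LED idea concrete, here is a minimal sketch of picking out bright LED blobs in a grayscale frame from an IR-sensitive camera. The threshold value and frame source are assumptions; real code would use OpenCV’s `cv2.SimpleBlobDetector` or `cv2.connectedComponents` instead of this naive flood fill:

```python
import numpy as np

def find_led_blobs(frame, threshold=200):
    """Return (row, col) centroids of bright blobs in a grayscale frame.

    Naive approach: threshold the frame, then group bright pixels by
    4-connectivity with a depth-first flood fill and average each group.
    """
    mask = frame >= threshold
    visited = np.zeros_like(mask)
    centroids = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if mask[r, c] and not visited[r, c]:
                # flood fill to collect all pixels of this blob
                stack, pixels = [(r, c)], []
                visited[r, c] = True
                while stack:
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

With an IR LED and an IR-pass filter on the lens, the LEDs are usually the only bright thing in the frame, which is what makes this kind of simple thresholding viable.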

VR trackers (at least those with good tracking) do the opposite: the base stations send out IR, and sensors on the tracked objects measure it.

Thanks! I was already thinking about OpenCV and will definitely have a deeper look into it. Unfortunately I don’t have much experience with affordable IR sensors besides the Kinect, and I think the Kinect’s IR sensor can only cover a small range.
Do you think it’s possible to achieve a reasonable result with an RGB camera + image segmentation/blob detection? Is there a “user-friendly” ML method for real-time object tracking in Unity?

Don’t think ‘sensor’, think ‘camera’.

As in a ‘night vision camera’ or ‘game camera’ or whatever; there are a bazillion of them on Amazon.

You could possibly even offload the OpenCV bit to a Raspberry Pi with a NoIR camera.
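If the Pi does the detection, it still has to get the coordinates to the machine running Unity. One common pattern is a small UDP datagram per frame; a lost packet just means one stale update. A sketch of the Pi side (host/port and the JSON field names are placeholders that would have to match a listener you write on the Unity side, e.g. with `UdpClient`):

```python
import json
import socket

def send_position(sock, addr, x, y, z):
    """Send one tracked position as a small JSON datagram to (host, port).

    Each update fully replaces the last one, so UDP's lack of delivery
    guarantees is acceptable for this kind of real-time mirroring.
    """
    payload = json.dumps({"x": x, "y": y, "z": z}).encode("utf-8")
    sock.sendto(payload, addr)

# usage on the Pi, inside the OpenCV tracking loop (placeholder address):
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_position(sock, ("192.168.1.50", 9000), x, y, z)
```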

Image segmentation/blob detection on RGB alone? No, I believe that’s definitely NOT possible. Same goes for ML: too computationally expensive for real time.

What you need to do is detect a pattern formed by LEDs and from that derive the location/orientation of the object…

Also, it is not a sensor but an IR LED, meaning it glows. IR LEDs are invisible to the eye but very visible to the camera, especially if it has a night vision mode. So the camera sees the LEDs and from them deduces the object’s position.
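A full 6-DOF pose needs something like `cv2.solvePnP` with four or more LEDs in a known layout, but the basic idea of deducing position from LEDs can be sketched with a plain pinhole model: two LEDs a known physical distance apart give you depth from their pixel separation, and depth lets you back-project any pixel to metric X/Y. All the numbers below (focal length in pixels, LED spacing) are illustrative assumptions:

```python
def led_depth(f_px, led_spacing_m, pixel_spacing_px):
    """Distance to the drone from two LEDs a known distance apart.

    Pinhole model: Z = f * D / d, where f is the focal length in pixels,
    D the real LED spacing in metres, d the measured spacing in pixels.
    """
    return f_px * led_spacing_m / pixel_spacing_px

def pixel_to_xy(u, v, cx, cy, f_px, z):
    """Back-project a pixel (u, v) to metric X/Y at known depth z.

    (cx, cy) is the image centre (principal point).
    """
    return ((u - cx) * z / f_px, (v - cy) * z / f_px)
```

For example, with an 800 px focal length, LEDs 20 cm apart that appear 16 px apart in the image are about 10 m away, which matches the distance mentioned at the top of the thread.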

Here’s an older oculus controller, for example:

Those glowing dots are invisible to the human eye. They’re infrared.


I think a bright IR LED on the drone for a start. Actually several, so that it can be seen from different angles. The only thing I think could accurately do this is the reverse of what I’ve done for a ballet: the camera was directly above and could track the dancers below in 2D with a standard webcam. I knew the height of the camera, so I didn’t need depth information, which isn’t really possible to get from a single webcam anyway.

For a drone, I would imagine at least a two-camera setup, one directly below aimed up. Hopefully the sun is not visible, or that would ruin things. That gives you XZ coordinates. More cameras facing sideways could approximate Y. It’s not going to be great because of lens distortion, and I guess you’d have to compensate for perspective as the drone gets closer to the Y camera, since it will appear higher in the image in 2D.
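The geometry of that two-camera idea can be sketched with simple pinhole math. The overhead camera’s pixel-to-metres scale depends on how far below it the drone is, so the side camera’s altitude estimate feeds into the XZ calculation; in practice you would iterate a couple of times or solve jointly, and all calibration numbers here are made up for illustration:

```python
def overhead_xz(u, v, cx, cy, f_px, camera_height_m, drone_altitude_m):
    """Upward-facing camera on the ground: map a pixel to ground-plane X/Z.

    The effective depth is the vertical distance from the camera to
    the drone, so the altitude estimate from the side camera is needed.
    """
    depth = camera_height_m - drone_altitude_m
    return ((u - cx) * depth / f_px, (v - cy) * depth / f_px)

def side_altitude(v, cy, f_px, horizontal_dist_m, camera_height_m):
    """Side camera: estimate altitude (Y) from the vertical pixel offset.

    Scaling by the horizontal distance to the drone is the perspective
    compensation mentioned above: the closer the drone, the higher it
    appears in the image for the same altitude.
    """
    return camera_height_m + (cy - v) * horizontal_dist_m / f_px
```

This ignores lens distortion entirely; for anything serious you would first undistort with a calibrated camera model (OpenCV’s `calibrateCamera`/`undistortPoints`).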

An Intel RealSense camera has an actual depth channel, and one version is fairly tunable for longer distances, I think up to 20 m or so. A drone is going to be pretty small, though, so it might have trouble tracking it. With a depth camera you wouldn’t need the IR LEDs; I believe the camera sprays out IR dots and tracks what those dots hit. I do believe sunny conditions could make it fail, though. I know for a fact a Kinect will not work outdoors at all because of this.

Do drones have any data about where they are? It might make sense to mount sensors on the drone and have the drone tell you where it is. I would think an ultrasonic distance sensor pointed down could give you decent altitude readings. It’s possible the sound of the drone interferes, so maybe investigate the sensors drones already use to track themselves.