
Real-time Drone object tracking using Python and OpenCV - perone
http://blog.christianperone.com/?p=2768
======
hemangshah
There are visual tracking techniques that are not only more stable and robust
than meanshift, but can also handle occlusions, adapt to changes in appearance
models, etc. Have a look at the VOT challenge [1] for the state of the art in
visual tracking.

[1]
[http://www.votchallenge.net/vot2014/download/vot_2014_paper.pdf](http://www.votchallenge.net/vot2014/download/vot_2014_paper.pdf)

~~~
perone
Thanks for the link, I'm aware of some of the methods cited. This first PoC
was just to check how fast and how well meanshift would work with that image
quality. I'll try to implement a more robust approach, especially some method
that can handle occlusions.

~~~
hemangshah
Christian, please also be aware that some of these techniques do not handle
changes in the scale of the tracked object. If you are working with aerial
footage then this might be an important consideration. I've recently worked on
a similar problem, so if you need any help, please feel free to email me (my
email address is in my profile).

~~~
perone
Thanks for the feedback! I'm planning to use more robust, scale-invariant
features for this, so the problem of different zoom levels would be solved (at
least within the limits of what can be done).

------
adriancooney
This isn't exactly "real time", since he loads an MP4 file
(`cv2.VideoCapture('upabove.mp4')`). I think it would be a lot more impressive
if the drone streamed the video to a server of some kind, processed it there,
and then made decisions about its flight path.
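The change on the OpenCV side would actually be small, since `cv2.VideoCapture` accepts stream URLs as well as file paths. A sketch, assuming a hypothetical RTSP feed from the drone (the URL and the frame callback are made up for illustration):

```python
import cv2

def open_stream(source):
    """Open a video source: a local file path or a network stream URL."""
    return cv2.VideoCapture(source)

def process(cap, handle_frame):
    # Read frames as they arrive and hand each one to the tracking code.
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        handle_frame(frame)
    cap.release()

# Hypothetical drone feed instead of the blog post's local MP4:
# cap = open_stream('rtsp://192.168.1.1/live')
# process(cap, run_tracker_on_frame)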

~~~
infinitone
That's what I was thinking too. Also, isn't this sort of thing pretty common
now? As in, aren't there startups that have drones that follow you to take
cool video of you [1]. I'd imagine they are doing some sort of realtime
tracking.

[1] [https://www.airdog.com/](https://www.airdog.com/)

~~~
andymoe
None of the "follow me" drones folks are really doing object tracking yet.
They basically just "fake it" by sending a bunch of guided waypoints with the
offset of the phone or bracelet GPS location in airdogs case and hope it looks
OK because of the really wide angle lens of the GoPro. There is a _lot_ of
room for improvement in this area that could be made by feeding that realtime
tracking data back into the flight controller as you suggest.

The one startup in the drone space that is really putting computer vision
stuff to work on drones is [http://skyd.io](http://skyd.io) \- some ex google
project wing folks. The demos are freaking amazing. Wait long enough on the
website background video and you can see a six foot wingspan UAV flying itself
through a parking garage.

Source: I live an breath drone stuff and write a UAV ground station app for
iOS to go along with our Apple MFi certified hardware for the long range
telemetry connection from iOS. It works with OPs flight controller by the way
:-)

~~~
michaelt
Are there many drones out there carrying enough processing power to do serious
machine vision (SLAM, obstacle avoidance) in real time from the drone's
onboard sensors?

~~~
zo1
I'm sure there is... It's a problem begging to be "solved". The majority of
juice on a drone is taken up by the propellers, not the board running it.
Though, to be fair, at that point "weight" is a bigger concern than
electricity usage.

------
PanMan
This works better than I expected, but I'm wondering how well it will work if
the angle changes (flying around the building) or other parameters change. But
looks impressive for how little code it is!

~~~
zo1
If the angle changes, the details of the building's facade changes as well.
You can see in the code that he isolates the building's details from the
_first frame_. Then compares later images to it to do the "tracking".

