
Chinese scientists unveil mind-controlled drone - kposehn
http://rt.com/news/mind-controlled-drone-china-157/
======
etrautmann
I'm a PhD student in a brain-machine interface lab at Stanford. This
particular demonstration is not all that interesting, since EEG control is
extremely low-bandwidth and very noisy.

Getting one or two degrees of freedom of control to turn a quadrotor is
possible, but will never become a robust or fast method of control. The
information content required from these EEG signals simply isn't there, and
what is there is frequently swamped by any muscle movements like blinking,
turning your head, etc.

It's possible to get much higher bandwidth and robustness with a cortical
implant [1]. These provide single-neuron sensitivity and make it possible to
record from several hundred neurons simultaneously, and achieve bitrates of
6-7 bps.

[1] <http://www.blackrockmicro.com/>
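For context, per-selection bitrates like this are usually computed with Wolpaw's information transfer rate formula: B = log2(N) + P·log2(P) + (1−P)·log2((1−P)/(N−1)) bits per selection, scaled by selection rate. A quick sketch — the target counts, accuracies, and selection rates below are illustrative assumptions, not figures from the comment:

```python
import math

def itr_bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Wolpaw information transfer rate per selection, in bits."""
    n, p = n_targets, accuracy
    if p <= 1.0 / n:  # no better than chance
        return 0.0
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits

# Typical EEG speller (assumed): 4 targets, 80% accuracy, one selection per 4 s
eeg = itr_bits_per_selection(4, 0.80) / 4.0
# Hypothetical implant-grade decoder: 26 targets, 95% accuracy, 1 selection/s
implant = itr_bits_per_selection(26, 0.95) / 1.0
print(f"EEG:     {eeg:.2f} bps")
print(f"implant: {implant:.2f} bps")
```

With these assumed numbers the EEG speller lands around a quarter of a bit per second, while the implant-grade decoder lands in the several-bps range — the same order of magnitude as the 6-7 bps cited above.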

~~~
batgaijin
Uh, who the hell do you install that shit on?

Specifically, do people install this sort of stuff on a 'normal' person? Do
they accept volunteers?

~~~
ANTSANTS
Are you kidding? I'd estimate there is a small army of cyberpunk fans out
there who would jump at the chance to use a mind-computer interface to control
a robot.

~~~
devuatl
Couldn't agree more. People do the craziest things for the sake of body art,
so there should be at least a couple of cyberpunk freaks out there with equal
eagerness for modifying their bodies.

------
tsumnia
After some digging, I can confirm that they appear to use the Emotiv headset,
as munin hints at. While I never got around to diving too deep into my
Emotiv headset, the one thing I did notice during my profile training is that
I wouldn't get precise results without forcing some sort of neuromuscular
response (pulling my head slightly back to move the block closer).

I noticed the person in the wheelchair showed control from at least the neck
up, so I assume there was some use of muscles to get that sort of response out
of the quadcopter. Thinking, without muscles, is actually really HARD! I
couldn't get past training without some assistance, and that was with maybe
2-3 actions (imagine ~10 different actions, each needing a unique thought).

It's still amazing work, and I'm glad to see research in the field; I will
definitely try to get a copy of their paper. I'd love to see the next step
be either controlling the wheelchair with good response time or increasing
the number of possible actions.

------
brendn
At first I thought it said "mind control drone" and reached for the tin foil.

------
tsco77
Looks like an EEG neural device. There is lots of interesting work on this in
terms of neural prostheses. As suspect as the source may be, the scholarly
literature suggests this is well within the realm of possibility.

Here's a Swedish study on EEG robot control (I haven't read the article,
but the abstract suggests it's similar to the Chinese study):
[http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=13...](http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=1300798&url=http://ieeexplore.ieee.org/xpls/abs_all.jsp%3Farnumber%3D1300798)

Anyway, while the EEG angle for brain-computer interfaces (BCI) is
interesting, bandwidth limits are a major issue.

Kuiken and his staff have had some success with reinnervation, which also
returns some "tactile sense" capacity, but it is still limited by the
available input channels (in this case, free muscle groups).

[http://www.ted.com/talks/todd_kuiken_a_prosthetic_arm_that_f...](http://www.ted.com/talks/todd_kuiken_a_prosthetic_arm_that_feels.html)

Anyway, interesting stuff. Should be fun to see where this technology goes.

~~~
etrautmann
While this is certainly exciting, it is most likely the end of the road for
EEG, due to the bandwidth limits you describe as well as EMG contamination
of the signal from muscle activity.

The prosthetics community is on track to deliver some amazing new capabilities
in the relatively near future, but this will almost certainly require
microelectrode arrays or potentially ECoG.

See: [http://www.nature.com/news/mind-controlled-robot-arms-show-p...](http://www.nature.com/news/mind-controlled-robot-arms-show-promise-1.10652)

------
tokenadult
From an earlier pair of Hacker News comments about another submission from
this publication:

<http://news.ycombinator.com/item?id=4247829>

 _> The source of the submitted article, rt.com, is not known for careful
journalism.

Understatement of the year._

Just now as I have been searching for other sources, I see that all the
stories on this issue so far are based on a press release, in very similar
language, with the New Scientist blog write-up

[http://www.newscientist.com/blogs/onepercent/2012/08/thought...](http://www.newscientist.com/blogs/onepercent/2012/08/thought-controlled-quadcopter.html)

perhaps being the most cautious in how it relays the press release. So far no
independent journalist has done any reporting from the scene in China.

~~~
insickness
Dude, the video in your second link is awesome! To be honest, I was sorta
impressed with the floating robot alone, never mind the fact that he was
controlling it with his brain.

~~~
pcrh
I am peripherally familiar with the technology used to relay thoughts to
computers. I obviously can't comment on the specifics of the technology
described in the article, but the degree of control displayed in the video is
unlikely to be possible with current widely-known methods of detecting
thought.

This doesn't mean that such a level of control isn't ultimately achievable,
or that they don't have some "secret sauce".

------
fareesh
Does anyone have any experience with an EEG headset like the one described in
the article, that they can recommend? Do they work as audio input devices or
are they independent input devices? Do they usually come with open source
drivers or libraries in case of the latter? Seems like an interesting thing to
tinker with.

------
munin
there's a high school science project where you make your own EEG (not that
hard, really!) and then use it to drive an RC blimp.

it would be even easier with the emotiv, which also gives you other face-
muscle sensors so you can do the eyeblink stuff. with a homebuilt EEG you can
also get a lot of data but there are few, as I understand it, signals that can
be easily 'user influenced'.
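One commonly cited user-influenceable signal is occipital alpha (8-12 Hz), which rises when you close your eyes. A minimal sketch of turning that into a crude one-bit switch via FFT band power — the sample rate, band edges, and threshold here are assumptions, and the demo data is synthetic:

```python
import numpy as np

FS = 256  # assumed sample rate (Hz) of a homebuilt EEG amplifier

def band_power(signal: np.ndarray, fs: int, lo: float, hi: float) -> float:
    """Mean power of `signal` in the [lo, hi] Hz band via an FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())

def eyes_closed(window: np.ndarray, threshold: float = 2.0) -> bool:
    """Crude 1-bit 'switch': alpha (8-12 Hz) power relative to broadband."""
    alpha = band_power(window, FS, 8, 12)
    broadband = band_power(window, FS, 1, 40)
    return alpha / broadband > threshold

# Synthetic demo: white noise vs. noise plus a strong 10 Hz alpha rhythm
rng = np.random.default_rng(0)
t = np.arange(FS * 2) / FS  # 2-second window
noise = rng.normal(0, 1, t.size)
alpha_burst = noise + 5 * np.sin(2 * np.pi * 10 * t)
print(eyes_closed(noise), eyes_closed(alpha_burst))
```

On real recordings you'd want overlapping windows and a per-user calibrated threshold, but the ratio test is the whole trick.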

------
tedsuo
The drones in the video look extremely CG to me. Cool idea though.

