

Tiny Helicopter Piloted By Human Thoughts - smaili
http://www.livescience.com/37160-tiny-helicopter-piloted-by-human-thoughts.html

======
zsombor
Downscaling the myriad of signals from your brain to 64 electrodes controlling
four spatial dimensions seems tricky. I can see that making a fist may turn it
left, but what prevents any other random thought from being interpreted the
same way? I.e., how safe is a bike that usually turns left when you twist the
handlebars left, but turns right instead with odds of 1:1000?

------
droz
Wired ran a story last year on a Chinese team that did the same thing already:

[http://www.wired.com/gadgetlab/2012/08/zhejiang-university-c...](http://www.wired.com/gadgetlab/2012/08/zhejiang-university-china-brain-controlled-quadcopter/)

~~~
curiousducky
They mentioned that in the article.

------
skolos
And here is how to make one yourself: [http://www.instructables.com/id/Brain-Controlled-RC-Helicopt...](http://www.instructables.com/id/Brain-Controlled-RC-Helicopter/)

------
curiousducky
How many directions can EEG control?

If this is just up and down based on your level of concentration, it is
relatively useless. However, if it maps all 6 directions, it is pretty
impressive.

~~~
damoncali
I wonder how many "channels" can be worked out? 2? 20? 2,000? 2,000,000?
Anyone have any background on this? It's fascinating to think of the
complexity of a machine that could be controlled, when compared to the
inherent limitations of pedals, wheels, switches and the like.

~~~
breuderink
I have a background in these so-called brain-computer interfaces. Typically,
researchers know some of the properties of the EEG signal during specific
tasks. Using signal processing and machine learning methods (and sometimes
human training), these tasks can be recognized by 'decoding' the EEG.

Some of these signals are spontaneous (e.g. realizing an error has been made),
some are produced by voluntarily executing some mental task. Currently, the
number of these 'channels' available is limited by 1) the number of detectors
that a lab is willing to build, and 2) how many tasks the user can
simultaneously execute, which is typically very low. If you really want a
number, I would settle for four as the current state of the art.
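To make the decoding idea concrete, here is a toy sketch of the "collect
labelled examples, then decode" pipeline. Everything here is illustrative and
not breuderink's actual method: simulated single-channel EEG where one mental
task shows strong alpha (10 Hz) activity, a bandpower feature, and a threshold
learned from training examples.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 256                  # sampling rate in Hz (assumed)
t = np.arange(fs) / fs    # one second of samples

def make_trial(task):
    """Simulate one second of single-channel EEG.

    Task 'relax' has strong alpha (10 Hz) activity; task 'focus'
    suppresses it -- a toy stand-in for a real mental task.
    """
    noise = rng.normal(0.0, 1.0, fs)
    alpha_amp = 2.0 if task == "relax" else 0.3
    return noise + alpha_amp * np.sin(2 * np.pi * 10 * t)

def alpha_power(trial):
    """Mean bandpower in the 8-12 Hz alpha band via the FFT."""
    spectrum = np.abs(np.fft.rfft(trial)) ** 2
    freqs = np.fft.rfftfreq(len(trial), 1 / fs)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

# Collect labelled training examples, then pick a threshold
# halfway between the two class means.
relax = [alpha_power(make_trial("relax")) for _ in range(50)]
focus = [alpha_power(make_trial("focus")) for _ in range(50)]
threshold = (np.mean(relax) + np.mean(focus)) / 2

def decode(trial):
    """Classify a new trial by comparing its alpha power to the threshold."""
    return "relax" if alpha_power(trial) > threshold else "focus"
```

Real systems use many electrodes, richer features, and proper classifiers, but
the structure (feature extraction plus a model fitted on example recordings) is
the same.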

I have been working on a method to make problem 1) so easy that it can be
solved by laymen just by collecting examples of EEG during the task of
interest. Now we are founding a startup to make this happen commercially :).

PS: I think this technology does not lend itself well to analogies with
channels or buttons. Buttons were invented for a physical world. Brain-
computer interfaces lend themselves to interacting with signals that are /not/
available in normal interaction (i.e. relevance, errors, intended movements,
etc.).

~~~
damoncali
Cool. How would such things work when it came to a prosthetic leg, for
example? Suppose you built a robotic leg with, say, 80 or 100 actuators all
working together. Could you train such a device to work on thought, mimicking
a real leg, or is that out of the scope of what you're talking about?

~~~
breuderink
Hmm, difficult question. In the US there are some groups doing very advanced
work with implants to restore limb or prosthesis control, and there are some
very impressive movies of monkeys controlling robotic arms. But invasive (i.e.
with implants) work is not really my thing. And in these studies, often the
monkey is the one doing the learning; it is not the device that adapts to the
user.

For non-invasive EEG (i.e. measured from outside the body) I think that is
still far off. The problem is that the signals are measured at a distance, and
that it is very hard to isolate signals from a precise region of the brain,
which is needed for accurate control.

I typically express the performance of these brain-computer interfaces in
bits/minute. A keyboard gets roughly 300 bits/min; brain-computer interfaces,
2-20 bits/min. I would not know the bandwidth (and latency) requirements for
reliable prosthesis control, but that would probably depend on the intended
use of the prosthesis. Then again, not all the actuators need to be controlled
individually; maybe it is feasible with a smart controller and a forgiving
application. And of course usability plays a major role; I cannot imagine
controlling a prosthesis using a keyboard, although the information throughput
might be sufficient :).
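For anyone curious where numbers like 2-20 bits/min come from: a common way to
convert classifier accuracy and trial rate into a bit rate is Wolpaw's
information-transfer-rate formula. A quick sketch (the example numbers are
illustrative, not from the comment above):

```python
import math

def itr_bits_per_min(n_classes, accuracy, trials_per_min):
    """Wolpaw information transfer rate for a BCI.

    n_classes:      number of distinguishable mental tasks
    accuracy:       probability the decoder picks the right class
    trials_per_min: how many classification decisions happen per minute
    """
    p, n = accuracy, n_classes
    bits_per_trial = math.log2(n)
    if 0 < p < 1:
        # Penalty for errors: wrong answers are assumed uniformly
        # spread over the remaining n - 1 classes.
        bits_per_trial += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits_per_trial * trials_per_min

# E.g. four mental tasks, 80% accuracy, one decision every 6 seconds:
rate = itr_bits_per_min(4, 0.8, 10)   # roughly 9.6 bits/min
```

That lands squarely in the quoted 2-20 bits/min range, and a perfect 2-class
decoder at 60 decisions/min would still only reach 60 bits/min, well short of
a keyboard.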

------
vishaldpatel
It would be interesting to learn how they account for: "No, not that way, I
was just kidding."

------
killing_time
Whatever you do, don't think about getting hit in the face by a helicopter...

~~~
blutack
It's a Parrot AR.Drone quadrotor. The frame is made of foam that encloses the
rotors, and the motors automatically cut off if they sense the blades are
obstructed.

Being hit by one is like being bumped with a bit of polystyrene.

~~~
CodeFoo
He was being humorous.

(HN is so pedantic)

