
Controlling robots with brainwaves and hand gestures - antigizmo
http://news.mit.edu/2018/how-to-control-robots-with-brainwaves-hand-gestures-mit-csail-0620
======
Hydraulix989
Brain-computer interfaces sound futuristic, but you're still pretty
limited to 1D or 2D control. I can see why they fall back to EMG.

Without something invasive like implanted trans-cranial electrodes, or
non-portable like fMRI, you're never going to get past unreliable 1D
control, given the noise you get from trying to capture highly attenuated
microvolt potentials through the scalp.
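For a sense of what that 1D control typically looks like in practice, here's a toy sketch (the sampling rate, band, and threshold are all made-up illustration values, not anything from a real rig): threshold the alpha-band power of one channel and you get a single up/down bit.

```python
import numpy as np

def alpha_band_power(eeg, fs=256.0):
    """Estimate 8-12 Hz (alpha) band power of one EEG channel window.

    eeg: 1-D array of samples in microvolts; fs: sampling rate in Hz.
    """
    eeg = np.asarray(eeg, dtype=float)
    eeg = eeg - eeg.mean()                      # drop the DC offset
    power = np.abs(np.fft.rfft(eeg)) ** 2 / len(eeg)
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    band = (freqs >= 8.0) & (freqs <= 12.0)
    return power[band].sum()

def paddle_command(eeg, fs=256.0, threshold=50.0):
    """Map band power to a single up/down bit -- that's the whole 1-D control."""
    return "up" if alpha_band_power(eeg, fs) > threshold else "down"
```

Everything upstream of that threshold (artifact rejection, per-user calibration) is where the unreliability lives.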

I built a brainwave-controlled Pong game using EEG for fun for my university
project back in 2012:

[https://www.youtube.com/watch?v=HtJhbqliJYY](https://www.youtube.com/watch?v=HtJhbqliJYY)

~~~
stinos
_Without something invasive like implanted trans-cranial electrodes or non-
portable like fMRI, you're never going to get anything past unreliable_

The time resolution of any existing fMRI implementation would be way too low
for driving anything mechanical reliably. Implanted electrodes, though, are
already being used (look up 'Obama handshake robot') and are being researched
further. Very, very interesting stuff. Also, depending on the implantation
location, one can theoretically get around the problem with EEG where people
first have to learn to control the robot by randomly trying thoughts to see
what they do (because EEG just picks up everything): e.g. implanting directly
in the higher-level motor action planning areas (well, if those are functional
in the patient) is a better starting point.

~~~
alexgmcm
The electrodes also have the benefit that you can potentially stimulate the
neurones as well, creating a full feedback loop.

It would be interesting to see if the brain could adapt to a new input and
learn how to interpret it.

Rat experiments seem promising: [http://www.sciencemag.org/news/2015/10/brain-implant-lets-ra...](http://www.sciencemag.org/news/2015/10/brain-implant-lets-rats-see-infrared-light)

~~~
YvetteBrooks
The delay between reading every neuron and actually performing the task will
be the problem to solve before this opens up a totally new and fast-growing
sphere of science.

~~~
stinos
Imo there are more pertinent problems than delay. Once delays are sufficiently
small, say <10 ms between the neural signal and the initiation of the robot's
movement, which is already achievable now, you get something that is really
usable and feels close enough to actually making a decision and moving your
own hand, for instance.

On the other hand, there's the longevity of the electrodes, making the
hardware smaller, etc. And then, probably the biggest thing for now, the
software side: developing (possibly self-learning) algorithms that accurately
decode the signals (both single-neuron spikes and the low-frequency spectrum
of the surrounding area) is just hard. Also, because a lot of things about the
brain are in essence still rather unknown, it's sometimes a bit like
programming in the dark.
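To illustrate the shape of the decoding problem (this is a toy with synthetic data, not how production decoders like Kalman-filter-based ones actually work): fit a linear map from binned spike counts to 2-D movement velocity.

```python
import numpy as np

# Toy linear decoder: binned spike counts -> 2-D hand velocity.
# All data here is synthetic; shapes and rates are arbitrary illustration values.
rng = np.random.default_rng(42)
n_bins, n_neurons = 500, 32

true_weights = rng.standard_normal((n_neurons, 2))          # hidden "tuning"
spikes = rng.poisson(lam=5.0, size=(n_bins, n_neurons)).astype(float)
velocity = spikes @ true_weights + 0.1 * rng.standard_normal((n_bins, 2))

# Fit the decoding weights by least squares on a training block ...
W, *_ = np.linalg.lstsq(spikes[:400], velocity[:400], rcond=None)
# ... then decode the held-out block.
decoded = spikes[400:] @ W
```

The hard part in reality is that the "true weights" drift, neurons drop out, and the relationship is nowhere near this linear or this stationary, which is exactly the programming-in-the-dark bit.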

------
aurelien
France did this before 8-P [https://www.industrie-techno.com/portraits-de-jeunes-innovat...](https://www.industrie-techno.com/portraits-de-jeunes-innovateurs-elle-controle-des-robots-par-la-pensee.50413)

