
Scientists Watch Thoughts Form in the Brain - Cozumel
http://www.scientificamerican.com/article/mind-aglow-scientists-watch-thoughts-form-in-the-brain/
======
bsbechtel
For all the hype surrounding AI, virtual reality, and augmented reality as
the next big trends in tech, I really feel that brain/computer interfaces are
what is going to change our world in unprecedented ways. The technology is
still a long, long way from having a visible impact for most people, but I am
more interested/excited/scared about this technology than anything else going
on in the tech world right now.

~~~
ericjang
There are some very difficult challenges in brain-computer interfaces (BCI),
some of which involve huge barriers presented by human biology. It's rather
depressing.

Presently, there is no good way to read data from the brain into a device. You
can drill a hole in your head and patch-clamp some electrodes to your brain
tissue, but this procedure is only recommended for treating debilitating
diseases like Parkinson's. Furthermore, there is the issue of scar tissue
developing over time, as a small region of your brain is bombarded with
electricity and kept in contact with foreign metals that trigger an immune
response.

I do think a spinal tap of some sort might make more sense (rather than
patching directly into brain tissue like in the Matrix), but you still get
scar tissue.

Non-invasive methods don't cause scar tissue, but it's unclear whether
measuring the surface activity of the brain yields sufficient information.
Furthermore, they are highly noisy due to ambient static and muscle activity;
you shouldn't move much if you want OK readings.
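
To make the noise problem concrete, here is a rough sketch in Python of the
kind of cleanup any non-invasive pipeline does before the signal is usable
(the sampling rate and cutoff frequencies are illustrative assumptions on my
part, not numbers from any particular headset): notch out mains hum, then
band-pass away slow drift and most muscle-frequency activity.

    import numpy as np
    from scipy.signal import butter, filtfilt, iirnotch

    FS = 256.0  # assumed sampling rate (Hz) for a hypothetical headset

    def clean_eeg(raw):
        # Notch out 60 Hz mains hum (the "ambient static").
        b, a = iirnotch(w0=60.0, Q=30.0, fs=FS)
        x = filtfilt(b, a, raw)
        # Band-pass 1-40 Hz: drop slow drift and most muscle (EMG) activity.
        b, a = butter(4, [1.0, 40.0], btype="band", fs=FS)
        return filtfilt(b, a, x)

    # One second of fake "EEG": a 10 Hz rhythm buried in hum and noise.
    t = np.arange(0, 1, 1 / FS)
    raw = (np.sin(2 * np.pi * 10 * t)
           + 2 * np.sin(2 * np.pi * 60 * t)
           + 0.5 * np.random.randn(t.size))
    cleaned = clean_eeg(raw)

None of this helps with the deeper problem, which is that scalp-level signals
may simply not carry enough information in the first place.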

The BCI dream is to have a lightweight headset that you put behind your ears
like in Iron Man 3 or Big Hero 6 (and then you control robots with your mere
thoughts). However, I think that kind of technology is _really_ far away,
short of some huge breakthroughs in compressed sensing and possibly
superconductor physics.

Even more difficult than reading data is writing it back in. There have been
very early results with getting monkeys to telepathically control each other's
muscle movements, but my understanding is that it's not very reliable.

A reasonable survey can be found here:
[http://www.sciencedirect.com/science/article/pii/S1110866515...](http://www.sciencedirect.com/science/article/pii/S1110866515000237)

~~~
aperrien
Optogenetics possibly provides a solution to the interface problem. When you
can use light to read from and send signals to neurons, you don't have to
worry about scar tissue at all, as far as I know.

~~~
ericjang
I don't think optogenetics is feasible for human BCI. Optogenetics requires
one to express channelrhodopsin proteins in the neurons of interest, which
means you would need to genetically modify the human at birth. We know how to
do this in mice, but not in humans, since the gene expression pathways for
neural proteins in humans are poorly understood (unlike in mice and
zebrafish). Even in optimal designer-baby scenarios, there's no hope for us
adults; gene-editing therapy for adults is far harder, and we'll likely be
long dead before we see anything close.

~~~
aperrien
CRISPR techniques allow us to modify gene expression in organisms long after
birth; they have been heavily used in adult animals, but I don't think they've
been used in humans just yet.

~~~
ericjang
CRISPR is cool, but isn't the challenge of adult genome editing bottlenecked
by the delivery vehicle? Last time I checked, it's still hard to deliver viral
vectors past the blood-brain barrier. Does CRISPR have a unique solution to
that? I would also imagine that a "BCI" optogenetic adapter would be so
massive that payload size alone (never mind its electrical properties) would
keep it from ever crossing the BBB.

------
danielmorozoff
Nature write-up: [http://www.nature.com/news/brain-s-chemical-signals-seen-in-...](http://www.nature.com/news/brain-s-chemical-signals-seen-in-real-time-1.20458)

And the paper, without paywall, from Kleinfeld's lab:
[http://physics.ucsd.edu/neurophysics/publications/nmeth.3151...](http://physics.ucsd.edu/neurophysics/publications/nmeth.3151.pdf)

------
edmundhuber
To add to ericjang's comments elsewhere in this thread, here are my thoughts
on the two most popular BCI (although I'm used to seeing BMI -- brain-machine
interface) technologies:

* Optogenetics. Advantages: you don't need to insert things into tissue. Disadvantages: requires genetic engineering that is not approved for human use; difficulty reaching deep brain targets; to image many neurons you might need to pump lots of light into an area, causing heating and cell death; bulky.

* Microelectrode arrays (MEAs, [https://en.wikipedia.org/wiki/Multielectrode_array](https://en.wikipedia.org/wiki/Multielectrode_array)) like the Utah or Michigan arrays. Advantages: direct electrical recordings, relatively compact. Disadvantages: low density, invasiveness.

It is worth noting that most behavioral studies are done with MEAs due to
their convenience/form factor.

It is also important to note that these two technologies measure different
things: an MEA records extracellular voltages, a relatively direct measurement
of neural activity, whereas optogenetics relies on ion channels fluorescing,
which is not exactly what most people really want to measure.
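
To give a feel for how direct that electrical measurement is, here's a toy
sketch (Python; the sampling rate and threshold factor are my own illustrative
assumptions, not the spec of any real array) of the classic way one channel of
extracellular voltage gets turned into spike times, via threshold crossings
against a robust noise estimate:

    import numpy as np

    FS = 30_000  # assumed sampling rate (Hz); extracellular rigs run this fast

    def detect_spikes(voltage, k=4.5):
        # Robust noise estimate: median absolute deviation scaled to std.
        noise = np.median(np.abs(voltage)) / 0.6745
        # Putative spikes: negative-going crossings of -k * noise.
        below = voltage < -k * noise
        onsets = np.flatnonzero(np.diff(below.astype(np.int8)) == 1) + 1
        return onsets / FS  # spike times in seconds

Real pipelines then sort those crossings into per-neuron units by waveform
shape, which is where the low channel density mentioned above starts to hurt.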

A word about invasiveness: when you insert large (>1 µm) things into tissue,
you elicit a foreign body response, starting with inflammation and ending with
scar tissue or gliosis. The issue with invasive probes like MEAs is that even
if you manage to avoid major blood vessels and you control bleeding and
inflammation, ultimately scar tissue forms around your probes. Scar tissue is
a much worse electrical interface than plain tissue, and thus the
effectiveness of your BCI/BMI implant diminishes.

Disclosure: I run a company (Paradromics, www.paradromics.com) developing
next-generation microwire technology for brain-machine interfaces. Also, my
strength is in software so while the gist of what I'm saying is probably
mostly correct, I can't give details as well as others can.

~~~
djoshea
Just to add a couple of clarifying points: the term optogenetics generally
refers to any gene transfection that involves optical access, but in practice
there are two separate categories:

* optogenetic stimulation: using light-activated ion channels to modulate neural activity

* optical imaging: using genetically targeted fluorescent proteins (mainly GCaMP) to observe the activity of neurons

This article is more about the second kind, and the novel aspect is that this
technology couples dopamine release to calcium rises, enabling imaging of the
calcium signal as a proxy for the neurotransmitter's release. It's more likely
to be quite useful for investigating specific scientific questions about
neurotransmitter signaling than as a general-purpose readout for BMI.

Multielectrode arrays pick up spikes or action potentials from individual
neurons directly, so you get high temporal resolution but from a sparse sample
of cells. This sample turns out to be enough to do a lot of interesting
things, such as controlling a mouse cursor or a robotic arm. This is already
in clinical trials at a few sites, including (my lab at) Stanford.
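
The decoding step behind those cursor demos is conceptually simple. Here is a
toy sketch (Python/numpy; the neuron counts, tuning model, and data are all
made up for illustration, not any lab's actual pipeline) that fits a linear
map from binned firing rates to cursor velocity, in the spirit of classic
linear-filter decoders:

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: 100 neurons, 2000 time bins, each neuron linearly tuned to
    # a random "preferred direction" in 2-D cursor-velocity space.
    n_neurons, n_bins = 100, 2000
    velocity = rng.standard_normal((n_bins, 2))    # true 2-D velocities
    tuning = rng.standard_normal((2, n_neurons))   # preferred directions
    rates = velocity @ tuning + 0.5 * rng.standard_normal((n_bins, n_neurons))

    # Fit a linear decoder by least squares: firing rates -> velocity.
    W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

    # Decode; with this toy tuning the reconstruction is near-perfect.
    decoded = rates @ W
    print(np.corrcoef(decoded[:, 0], velocity[:, 0])[0, 1])

The hard parts in practice are the sparse neural sample and keeping the signal
stable as tissue responds to the implant, not the linear algebra.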

Imaging approaches are really powerful, but the signals are often slower. This
has to do with the kinetics of the proteins themselves and the signal that is
being detected (calcium is slower than voltage transients). But you get to see
the activity of lots of neurons. There are voltage-sensitive channels that you
can image, although the signal-to-noise ratio isn't nearly as high yet.
It's not immediately clear how this could be used for a BMI in humans, mainly
because you would absolutely need optical access to the brain to image, so
you'd either be opening up a window or implanting some kind of imaging sensor.
The less invasive approaches you were hinting at are mostly in the first
category (stimulation), where for longer wavelengths of light you wouldn't
need something as invasive.

------
nacc
Just to clarify, optically recording neural activity from ensembles of cells
("thoughts") has been done in neuroscience for many years using voltage /
glutamate / calcium indicators plus imaging. CNiFERs are a breakthrough
because they provide a way to measure the brain's "hormones": neuromodulators,
often present in tiny amounts, that modulate how the cells in an area fire.

A not-so-good analogy: before, we could measure the electrical current in
wires; now CNiFERs let us measure the electromagnetic field around them. That
would help explain why nearby headphones beep when your cellphone receives a
message.

------
mhneu
Calcium imaging is currently a better method for real-time imaging of neural
activity.

[http://www.nature.com/nmeth/journal/v6/n12/full/nmeth.1398.h...](http://www.nature.com/nmeth/journal/v6/n12/full/nmeth.1398.html)

The CNiFER innovation is the ability to visualize neurotransmitter activity.

~~~
kevinalexbrown
strictly speaking, this is calcium imaging

------
mhuffman
I expect this to be streamlined and used by law enforcement as an alternative
to polygraphs.

~~~
cubano
There seems to be a huge push against using _any_ type of neuro-tech to build
"ultimate" lie detectors, which I often find puzzling.

~~~
tim333
Lie detectors do not have a good track record of making things better. If you
want to catch criminals, there are many other ways. If there were a murder
near you, would you want people trying neuro-tech on you?

------
JackFr
The technique described here -- that of the CNiFERs -- is impressive. But the
hype is way over the top.

> When many [neurons] fire together, they form a thought.

Is that a definition, an observation, a hypothesis or a result?

~~~
aperrien
I think that depends on where you believe thoughts originate. If they
originate within the brain, then an ensemble firing together (in some sort of
organized manner) is a thought. If you believe they don't originate within the
brain, then the answer is unclear. More of a philosophy problem at that point,
I think.

~~~
habitue
If you don't believe thoughts arise in the brain, then you're ignoring
thousand-year-old results in basic physiology.

------
_Wintermute
Article on the CNiFER method:
[http://www.nature.com/neuro/journal/v13/n1/full/nn.2469.html](http://www.nature.com/neuro/journal/v13/n1/full/nn.2469.html)

------
dschiptsov
I think it should be called "digital hallucinations": non-reproducible (but
not totally random either) visualizations, renderings of data gained from a
device built according to biased and grossly oversimplified models. It might
correlate strongly with micro-changes in blood flow or biochemistry, etc., but
it obviously has nothing to do with thoughts.

~~~
Balgair
Did you read the same paper?

~~~
dschiptsov
I have some rather superficial knowledge of _how_ state-of-the-art brain
imaging works, from a couple of free MIT courses.

------
astazangasta
I've never understood how these guys managed to get a billion dollars to fund
this horseshit Brain Initiative at a time when NIH funding has been declining
for decades. There is probably some serious graft/nepotism behind this.

------
importantbrian
This makes me think of John Scalzi's The Ghost Brigades. I wonder how far it
is from mapping thoughts to being able to map someone's memories and
consciousness, if such a thing is even possible at all.

~~~
wang_li
In current MRI images of the brain, a million different data points are
reduced to a single pixel in the image. Absent some kind of dramatic
breakthrough in science, we are a long way from being able to resolve
individual data points inside a person's brain.
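
A rough back-of-envelope supports that figure (both numbers below are
textbook-order-of-magnitude assumptions on my part, not measured values):

    # Neurons averaged into a single fMRI voxel, order of magnitude only.
    voxel_mm = 3.0            # assumed voxel edge length (mm)
    neurons_per_mm3 = 40_000  # assumed rough human cortical density
    print(f"{voxel_mm ** 3 * neurons_per_mm3:,.0f} neurons per voxel")
    # -> 1,080,000: about a million neurons behind one pixel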

------
rasz_pl
Did they test this method on a dead fish just to be sure?

------
ommunist
I wonder how they distinguish thoughts from feelings.

------
bantunes
I'm scared. The Valley is gonna "disrupt" stuff in a lot of nasty ways armed
with a more developed version of this...

------
saintPirelli
While reading I was distracted by that awesome scrolly thingy at the bottom.

