
One rat brain 'talks' to another using electronic link - scholia
http://www.bbc.co.uk/news/science-environment-21604005
======
creamyhorror
One possible path of development is toward the neural jack - basically a
standardized plug port, a USB interface for the brain. Here's a page
describing some Japanese research toward it:

http://www.virtualworldlets.net/Resources/Hosted/Resource.php?Name=FirstNeuroJack

I imagine the following: the jack outputs nerve impulses from the brain. Feed
these impulses into a robot's control interface/"API", and the user's brain
will be able to adaptively learn to control the various functions offered
(e.g. flick a finger to make the robot's hand perform the same action). Even
better, the robot uses a learning algorithm to associate the nerve data with
the correct action through feedback from the user, so that eventually the
user moving his hand causes the robot to do the same. Voila, exo-suits
controlled directly by the brain.
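A toy sketch of that feedback loop (the signal channels, action names, and learning rule here are all invented for illustration, not any real neuro-interface API): a simple weight-update learner that associates signal patterns with robot actions, reinforced by user confirmation.

```python
class SignalActionLearner:
    """Associate nerve-signal patterns with robot actions via feedback:
    when the user confirms an action, nudge that action's weights
    toward the signal pattern that produced it."""

    def __init__(self, actions, channels=4):
        self.actions = list(actions)
        # one weight vector per action, over the signal channels
        self.weights = {a: [0.0] * channels for a in self.actions}

    def guess(self, signal):
        # pick the action whose weights best match the signal (dot product)
        score = lambda a: sum(w * s for w, s in zip(self.weights[a], signal))
        return max(self.actions, key=score)

    def feedback(self, signal, correct_action, lr=0.5):
        # user confirms the intended action; reinforce it toward this signal
        for i, s in enumerate(signal):
            self.weights[correct_action][i] += lr * s

# toy "nerve signals": channel 0 active = flick finger, channel 2 = close hand
learner = SignalActionLearner(["flick_finger", "close_hand"])
for _ in range(20):
    learner.feedback([1.0, 0.1, 0.0, 0.0], "flick_finger")
    learner.feedback([0.0, 0.1, 1.0, 0.0], "close_hand")

print(learner.guess([0.9, 0.0, 0.1, 0.0]))  # flick_finger
```

The point is only that the robot's side can stay dumb and statistical: the brain adapts to the interface from one direction while the learner adapts from the other.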

If you can pass data back into the brain through the jack, you can return
tactile feedback from the robot arm, or even sensations that represent other
information, e.g. light or gas levels in the environment detected by
mechanical sensors. I doubt human brains can process information that doesn't
fit in our spectrum of senses, e.g. mathematical patterns, or abstract
conceptual knowledge like an understanding of democracy, but maybe they can
accept visualizations or memory vignettes generated by other users. Brains are
adaptive and can learn to process unfamiliar stimuli.

It's all far in the future and raises a lot of strange and worrisome
questions, but it's interesting to think about.

~~~
chatmasta
Actually, human brains have been shown to be able to recognize patterns
generated by non-standard sensors. Our brains seem to encode sensory data into
a standardized protocol before processing it. For instance, in one study
researchers encoded visual data from a camera onto a thin tape on the tongue
with a grid of pressure points. Subjects received visual data through their
taste buds and were able to correctly process it (i.e., catch a ball). I can't
find the exact study right now but it is out there.

This is partially the basis of "On Intelligence" by Jeff Hawkins, which led to
his founding of Numenta (disclosure: I worked there last summer).

~~~
saulrh
This was originally demonstrated by Professor Bach-y-Rita in his work on
neuroplasticity. Interestingly, things other than cameras work just as well;
the same kind of system has been used to help people with inner-ear damage
learn to balance, to give people built-in compasses, and to give people radar
and sonar "vision".

<http://en.wikipedia.org/wiki/Paul_Bach-y-Rita>
<http://en.wikipedia.org/wiki/Brainport>

------
nameless_noob
It's nice that the paper is freely accessible, to boot.

For convenience:
http://www.nature.com/srep/2013/130228/srep01319/full/srep01319.html

One observation the BBC didn't note is that the decoder brain ended up with a
representation of the encoder's whiskers.

To be a bit hyperbolic, the rat started to think it had two bodies. Or maybe
it thought it had an extra set of whiskers somewhere. I can't quite imagine
what the rat's body image would be.

~~~
stcredzero
Makes me think of Steven Wright's bit about the light switch in his house he
couldn't figure out the purpose for, so he flicks it on or off when he passes.
After a month, he gets a letter from a woman in Germany saying, "Cut that
out!"

------
sdoering
I'm so utterly torn on this. On one hand, this sounds like the first step
toward a Matrix-like learning tool - downloading information into the brain.
And that is, in a nerdy way, quite cool.

But on the other hand this totally creeps me out. It disturbs me that
something so possibly sinister, with such potential for misuse, is being
researched.

After reading this quote: "We will have a way to exchange information across
millions of people without using keyboards or voice recognition devices or the
type of interfaces that we normally use today," he said.

I just thought:

"We are the Borg. We will add your biological and technological
distinctiveness to our own. Resistance is futile."

~~~
ThomPete
I think you are forgetting the upside to this.

Imagine what it would do for people's empathy if they could feel each other's pain.

~~~
krapp
What you're describing also sounds like an effective psychological weapon. It
depends on the application but I can imagine (not condone, but imagine) this
being mandatory for, say, felony offenders in a prison setting. You've got
nothing to do but sit in a cell awash in a cold tide of human suffering.

Especially if this technology becomes advanced to the point that you don't
need the network directly, and you can just in essence record someone's
emotional state then broadcast it. Instant purgatory.

Then again, if you can feel someone else's pain, presumably you can feel their
pleasure too...

~~~
Valid
A prison where the prisoners constantly feel dread and suffering? Welcome to
Azkaban.

~~~
stcredzero
How about, "Welcome to Prison?"

------
ths
I wonder if this could have some great research applications. A year ago I
watched some lectures from Stanford on ethobiology (i.e. the branch dealing
with the biological processes underlying behavior) on YouTube by Robert
Sapolsky, and he talked a lot about the difficulty of figuring out what parts
of the brain (and which interplays of brain centers) are responsible for
behavioral patterns, especially when it comes to the more complex things. One
joke was something like: "You know that feeling when someone calls you and you
don't really want to talk, but feel uncomfortable with saying that actually
you're busy and would rather just read a book than talk? I think we've found
the brain center for that." We have learned a lot from what happens when
people have certain parts of their brains damaged. But maybe we could learn
much, much more about the brain by being able to fiddle around with many
different kinds of signals to different brain centers, and trying out
hypotheses by stimulating several centers simultaneously in order to produce
(or not produce) certain behaviors?

------
dchichkov
Reminds me of Kevin Warwick's project. As I recall, both he and his wife had
electrode arrays implanted in their arms (interfacing with the median nerve),
and they were able to perform simple communication - the first direct, purely
electronic communication between the nervous systems of two humans. It was
done in 2002.

------
jbattle
I'd be more impressed if they hadn't trained the second rat beforehand. The
second rat may be perceiving the stimulus as anything - a pressure in its
tail, who knows? The second rat is trained to press a button when it feels a
pressure in its tail.

I just don't see a clear path from this to true 'communication' - unless it
takes a form like morse code that one (human) mind taps out and the other
(human) mind consciously decodes. But I know nothing about this either ;)

~~~
jcarreiro
The "decoder" rat was not trained beforehand, unless I am badly misreading the
article. The training process that occurred used the real-time signal from the
"encoder" rat's brain. It simply took some time for the "decoder" to learn how
to process the signal sent by the other rat.

~~~
aidos
I think he means trained as in, was taught that by pushing one of the levers
it got food.

As you say, there was additional training to then handle the encoding /
decoding.

------
swalsh
This is just the first step in a potential future that could be huge. The
consequences could be so pervasive I'd almost question whether we could still
call ourselves human. Huge changes that could improve communication and
general intelligence. Think about how much time people waste simply trying to
communicate their mind's vision to each other. This could literally be the
most effective form of communication. Additionally, imagine if it's not a
person on the other end, but a computer with artificial intelligence. It
would literally be like having Google built into your brain. That's the
intelligence explosion. Of course, if only a select few people have it, the
digital divide would be huge.

~~~
krapp
If this becomes ubiquitous, then two of the most fundamental states of human
nature (we are each alone within ourselves and our minds are obliterated in
death) might be shifted. That would certainly mean redefining what 'human'
means.

Although this would also mean the transhumanists were right all along which
just annoys me.

------
mcantelon
Reminds me of DARPA using the subconscious human brain as a peripheral for
pattern recognition.

http://www.technologyreview.com/news/507826/sentry-system-combines-a-human-brain-with-computer-vision/

I wonder what human brain clusters will be used for? Stock market analysis?

------
meaty
I know this isn't Slashdot, but imagine a Beowulf cluster of them :)

~~~
tjr
Conjures up memories of the old hamster dance website...

~~~
stcredzero
If you wired up hamsters like that, you might be able to get them to dance in
unison. The gifs are preferable, but still execrable.

------
6ren
> a way to exchange information across millions of people without using
> keyboards or voice recognition devices

Brain-to-brain communication will be convenient, because we won't need
physical I/O devices. However, I wonder if there will be any advantage to
interfacing any deeper into the brain than those corresponding nerves... that
is, perhaps direct-brain interfaces will be pretty much like conventional I/O,
just without the physical I/O device (like the cochlear implants/bionic
eyes/bionic arms of today). So we can't improve _fundamentally_ over things
like Google Glass/MYO.

The reason I say this is because we already have brain-to-brain communication
in the form of language. This involves reducing our internal representation
(whatever it is) into a serial format that follows grammatical rules - not
necessarily "proper" grammar, but sufficient to enable the recipient, using
shared knowledge, to decode the meaning. This has striking parallels to
network communication: we have the internal representation of data structures
(e.g. objects), which are serialized to a low-level protocol (e.g. xml, json,
yaml, protobuf, asn.1) which follows a higher-level grammar (explicit like xsd
or implicit as in json). Well, not entirely surprising, since Chomsky's
grammar hierarchy was based on human language.
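That serialization parallel is easy to sketch (the "thought" structures below are invented purely for illustration): two minds store the same idea differently, and communicate only by round-tripping through a shared serial format.

```python
import json
from collections import namedtuple

# Mind B's internal representation differs from mind A's (a plain dict)
ThoughtB = namedtuple("ThoughtB", "concept intensity")

def speak(thought_a):
    # mind A serializes its representation to the shared linear format
    # ("language") - here JSON stands in for grammar
    return json.dumps(thought_a, sort_keys=True)

def listen(utterance):
    # mind B parses the shared format into its *own* representation
    fields = json.loads(utterance)
    return ThoughtB(fields["concept"], fields["intensity"])

idea = {"concept": "danger", "intensity": 0.8}
heard = listen(speak(idea))
print(heard)  # ThoughtB(concept='danger', intensity=0.8)
```

The open question in the comment is exactly whether a brain-to-brain link could skip the `speak`/`listen` round trip and share `idea` directly, or whether minds differ enough that some shared grammar always stays in the loop.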

 _The interesting question is whether brain-to-brain communication can be any
more "direct" than the serial representation (language) we already have..._
That is, are our internal representations similar enough that they can be
"directly" communicated; or are they so different between individuals that
converting between them will amount to going through a common grammar in order
to interface between them?

Favouring direct interface is the argument that serial representation was only
needed to communicate over the distance to another human - i.e. because we
didn't have direct brain-to-brain contact - and it is not needed apart from
that. Also in favour of direct communication is that if individuals are
reasoning about the same things in the same ways, our internal representations
must be equivalent in some sense. Other hidden aspects of ourselves also turn
out to have strong similarities (such as internal organs); and where they
vary, they often vary in strong groups (such as blood types), so perhaps our
hidden mental features will also be similar.

Against this is the observation that everyone's perspective is unique -
literally, where their eyes are - but also of course their figurative
perspective, resulting from cumulative experiences, genetic background,
internal thoughts and randomly different wiring/interpretations. Therefore, we
_don't_ have the same internal models. We don't even have different models of
the same reality, because we perceive reality differently - our models are of
a _different_ reality. And this diversity is one of the great creative
strengths of our collective intelligence, that different people have different
insights into the same problem. Another argument against this is that grammar
is not only used in communicating to others, but is also used to communicate
to ourselves - when we talk to ourselves, debate a course of action
internally, "marshal our thoughts" into words. There's even an argument that
the full power of human thought _requires_ language - it's not just a
serialization format, but is thought itself. By this view, even if we can
communicate directly, it would only be associations, senses, emotions,
recollections, etc. We couldn't communicate an _argument_, for example.

 _SUMMARY:_ our internal representations will differ at least to the extent
that we interpret things differently (i.e. that we live in different
realities). We will probably fall into groups, like blood types; also by
profession/training/personality/social group, i.e. between persons "on the
same wavelength". This is a limit on how "directly" we can communicate, i.e.
without requiring a shared intermediate general-purpose language.

 _tl;dr_ qualia

------
helloamar
A few days back I read an article about a flower communicating with a bee.
Now this. Mother Nature is awesome.

------
mrleinad
"Although the information was transmitted in real time, the learning process
was not instantaneous. [It] takes about 45 days of training an hour a day,
said Prof Nicolelis."

So much for Matrix learning style.

