
BrainNet: A Multi-Person Brain-To-Brain Interface for Direct Collaboration - legionof7
https://arxiv.org/abs/1809.08632v1
======
allenleein
I'm developing a theory about the brain's network called Ghosts in the Shell.

I'm trying to decentralize my brain into a network of brains.

Ghost in the Shell (Japanese: 攻殻機動隊, Kōkaku Kidōtai) is a 1989 Japanese media
franchise originally published as a seinen manga series of the same name
written and illustrated by Masamune Shirow.

In that post-cyberpunk iteration of a possible future, computer technology has
advanced to the point that many members of the public possess cyberbrains,
technology that allows them to interface their biological brain with various
networks. The level of cyberization varies from simple minimal interfaces to
almost complete replacement of the brain with cybernetic parts, in cases of
severe trauma. This can also be combined with various levels of prostheses,
with a fully prosthetic body enabling a person to become a cyborg.

We are still a long way from that kind of advanced individual tech in 2018,
though we are slowly becoming cyborgs with supercomputers in our pockets.
With the sheer amount of information available in the world today, we have
become overwhelmed.

Our minds are in a constant racing state.

Inspired by Ghost in the Shell, I believe the future of human thinking is to
decentralize ourselves into a thinking network like "GHOSTS in the Shell". I
think we should try to turn our brains into a composable, functional network
by starting to view everything as a function and iterating it like a machine,
together.

We need to elevate our minds in this digital age, not only as humans, but as
human + machine.

([https://github.com/allenleein/brains/wiki](https://github.com/allenleein/brains/wiki))

~~~
otakucode
Perhaps investigate the historical response of human societies to the
integration of technological advancement into arenas of life they see as
"fundamentally human". It's not a pretty history.

~~~
allenleein
> It's not a pretty history.

So we are here to change it to a pretty one.

------
Gys
So I wonder if some people are more compatible, more readable, more
connectable, better solvers, etc. than others. Then I can imagine an industry
around people who rent out their brains remotely to help solve problems. Kind
of an AirBnB for brains ;-)

~~~
v_lisivka
It's better to just connect to a computer and use it as additional long-term
memory and a quick problem solver. It's very helpful when you have a problem
too hard for your own brain to solve: you think about or do something else,
then have a ready-to-use solution in short-term memory - you just know it.
However, the intermediate steps from problem to solution are omitted, of
course, because they are performed by a separate thing.

It's good for quickly solving problems, but bad for learning. It's like using
a calculator for math: the calculator shows you the answer quickly, but you
learn nothing. Such an interface will allow people with average IQ to be used
effectively for problem solving, like calculators allow just about anybody to
do math easily, but it will not generate geniuses.

However, such direct interaction with a computer or another person can lead to
schizophrenia, so everything must be clearly marked: "here are my own
memories, here are memories implanted by the computer", which is not an easy
thing to do. Sometimes we cannot separate even our own dreams from our own
memories. If such a system is used in the wrong way, you will need _years_ to
recover, to forget badly implanted memories.

~~~
taneq
> Such an interface will allow people with average IQ to be used effectively
> for problem solving, like calculators allow just about anybody to do math
> easily, but it will not generate geniuses.

Motorbikes don't generate world class sprinters either but they sure get you
from A to B faster than walking.

------
ArtWomb
DARPA's Next-gen Non-surgical Neuro-tech (N3) program wants brain interfaces
for drone pilots

[https://spectrum.ieee.org/the-human-os/biomedical/bionics/darpa-wants-brain-interfaces-for-able-bodied-warfighters](https://spectrum.ieee.org/the-human-os/biomedical/bionics/darpa-wants-brain-interfaces-for-able-bodied-warfighters)

But I think one interesting peacetime application focuses on brain training
and deep meditation. One barrier to achieving trancelike states of flow and
concentration is overcoming the constant involuntary barrage of "noise".
Having a feedback loop may assist in getting there.

And in geriatric medicine, keeping the mind sharp is seen to improve longevity
and quality of life. Designing custom games to improve neuroplasticity in the
aging brain could allow for independence well into late retirement.

------
seiferteric
This is awesome, but am I wrong in thinking that EEG + TMS is a pretty low-
bandwidth channel? Not sure how useful this will ever be for "cooperative
problem solving by humans using a 'social network' of connected brains," as
the article mentions.

~~~
MrLeap
I'd be interested to know an estimated bitrate. Texting is a very low
bandwidth signal that has a lot of utility. If it's comparable to what a
person can do with two thumbs I can imagine some really awesome things I could
create with this tech.

I wish I could justify purchasing the hardware, or had an audience. Haha.
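A rough estimate is possible with the standard Wolpaw information-transfer-
rate formula from the BCI literature. The trial length and texting figures
below are assumptions of mine, and the ~81% binary accuracy is in the
ballpark the paper reports, so treat this as a sketch, not a measurement:

```python
from math import log2

def wolpaw_itr(n_targets: int, accuracy: float) -> float:
    """Wolpaw information transfer rate, in bits per selection."""
    if accuracy >= 1.0:
        return log2(n_targets)
    if accuracy <= 1.0 / n_targets:
        return 0.0  # at or below chance: no information transferred
    p = accuracy
    return (log2(n_targets)
            + p * log2(p)
            + (1 - p) * log2((1 - p) / (n_targets - 1)))

# Binary rotate / don't-rotate decision at ~81% accuracy, one decision
# per trial, assuming a ~20-second trial:
bits_per_decision = wolpaw_itr(2, 0.81)
bits_per_minute = bits_per_decision * (60 / 20)

# Texting comparison: ~20 words/min * 5 chars/word * ~1.5 bits of
# entropy per English character (all rough figures):
texting_bits_per_minute = 20 * 5 * 1.5
```

Under these assumptions the EEG+TMS link lands around one bit per minute,
two orders of magnitude below two-thumb texting, which supports the low-
bandwidth intuition in the parent comment.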

------
hour_glass
The TMS product they are using to stimulate the Receiver's brain is pretty
interesting.

[https://www.magstim.com/products/](https://www.magstim.com/products/)

I'm pretty skeptical that you could use magnetism to transfer any sort of
useful information into someone's brain. I'm guessing it just gives a
sensation, or maybe some vague visual effect since it is used on the occipital
lobe in this experiment.

~~~
amelius
Question: assume you have a closed hull around some domain D, and record the
electric and magnetic fields E and H at the entire hull; at a later moment you
prescribe E and H at the hull; will this setup regenerate the original values
of E and H _inside_ the domain D?

~~~
ckemere
To maybe short-circuit your idea: magnetoencephalography (MEG) uses magnetic
field sensors to sense brain activity. Coherent currents from ~100K neurons
produce femtotesla signals (about one billionth of the Earth's magnetic
field). This currently requires liquid-helium-cooled superconductors (though
there is some work on nanofabricated room-temperature ultraprecise field
sensors). To induce effects, TMS magnets produce reasonable fractions of a
tesla.
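To make that scale gap concrete, here is a back-of-envelope comparison. The
numbers are rough order-of-magnitude figures assumed for illustration, not
taken from any of the sources above:

```python
import math

# Rough order-of-magnitude field strengths (assumed, not measured):
meg_brain_field_t = 50e-15   # ~50 femtotesla: MEG-detectable brain signal
earth_field_t = 50e-6        # ~50 microtesla: Earth's magnetic field
tms_pulse_t = 0.5            # ~0.5 tesla: a representative TMS pulse

# The brain's own field sits ~9 orders of magnitude below Earth's field...
brain_vs_earth = math.log10(earth_field_t / meg_brain_field_t)

# ...and a TMS pulse sits ~13 orders of magnitude above the brain's field.
tms_vs_brain = math.log10(tms_pulse_t / meg_brain_field_t)
```

Reading and writing here differ by roughly thirteen orders of magnitude,
which is why sensing needs helium-cooled superconductors while stimulation
needs powerful pulsed coils.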

------
carapace
Okay, this is crazy cool, but _you don't need hardware to do this!_

Now that you know this is possible, go read about Charles Tart's mutual
hypnosis experiment:
[https://s3.amazonaws.com/cttart/articles/april2013articles/P...](https://s3.amazonaws.com/cttart/articles/april2013articles/Psychedelic+Experiences+Associated+with+a+Novel+Hypnosis+Procedure+Mutual+Hypnosis.pdf)

The fact that they were able to nail down this kind of communication using
the hardware and setup that they did is an existence proof that this sort of
communication and synchronization are possible (and not that difficult to
establish), however, we are already equipped with sophisticated sensory and
signalling systems, more than capable of supporting this sort of thing without
extra hardware.

~~~
Gys
One benefit of using hardware is not all persons have to be in the same
location: "The Senders' decisions are transmitted via the Internet to the
brain of a third subject, the "Receiver," who cannot see the game screen."

~~~
carapace
That's a good point, and it immediately makes me wonder what kind of e.g.
camera and monitor might be enough to enable remote synchronization between
brains. There are so many options. You could transmit signals carrying things
like pulse and breath rate as well as visual and audio.

My sense of urgency comes from my desire that folks reading this on HN realize
that the "magic sauce" is in the _brains_ not the comm channel.

I've been trying to get traction on this idea for years: if you want a mind-
machine interface you keep the hardware simple and let the _most advanced
processor_ do the tough part of the job. (In case it's not crystal clear, the
most advanced processor in ANY technological system is the one between your
ears, eh?) Metaphorically, the hardware should be seen as one of those cheap,
dumb modems that offload most of the work to the CPU on the motherboard.

With a little bit of hypnosis and the crudest of GSR (galvanic skin response)
sensors you can make a brain-machine interface at home that would blow your
mind (not literally.)

Anticipating the next question: "If it's so easy why isn't everyone doing it?"

Well, smart-alec, it's because it requires a major adjustment of one's self-
image, that most people can't handle or even imagine, so (for example) it
could never become a mass-market consumer product.

We have had the technology for decades. The limiting factor is belief. That's
why this "BrainNet" is exciting: not because they used such a crude
interconnect, but because they did it in a way that opens a wedge for the idea
that this stuff is possible at all. As soon as you grasp that, the doors fling
open to all sorts of fantastic possibilities.

(And pitfalls: you and your brain are not the same thing and will occasionally
find yourselves working at cross-purposes. At times like that you'll probably
NOT want your brain to be directly wired into your "smart" home _and_ have all
your passwords, eh? Like I said, this requires a major adjustment of one's
self-image.)

------
sgentle
Lightly edited extract from the paper, for anyone curious about the method:

BrainNet relies on two well-known technologies: Electroencephalography (EEG)
for non-invasively recording brain signals from the scalp, and transcranial
magnetic stimulation (TMS) for non-invasively stimulating the visual cortex.
The Senders convey their decisions of "rotate" or "do not rotate" by
controlling a horizontally moving cursor using steady-state visually-evoked
potentials (SSVEPs): to convey a "rotate" decision, Senders focused their
attention on a "Yes" LED light flashing at 17 Hz placed on one side of their
computer screen; to convey a "do not rotate" decision, they focused on the
“No” LED light flashing at 15 Hz placed on the other side.

The direction of movement of the cursor was determined by comparing the EEG
power at 17 Hz versus 15 Hz, with a higher power at 17 Hz over that at 15 Hz
moving the cursor towards the side near the “Yes” light, and vice versa for
the "No" light. A "rotate" ("do not rotate") decision was made when the cursor
hit the "YES" ("NO") side of the screen. In trials where the cursor did not
reach either side of the screen due to trial time elapsing, the decision
closest to the last location of the cursor was chosen as the subject’s
decision.

The decisions of the two Senders were sent to the Receiver’s computer through
a TCP/IP network and were further translated into single pulses of
transcranial magnetic stimulation (TMS) delivered to the occipital cortex of
the Receiver. The intensity of the stimulation was set above or below the
threshold at which the Receiver perceives a flash of light, known as a
phosphene: a “Yes” response was translated to an intensity above the
threshold, and “No” was translated to an intensity below the threshold. The
Receiver made his/her decision based on whether a phosphene was perceived and
this decision was conveyed to the game by the Receiver using the same SSVEP
procedure used by both Senders.
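The pipeline described above can be sketched in a few lines. This is a
minimal illustration under assumptions of my own (1-second analysis windows,
a plain FFT periodogram, illustrative intensity margins), not the authors'
actual signal-processing code:

```python
import numpy as np

def band_power(eeg: np.ndarray, fs: float, freq: float,
               width: float = 0.5) -> float:
    """Power within `width` Hz of `freq`, via a plain FFT periodogram."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    return float(spectrum[np.abs(freqs - freq) <= width].sum())

def sender_decision(eeg: np.ndarray, fs: float,
                    yes_hz: float = 17.0, no_hz: float = 15.0) -> str:
    """Integrate 17 Hz vs 15 Hz SSVEP power, window by window, into a
    cursor position, as in the Senders' task."""
    window = int(fs)                 # 1-second windows -> 1 Hz resolution
    n_windows = len(eeg) // window
    cursor = 0
    for i in range(n_windows):
        chunk = eeg[i * window:(i + 1) * window]
        more_yes = band_power(chunk, fs, yes_hz) > band_power(chunk, fs, no_hz)
        cursor += 1 if more_yes else -1
        if abs(cursor) >= max(1, n_windows // 2):  # cursor hit a screen edge
            break
    # if time ran out, take the side the cursor ended up closest to
    return "rotate" if cursor > 0 else "do not rotate"

def tms_intensity(decision: str, phosphene_threshold: float) -> float:
    """Map a Sender decision to a TMS intensity for the Receiver: above
    the phosphene threshold for "rotate", below it for "do not rotate".
    (The 20% margins are illustrative, not from the paper.)"""
    return phosphene_threshold * (1.2 if decision == "rotate" else 0.8)
```

Feeding the function a synthetic 17 Hz oscillation plus noise drives the
cursor to the "Yes" side; a 15 Hz oscillation drives it to "No". The real
system additionally has to deal with electrode montages, artifacts, and
per-subject calibration of the phosphene threshold.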

------
logronoide
Can’t wait to have one of these for our weekly code reviews.

------
r41nbowdash
>8-channel Cyton system

that's the interesting bit, because they got the results on a portable,
consumer-grade device

------
baldeagle
For an example of this in science fiction, check out the Nexus series by Ramez
Naam.

~~~
minkzilla
I've had that book sitting on my shelf for a couple years now. I read the
first couple of chapters, but the writing seemed poor and I couldn't really
get into it. Is it worth giving it another go?

~~~
baldeagle
Yes, very much so, if you want an action movie / philosophy story about
linking minds like boxes on the internet. I think it is some of the most
forward-leaning sci-fi I've read in a while, but it does have a strong
political bent to it.

------
moonbug
"speech"?

~~~
oelmekki
I don't know if it was intended to be tongue-in-cheek on your part, but this
is the stereotypical Dropbox-like HN comment:

"- We found a way to enable simple telepathy.

"- Not impressed. I can already communicate by speaking."

If it was indeed intended, thanks a lot, it made my day :)

~~~
red75prime
I am afraid this simple telepathy is of the same type of impressiveness as
SHRDLU (the AI program) or acoustic levitation.

It is impressive, but the underlying approach seems hard to scale in certain
directions (bandwidth, world-model complexity, payload mass). So no fast
progress should be expected.

------
aussieguy1234
You will be assimilated

