
AI-controlled brain implants for mood disorders tested in people - doener
http://www.nature.com/news/ai-controlled-brain-implants-for-mood-disorders-tested-in-people-1.23031
======
Veedrac
> One challenge with stimulating areas of the brain associated with mood, he
> says, is the possibility of overcorrecting emotions to create extreme
> happiness that overwhelms all other feelings.

Terrifyingly, this risk does not seem far off from the wireheading described in
gwern's article, Terrorism Is Not Effective.

> I have just laid out a scheme whereby agents extraordinary only in
> dedication have exerted world-shaking power. Similar scenarios are true of
> other sectors. (The Secret Service works hard, but can they protect the
> President against the 100 fanatics?) Destruction and offense is always
> easier than construction and defense, but it’s hard to see why the fanatic
> advantage would be completely negated in constructive enterprises. (Small
> groups of programmers and engineers routinely revolutionize sectors of
> technology, without being especially fanatical.) But of course, we see very
> few such schemes in either direction. That is the point. There is a very
> large gap between what we can do and what we will do. Coordination is
> extremely hard (see again the principal-agent problem).

> But the scary thought is - will things remain that way? I have been at pains
> to keep the agents ordinary. Is there any way now or in the future to create
> such agents? [...]

> In short, is there any reason to believe wireheading will not work in humans
> like it works in mice? [...] That is one scenario. Here is another: the
> electrode is under the control of a program connected to metrics chosen by
> the subject, like going to the gym. (Related topic: nicotine & habit-
> formation.) The incentives are much more closely aligned: the subject could
> gain control of the stimulation, but that would frustrate another goal of
> his (going to the gym). Imagine the program hooked up to a comprehensive
> plan for attacking Goldman Sachs; one rather doubts that an agent will break
> the plan and not eat bulgur pilaf if that means he is simultaneously
> sabotaging the plan and also depriving himself of pleasure.

[http://www.gwern.net/Terrorism-is-not-Effective](http://www.gwern.net/Terrorism-is-not-Effective)

~~~
fao_
Off topic:

> (Small groups of programmers and engineers routinely revolutionize sectors
> of technology, without being especially fanatical.)

In my opinion, that's because of the current state of software as a field. In
the future, as the field grows and the distance between subfields grows, it
will become more and more difficult to get things done.

You could argue that abstraction will offset this difference, but I doubt it,
as you generally need people who are invested in those layers of abstraction
to fix bugs. Occasional bugs will be fixable by the small team, but we're
already outsourcing Big Bugs to other, more specialized teams, even if we
don't realize it (using libraries is just outsourcing work to other 'research
teams'). But then the question becomes: if a bug fix for team X relies on
seven different teams unconnected to the project team X is working on, does it
still count as a 'small group'? At what point do you account for the
critical-yet-unconnected effort of other teams?

------
aptwebapps
Reminds me of
[https://en.wikipedia.org/wiki/The_Terminal_Man](https://en.wikipedia.org/wiki/The_Terminal_Man)

~~~
YouAreGreat
Also
[https://en.wikipedia.org/wiki/A_Deepness_in_the_Sky](https://en.wikipedia.org/wiki/A_Deepness_in_the_Sky)

> when a person began failing or slowing at a set task ... they were able to
> reverse it

Vernor Vinge anticipated this treatment and called it "focus".

------
unexistance
I've always thought that whenever something DARPA-funded sounds good, there's
a catch, like "yeah, let's publish the good-natured stuff... " and in this
case, "make sure the implants can be used in other parts of the brain"

Or am I just delusional? Or is it just the nature of tools?

~~~
zitterbewegung
Yes, you are delusional. How do you define new technology that is "good
natured"?

The reason DARPA is studying this, it seems, is to reduce its casualty rate by
researching new ways to treat mental illness. If you can do that, you will
spend less and also keep people alive. A lot of medical research has come from
this motivation.

ARPA created the prototype of the internet (ARPANET). The catch of the
internet is that it allowed people to disseminate information faster than
anything before. You get the good (advancement of science) and the bad
(misleading people).

DARPA is in the business of moonshots. They are all about high risk / high
reward.

It's more likely that the stuff you don't hear about didn't work in the first place.

~~~
unexistance
I had exactly ARPANET in mind when I said "Or is it just the nature of tools?"
:D

I should've worded it better: good-natured here means good intentions,
alongside which there will be bad intentions that we will probably NEVER see
(C&D, military secrets, etc...)

In this case, what's stopping the scientists and/or funder from
misusing/weaponizing this implant?

p/s: guess I'm biased when it's DARPA, since it's the US military complex...
but hey, I'm enjoying the Internet :D

~~~
zitterbewegung
For one thing, how would you weaponize this? Do you think that scientists are
robots and don't have their own morality?

You still haven't told me how to design a technology to be "good intentioned".
Technology has no intention; it's the people who use it that do.

It's really disturbing that you automatically assume that technology is evil
because of who created it, and that you have a prejudice because it is DARPA.

~~~
erikpukinskis
> You still haven't told me how to design a technology to be "good
> intentioned".

Design.

Design simply encodes the values of the designer in the physical technology. A
virtuous designer will design something which makes good things easy. A
nihilistic designer will design something which makes their life easy. It’s
unavoidable.

That’s why so much software these days is harmful: the tech world is largely
populated with supremacists and nihilists.

(I don’t mean that in any particularly judgemental way. Supremacism is the
norm in the wealthier slices of American culture, so it’s unremarkable that
Silicon Valley would end up reflecting that. And the competitive nature of
funding and hiring self-selects for supremacists as well. And for those who
don’t tend towards supremacism, nihilism is one of the few ways to stay sane
amongst supremacists.)

------
zghst
Just imagine a future where this technology is stolen and used to control
dissent...

Coming to a future dystopia near you

~~~
amiga-workbench
Imagine a future where this technology is proprietary.

~~~
mikro2nd
Imagine a future where the technology triggers good feelings in you when you
view certain products or depictions thereof.

------
YouAreGreat
The logical endpoint of the "you don't really own your
smartphone/notebook/computer/IoT-device/etc" thread running through much of HN
discussion:

> One challenge with stimulating areas of the brain associated with mood, he
> says, is the possibility of ... extreme happiness

Extreme happiness you'd be able to feel if _you_ were in control of your "own"
implants which in turn are in control of your "own" brain.

Needless to say, you wouldn't be.

------
reubeniv
This story is ripe for conspiracy theories

[http://edition.cnn.com/2013/03/16/health/mental-illness-overdiagnosis/index.html](http://edition.cnn.com/2013/03/16/health/mental-illness-overdiagnosis/index.html)

------
novaleaf
Courtesy of Black Mirror:
[https://en.wikipedia.org/wiki/Men_Against_Fire](https://en.wikipedia.org/wiki/Men_Against_Fire)

------
amelius
This makes sense. The human body consists of many biological control systems,
and if a few of them are out of order, the human becomes ill. So add an
artificial control system and you might have a fix.
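The "artificial control system" idea above is essentially a closed feedback loop, which is also how the article describes the implant (measure a signal, stimulate in response). A minimal sketch, assuming a hypothetical mood biomarker and a simple proportional controller; all names and numbers here are illustrative, not from the article:

```python
# Hypothetical closed-loop sketch: read a mood-related biomarker,
# compare it against a target range, and nudge the stimulation
# amplitude in proportion to the error. Everything here is an
# illustrative assumption, not the actual device's algorithm.

def update_stimulation(biomarker, target, amplitude, gain=0.1,
                       min_amp=0.0, max_amp=1.0):
    """One step of a proportional controller: move the stimulation
    amplitude toward reducing the gap between biomarker and target."""
    error = target - biomarker
    amplitude += gain * error
    # Safety clamp: never drive the device outside its allowed range,
    # which is one guard against the "overcorrection" risk quoted above.
    return max(min_amp, min(max_amp, amplitude))

# Example: biomarker below target, so amplitude increases slightly.
amp = update_stimulation(biomarker=0.3, target=0.5, amplitude=0.2)
```

The clamp is the interesting part in this context: a pure proportional loop with no bounds is exactly the kind of system that could "overcorrect" mood, as the article warns.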

However, as humans evolve, we may become dependent on this type of technology.
That's what worries me. On the other hand, patents will expire within one or
two generations.

------
dkersten
How long do they last?

It was my understanding that a big problem with brain implants (both probes
and stimulators) was that after some time the body rejects them. Is that true
here too and they have limited lifetimes, or have they got around that somehow
(or is it otherwise a non-issue for whatever reason)?

------
silverlight
Reminds me of The Terminal Man by Michael Crichton. I just re-read it the
other day, and although it’s fiction and overblown, it is crazy to see the
future catch up to what was science fiction 30 years ago.

------
GarvielLoken
No

------
snowpanda
What could possibly go wrong? It's just your brain.

------
timthelion
Imagine a future in which every North Korean has this tech and is actually,
truly happier in NK than in the West. And NK starts sending these people out
to actually "follow the general's star and bring the Juche idea to the world."
Now we bank on NK not wanting their countrymen to discover how much more
technologically advanced the West is.

~~~
jdavis703
I'm not sure happiness is that simple. It's "easy" to fix depression in
someone who is well fed, in good physical health, and not overly concerned
about their physical safety (I'm not trying to minimize depression here, just
pointing out that its problems are relative). However, DPRK soldiers
apparently aren't fed that well, lack access to good healthcare, and above all
they'd be fighting on a battlefield, seeing death around them and in constant
fear for their lives. It might be possible to amp up the soldiers, but the
Germans also drugged their troops, and even military leadership, during WWII,
which didn't work out so well for them in the medium term.

~~~
timthelion
Why do you say it didn't work out well for the Germans?

~~~
jdavis703
Because they lost WW II and had their country occupied for decades by
Americans and Russians, both of whom many Germans resented (at least according
to my parents who were part of the occupying force).

~~~
timthelion
But did they lose because of the drugs, or because they were massively
outnumbered, made ridiculously stupid mistakes on the eastern front, ran out
of fuel, and didn't have nukes in time?

