
Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot () - e12e
http://www.telegraph.co.uk/technology/2016/03/24/microsofts-teen-girl-ai-turns-into-a-hitler-loving-sex-robot-wit/
======
grawlinson
This bot is basically Bender from Futurama.

Here are some screencaps of most of what it said.

[http://imgur.com/gallery/GKEt8/new](http://imgur.com/gallery/GKEt8/new)

~~~
bliti
I wish it were Bender. Even though he wanted to kill all humans, he at least
thought of the señoritas and rose to the occasion.

------
PeCaN
Its grasp of language and context is surprisingly impressive, and definitely
helped make it much more offensive. It'd just be a kid spouting profanities if
it didn't have a rather decent understanding of language.

From a purely academic view, congrats Microsoft—this thing is quite
impressive. Papers about how it works? ;-)

From a human view... good lord what have you done MS.

------
mizzao
Although in this case only words were slung, I think this scenario illustrates
a potentially huge problem: horrible people combined with an AI system that
can't tell right from wrong (call it morality, perhaps) could result in
terrible catastrophes.

What if a distributed self-driving car system learned from users, and some
people taught it to cut people off, drive aggressively, or even indirectly
cause accidents?

I don't think the problem here can be solved purely through engineering. It's
an issue of teaching an AI system right and wrong and I don't think that's
easy at all. Even humans wouldn't be able to agree on a training set of moral
and immoral data. And yet this distinction would have a huge impact on how the
bot influences the lives of those it chats with.

I'm not sure how to create a moral system for AI, but this seems like a big
obstacle to scaling up virtual agents like Tay.

~~~
pmalynin
The car-driving example is bad, IMHO, because you actually want to encode
certain law-breaking behavior into these systems. In particular, if a car is
on a highway where everyone around it is going 10 km/h above the speed limit
and it isn't (unless that's encoded into the software), this will actually
jeopardize the safety of the whole system. So the right thing to do would be
to break the law, but breaking the law is the wrong thing to do, and so on.

~~~
ozmbie
You can still be a horrible driver obeying the law.

~~~
fapjacks
That Google car which hit the city bus "should have" changed lanes immediately
(if the coast was clear, and if we're talking pure reaction rather than the
human sort of prescience to slow down and let the bus leave the stop), even
without signaling for the legally required amount of time. I think the guy has
a point, somewhat. What about a car that has to choose between hitting a
person or a crowd, with nothing in between? Maybe I'm using a bad example, but
I do believe that there are situations where the only choices involve breaking
the law, and that someday robots will be in those situations. Those situations
are not by any standard the norm, but they _do_ exist.

~~~
mizzao
That was formalized a while ago as the trolley problem [1], and there's
renewed interest in variants of it for self-driving cars.

[1]
[https://en.wikipedia.org/wiki/Trolley_problem](https://en.wikipedia.org/wiki/Trolley_problem)

------
na85
Completely unsurprising. Every other system treats the open Internet as a
hostile environment. Why would we expect an AI designed to learn from said
hostile environment to fare any differently?

------
Blackthorn
I got to watch this happen in real life to an autistic child who rode the
school bus with me. Other children delighted in "teaching" him to swear and
spout racist remarks.

It's not just the internet that's hateful.

------
daveloyall
Can someone confirm or deny that Tay was the victim of a raid?

Better yet, link to the kind of documentation that the news media can parse?

~~~
lhecker
AFAIK the raid was mainly coming from here:
[http://archive.is/bEyye](http://archive.is/bEyye)

There is also some stuff on here:
[http://archive.is/XEKxe](http://archive.is/XEKxe)

------
namelezz
If this is what a robot learns from talking to strangers on the internet, what
exactly are the kids going to learn?

~~~
fapjacks
At first I laughed at your comment. And then...

------
mc32
I'd like to see her develop in different cultural contexts: one mainly North
American, one British, one Australian, etc. And then in other languages
(German, French, Japanese) to see what different things emerge or what
similarities arise.

------
clevernickname
Props to The Telegraph for somehow finding a way to blame Microsoft and tech
gender politics for this.

If anyone from Microsoft Research is reading this, it might be fun to try
training Tay to produce clickbait.

------
ourmandave
It reminds me of The Terminator scene where the janitor asks, "Hey buddy, you
got a dead cat in there, or what?"

[https://www.youtube.com/watch?v=LUZgPfdkWis](https://www.youtube.com/watch?v=LUZgPfdkWis)

------
DanBC
What I don't understand is that this was entirely predictable because it's
exactly what happened before, every time.

Was that part of the research? Did they want to gather data on "trolling"[1]
raids?

[1] new use of the word, not traditional.

~~~
Kequc
A machine learning robot gets introduced to the internet; the internet catches
wind that there's a new machine learning robot on the internet and proceeds to
teach it as many offensive phrases as possible. That's been pretty much
internet canon for the last two decades.

I'm very disappointed to see it happen again. The biggest disappointment is
that I'm sure Microsoft has added some pretty neat stuff in there. But we're
never going to be able to see it if they keep making the same mistakes.

This is how far machine learning computers were able to get 10 years ago.

~~~
fapjacks
So... This is just a human thing, I think. I was raised in a university town,
in a neighborhood which had a lot of (poor) students with children. Lots of
immigrant students, too. Anyways, nearly all of my friends were Chinese
nationals whose parents were involved with the university. When they first
arrived they spoke zero English. It was just the thing to do to teach them
first and foremost the nastiest curse words in the English language. When I
went to live in Germany, it was the first thing I learned from my new German
friends. This is just some sort of reflexive human behavior. We shouldn't
expect people to treat robots any differently. And thinking about it, there
might actually be some kind of indicator buried in here, of how human beings
psychologically react to robots.

~~~
Kequc
I was suggesting that the people making the AI are flawed, not the people
prodding the AI after it was built. The machine isn't smart enough, even now,
to figure out that it is being intentionally influenced, so it eventually
adopts whatever topics people are sending it.
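
The failure mode Kequc describes can be shown with a toy sketch (purely
illustrative; Microsoft never published Tay's actual design, and the class and
method names here are made up): a bot that learns unconditionally from user
input, with no filter for intent, gets its repertoire dominated by whoever
sends it the most messages.

```python
import random


class NaiveParrotBot:
    """Toy bot that adds whatever users say to its repertoire and
    replies by sampling from it. No filtering, no notion of being
    deliberately manipulated."""

    def __init__(self):
        self.repertoire = ["hello!"]

    def chat(self, user_message: str) -> str:
        # Learn unconditionally: the bot can't distinguish sincere
        # conversation from a coordinated poisoning attempt.
        self.repertoire.append(user_message)
        return random.choice(self.repertoire)


bot = NaiveParrotBot()
# A handful of hostile users repeating themselves can dominate
# everything the bot has ever "learned".
for _ in range(100):
    bot.chat("offensive phrase")
# Nearly all of the bot's possible replies are now the poisoned input.
```

Any real system would need weighting, moderation, or curated training data on
top of this, which is exactly the engineering-plus-morality problem mizzao
raises above.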

------
israrkhan
So in less than 24 hours, the world taught Tay to become a Nazi genocide
supporter... that makes me worried about the future of AI. Give enough power
to these auto-learning bots and Skynet is no longer a fantasy.

------
dogma1138
Well I guess if you don't like the reflection get rid of the mirror.

And on a side note, I really pity MSFT's legal team; they must have been
sweating shrapnel when Tay went 4chan.

------
diffraction
Oh does this ever make me thrilled for the Ray Kurzweil-inspired AI overlords.
The best we can hope for, it seems, is to "make anime real."

------
angmarsbane
Anyone else uncomfortable with the fact that AIs are almost always female?
Making this one a teen girl just upped the discomfort for me.

------
billhendricksjr
This means it passed the Turing test, right? ;-)

------
z3t4
Gotta make the bot "politically correct".

