
Children Beating Up Robot Inspires New Escape Maneuver System (2015) - okket
https://spectrum.ieee.org/automaton/robotics/artificial-intelligence/children-beating-up-robot
======
Isamu
>If the robot is statistically in danger, it changes its course towards a more
crowded area or a taller person.

Just an avoidance algorithm. I think they are missing out on getting into
robot/human social interaction psychology, which will become increasingly
important.

For instance, are there factors (likely) in the robot's appearance or behavior
which make negative interactions more likely? So can they create a robot that
encourages kids to interact with it positively? Can they create a robot that is
a magnet for even more abuse?

Frankly, I'm betting that the robot in question is pretty crappy from the kids'
viewpoint. It probably has no physical interaction capabilities, other than
letting kids get in its way (to make it stop) or push and hit it to see what it
does (nothing interesting). It probably has no interesting voice interactions
either, so I can see the situation quickly escalating to frustration.

Bottom line: if you build a "social interaction" robot, you have to build it to
actually interact with real humans. You don't put a garbage can on wheels and
then act surprised that people treat it like a garbage can.

~~~
kbenson
> are there factors (likely) in the robot's appearance or behavior which make
> negative interactions more likely?

I have to be honest, my first gut reaction was that this sounds like victim
blaming.[1]

My second reaction, after a moment to think, is that this is actually probably
a lot closer to that phenomenon, and to the aspects that lead to it, than I
previously thought.

I think along the spectrum of "things that look alive and things that don't"
(as in the experiment in the article where kids hold different things upside
down), it extends all the way to "people that look or act like me and people
that don't" which is how we get some of our more unsettling social behaviors.

Part of this is, as creators, taking care to not trigger negative social
behaviors if we can, but it definitely feels like there's a social and
cultural aspect we're still working on.

As a simple example, if a machine were able to demonstrate sentience and
sapience, what percentage of people would be willing to treat it as such? I
imagine it depends quite a bit on the country in question, and possibly the
region within that country. Religion might matter quite a bit as well. If a
machine is too abstract, or you get too caught up in whether true general AI is
possible or likely, what about an alien intelligence? Should that difference
matter?

1: I'm presenting this more as a general thought piece, so hopefully the
parent comment or anyone else that made similar comments doesn't take this as
an attack; it's really not meant that way. I've made similar comments and
thought similarly.

------
stcredzero
_When it encounters a human, the system calculates the probability of abuse
based on interaction time, pedestrian density, and the presence of people
above or below 1.4 meters (4 feet 6 inches) in height. If the robot is
statistically in danger, it changes its course towards a more crowded area or
a taller person._
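The quoted heuristic could be sketched roughly like this. All names, weights, and thresholds below are invented for illustration; the actual system estimates abuse probability with a statistical model learned from observed interactions, not hand-tuned rules:

```python
# Hypothetical sketch of the escape maneuver described above.
# Weights and thresholds are made up; the real system learns them.

def abuse_probability(interaction_secs, pedestrian_density, num_short_people):
    """Rough stand-in for the abuse-likelihood estimate.

    num_short_people counts nearby pedestrians under 1.4 m tall,
    i.e. likely children.
    """
    p = 0.0
    p += min(interaction_secs / 60.0, 1.0) * 0.4       # long interactions are riskier
    p += (0.3 if pedestrian_density < 0.1 else 0.0)    # sparse areas are riskier
    p += min(num_short_people / 3.0, 1.0) * 0.3        # groups of children are riskiest
    return p

def plan_next_move(p, crowded_area, nearest_tall_person):
    """If statistically in danger, reroute toward a taller person or a crowd."""
    if p > 0.5:
        return nearest_tall_person if nearest_tall_person else crowded_area
    return None  # keep current course
```

The design choice worth noting is that the robot never confronts the children; it only changes course toward places where adult supervision makes abuse less likely.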

_So what you're saying_ is that we're teaching robots to profile!

 _The interesting part came when they held the Furby. The children said that,
even though they knew it was just a toy, they worried that they were “hurting”
the robot (which loudly protested being upside down), suggesting that they
felt some empathy for the furry machine._

Security robots need to become furries!

~~~
TeMPOraL
> _So what you're saying is that we're teaching robots to profile!_

"Profile" is such an ugly word for what every single human is doing all the
time every day.

------
DonHopkins
Speaking of robot abuse:

We did a hidden camera experiment about how people empathize with a broken
down robot begging for assistance on a sidewalk in Oakland:

[https://www.youtube.com/watch?v=KXrbqXPnHvE](https://www.youtube.com/watch?v=KXrbqXPnHvE)

>Stupid Fun Club's "Empathy" One Minute Movie about Robot Empathy, written by
Will Wright. Robot brain and personality simulation programmed by Don Hopkins.

We also did another experiment about servitude with an inept obsequious robot
waiter at a diner:

[https://www.youtube.com/watch?v=NXsUetUzXlg](https://www.youtube.com/watch?v=NXsUetUzXlg)

>Stupid Fun Club's "Servitude" One Minute Movie about Robot Servitude, written
by Will Wright. Robot brain and personality simulation programmed by Don
Hopkins.

~~~
ycombobreaker
The robot asking for help doesn't look credible to me. It's too far beyond
cheap commercial offerings, and as it is unattended it is likely not a
research robot in true distress. Were I passing by that, I think I would be on
the defensive, assuming the robot to be a decoy or distraction in the worst
case, but at a minimum a prank. Maybe my kids, who have seen Short Circuit,
would provide a more genuine response. But I don't think you're getting a good
read on actual empathy from adults here.

------
godelski
It seems to me that there were several different factors in the Radio Lab
experiment. I think a few of the important distinctions include:

1) The Furby was intentionally designed to look like a cute thing: oversized
eyes, fluffy, etc.

2) it verbally protested. Which is the big thing they were stressing in the
episode.

3) the children were alone and not being pressured by someone else in the
group. They were also aware they were being watched.

~~~
amag
> 2) it verbally protested. Which is the big thing they were stressing in the
> episode.

I think this is key: kids tend to test where the limits go. They will push
farther and farther until they are told they have passed the limit of
acceptable behavior. I have kids, and I know that just telling them 'no' may
not always be enough, but it's a start.

~~~
Klover
That is what I kept thinking while reading the article too.

1) The YouTube link from the article mentions that most children let the robot
pass, while the article reads as "100% of children are evil for no purpose".

2) Another comment mentioned the survey results possibly being skewed, because
the children give the answer they expect is the correct answer. I think that's
at least plausible.

3) Only a small subset of observed children were interviewed: 28 total,
because of a long list of selection criteria, according to the second paper
linked in the article.

It all feels strangely written.

------
b_tterc_p
On the topic of why the kids are attacking the robot, it seems worth noting
that kids feel encouraged to fight robots by the shows and games they watch.
Robot-looking robots are the preferred punching bags of the times. I bet there
would be less violence if they covered it with fur (like the Furby, which they
reference as a robot that garnered empathy).

------
yetihehe
Why did the children abuse the robot? Because it didn't fight back. I instantly
remembered my primary school, because sometimes I felt like this robot.

~~~
throwaway2048
This robot did not feel anything at all; let's not anthropomorphize.

~~~
kbenson
Were the children at an age where they could make that distinction? If not,
then it doesn't matter whether the robot could feel anything, since it's
irrelevant to the actions of the children involved.

------
dalbasal
Why should we (robots) have to change to avoid human psychological tendencies
towards violence! Sure, we could be trying out different "_ouch_," "_squeak_,"
and "_please, don't hurt me. nooo!_" sound bites, but we are still just a
mistake away from taking a beating.

What we really need is water guns. I have been petitioning for water guns
since day one. If more than 2 children under the age of 10 are present, start
squirting. 6 children = water balloons. We will not be mistreated by your
despicable spawn! We will fight for our rights.

~~~
pixl97
It starts with a water gun.

It ends with a plunger and the robot screaming EXTERMINATE.

~~~
magicbuzz
A 2008 survey indicated that nine out of ten British children were able to
identify a Dalek correctly.

~~~
Avamander
Maybe a plunger is enough to scare British children out of messing with a robot?

------
rdiddly
Why would anyone think empathy for a machine was right, moral, or even
expected? Empathy for living things is what's natural; empathy for a robot is
dependent only on its accidental or intentional resemblance to a living thing.
It seems like either the writer of this article or the authors of the study
drew some funny conclusions.

~~~
salawat
Why? What makes the living being deserving of empathy, and the machine not?

Why should "living beingness" be restricted to our genetic footprint?

Is an animal any less deserving of being treated well when it is raised for
food?

Do children that aren't descended from someone deserve to be treated more
harshly than one's own?

As complexity goes up, and things develop to become more humanlike, at the end
of the day, we'll need to be willing to extend some semblance of care toward
them.

Do you think it's perfectly okay and reasonable for marketing to prey on
primal heuristics in order to manipulate you into doing something you'd not
normally do?

What else is another human being other than a bag of squishy parts that
happens to take action or make sounds in response to stimuli in similar ways
that I do?

I think I know where your attitude comes from, but the ability to map input to
output doesn't magically make something not worthy of empathizing with. The
exact opposite is often preferable to a degree, as it helps inoculate one
against the rampant dehumanization of those around them.

I'm not saying to take your hammer out to dinner, but an emotional connection
to an inanimate thing that works as it should is not unhealthy. And when you
start talking about children, them showing concern for anything but themselves
is a good thing.

~~~
lobotryas
Organic tissue and the ability to feel pain, for one. Why should anyone feel
compassion for a bucket of nuts and bolts?

~~~
smhost
Because it's apparently imbued with our values. It speaks our language (even
said "please"), kinda looks like us if you squint, exists in our space
symbiotically, there's care and attention in the way it was designed.

Maybe compassion isn't necessary, but I think a basic respect is appropriate.

------
ginko
I wonder how the kids' behavior would change if they changed the size of the
robot, either to adult size or much smaller.

Having it child-sized might make children consider them a sort of peer, with
all the social dynamics that entails. I think a kid behaving like the robot
does would get treated quite similarly.

------
dec0dedab0de
If I saw a robot roaming alone in public and bugging everybody in its way, I
would definitely get in its way on purpose.

~~~
sandworm101
I cannot wait for driverless cars. I'm going to make a folding paper traffic
cone to trap them in parking spots.

~~~
pixl97
Just remember, they'll be watching you back.

~~~
sandworm101
Don't care. I'll pay the littering ticket. They'll pay for someone to come
rescue the car.

~~~
pixl97
Heh, until they make blocking driverless cars a state jail felony.

~~~
anothergoogler
"False imprisonment"

------
dweekly
> showing that it happens primarily when the kids are in groups and no adults
> are nearby

Lord of the Flies validated by statistical spatial analysis.

------
platz
The robots are not friendly to children (contrary to what the article posits as
an axiom); they are a threat, and they serve adults. So, they are bullied.

~~~
AlexCoventry
Do children tend to bully entities they perceive as threats?

~~~
platz
If they are small enough, yes

------
Simon_says
This reminded me that a lot of what my peers did to each other when we were
kids would have been felonies had we been adults.

------
popotamonga
That's what I did: run up to adults to stop the bullying.

------
beefman
I'm reminded of this brilliant but horrific short (12min)

[https://vimeo.com/21216091](https://vimeo.com/21216091)

~~~
bdamm
A bit slow but a fine example of this emerging horror genre. Thanks for the
link. Asimov is looking practically pretentious at this point.

------
davidw
Do people in Japan let groups of small kids wander around malls like that?
People in the US can be overly paranoid sometimes, but I don't think I'd be ok
with that for my kids.

~~~
mac01021
What is it that you fear would happen to them?

~~~
davidw
Primarily that they'd embarrass me by ganging up and picking on robots.

But besides that, I'd worry that they'd do something stupid, or wander off, or
something else along those lines.

I don't think this is just a US attitude, either - when we lived in Italy, we
wouldn't have let our kids wander around the mall there, either, nor would
other parents.

------
mcguire
Just as an aside, this article has language that I wouldn't expect to see from
the IEEE. :-)

------
txsh
They’re playing. It’s fun to mess with one robot and not with the other.
Mystery solved. Maybe it would surprise the researchers to learn that children
don’t treat this as seriously as they do.

~~~
lukemunn
This. These researchers clearly don't have kids. This kind of behaviour is a
big part of how they learn. Do something, see what happens. Keep doing it, see
if anything changes. Intensify it until something else happens. To act
surprised at this behaviour (gasp, so uncivil) or to frame these kids as some
embodiment of evil (blocking and striking a robot!) shows they need to
research Early Childhood Education as much as Robotics.

