
Google contractors reportedly targeted homeless people for facial recognition - jamesgagan
https://www.theverge.com/2019/10/2/20896181/google-contractor-reportedly-targeted-homeless-people-for-pixel-4-facial-recognition
======
username90
I remember the outrage when people discovered that Google's AI wasn't properly
trained on black faces. It makes sense that they try hard to avoid that
happening again by paying black people to let Google scan their faces. It is
not unethical to try to diversify your training data.

[https://www.telegraph.co.uk/technology/google/11710136/Googl...](https://www.telegraph.co.uk/technology/google/11710136/Google-Photos-assigns-gorilla-tag-to-photos-of-black-people.html)

Anyway, this part sounds outright illegal. It seems like it was just Randstad
being greedy, but if anyone from Google knew about it then it is bad, though I
doubt that they couldn't budget enough money to get the scans legally:

> They said Randstad project leaders specifically told the TVCs to (...)
> conceal the fact that people’s faces were being recorded and even lie to
> maximize their data collections.

[https://www.nydailynews.com/news/national/ny-google-darker-s...](https://www.nydailynews.com/news/national/ny-google-darker-skin-tones-facial-recognition-pixel-20191002-5vxpgowknffnvbmy5eg7epsf34-story.html)

~~~
jeromebaek
There is a premise in this deduction, which Randstad made: 1. We need darker
faces in our training data. 2. Therefore, gather training data from homeless
people.

How do you go from 1 to 2? With the premise "darker-faced people tend to be
homeless".

This is not necessarily a false premise -- statistically, it is true, and it
is a reflection of systemic injustice -- but the outrage is not whether it's
true or false; the outrage is that Randstad _exploited_ this painful fact.

~~~
fiter
In the article it said Randstad targeted homeless people because they were
less likely to talk to the media.

~~~
ehsankia
Also I'm assuming homeless people would be much happier about a $5 gift card
on average.

------
kauffj
Kudos to Google's contractor for offering this opportunity to the people who
need it most.

I would happily sell anyone a picture or scan of my face for $5. But I would
even more happily have that chance go to someone who needs it more than
myself.

This article also mentions that the contractor may have lied to or misled the
homeless, which is deplorable. But the behavior described by the title itself
is nothing objectionable. The fact that many will object is a phenomenon I've
seen called "Copenhagen Ethics": [https://blog.jaibot.com/the-copenhagen-interpretation-of-eth...](https://blog.jaibot.com/the-copenhagen-interpretation-of-ethics/)

~~~
tobib
> I would happily sell anyone a picture or scan of my face for $5. But I even
> more happily have that chance go to someone who needs it more than myself.

Would you really? My gut feeling tells me that's not the case for most people,
for privacy or ethical reasons. Just because those people are poor, we expect
them to have lower privacy or ethical standards.

The link you posted has the following example; I think that's what you're
referring to:

> BBH Labs was an exception – they outfitted 13 homeless volunteers with WiFi
> hotspots and asked them to offer WiFi to SXSW attendees in exchange for
> donations. In return, they would be paid $20 a day plus whatever attendees
> gave in donations.

That's completely different. Offering WiFi has zero long-term effects. It's
providing a "business opportunity" to people who wouldn't otherwise have
access to one. Giving someone 5 bucks for their face picture (or other
biometrics) is totally different and has long-term negative effects.

~~~
aantix
I don’t think most people care about privacy all that much.

Most live publicly with their faces on display for all to see and others
taking it a step further, participating in Facebook alongside billions of
others.

It doesn’t scream facial identity being a major concern.

~~~
JohnFen
I think most people actually care quite a lot.

That people live their lives accepting that their faces are on display is not
evidence otherwise, since there is literally no other option.

Participating in Facebook is also not evidence otherwise -- at most, it's
evidence that people are willing to trade privacy in some circumstances (and I
think even that's a bit of a stretch), but I'll bet that most Facebook users
would object to having their privacy invaded without their consent -- which
means they care about privacy.

~~~
aantix
> since there is literally no other option

Living remotely in the mountains/desert/jungle?

I say this with seriousness. When you consider this alternative, living alone
without human interaction, public identity shows its positive attributes.

~~~
JohnFen
Living in a remote location doesn't take away from the fact that you still
must spend at least some time in a public space.

------
SamBam
It sounds like there were three issues:

1. The contractor targeted homeless people

2. They targeted people with darker skin

3. They may not have been forthright or truthful about what they were doing.

Number 3 is clearly wrong. But so long as the contractors were upfront and
truthful about what they were doing, I don't know that 1 or 2 are
problematic.

The only argument I can see for why they shouldn't pay homeless people money
for an easy job is that the prospect of money might be so enticing that
they're willing to give up personal rights or freedoms (the same argument why
we don't allow selling of organs). But $5 neither seems high enough, nor the
process invasive enough, that this argument would hold water.

As for ensuring that enough of a sample range is in the database as an attempt
at avoiding data bias, this should be a no-brainer good thing.

~~~
aeturnum
If this report is correct it seems like they were targeting a vulnerable
population based on the logic that their vulnerability made them less likely
to insist on being treated better than google wanted to treat them. That seems
really bad to me.

If you're asking folks on the street and happen to get a lot of unhoused folks
because they're around, that's fine. Writing memos telling people to target
vulnerable populations because they're vulnerable is gross and deeply
unethical.

~~~
kapuasuite
You’re using words like “targeted” and “vulnerable” to imply there’s some kind
of harm being done here, which is laughably false. Just because something
strikes you as wrong (for whatever ridiculous reason) doesn’t mean it’s
actually wrong, let alone “gross and deeply unethical.”

Are osteoporosis researchers unethical for “targeting” women?

~~~
aeturnum
The word "targeted" appears in the document quoted in the article. I think
describing the homeless as a vulnerable population is a pretty uncontroversial
thing to say, but if you'd like I can only use language from the article.

It's gross and deeply unethical to "target" homeless because they "didn’t know
what was going on at all." Giving your subjects or customers incomplete
information, such as "characterizing the scan as a 'selfie game'," is also
clearly unethical.

~~~
kapuasuite
Feel free to point to literally any harm to these people - there isn’t any.
We’re talking about capturing faces in a public place where there’s no
expectation of privacy. What does “gross” even mean beyond “it offends my
sensibilities?”

~~~
aeturnum
Gross doesn't mean anything beyond "it offends my sensibilities." I also don't
know if the individuals in this situation feel harmed - nor do you, because
that's not how harm works. What I did say was that the actions were unethical
because they made discovering and communicating harms less likely.

Ethics[0] is not exactly concerned with material harms to people. You can
unethically help people and ethically harm them. That said, what is considered
"ethical" is generally based on ideas of what should promote the beneficence
of everyone in the situation. Choosing to engage /specifically/ with a group
who you think is unlikely to detect deception or tell others about your
interaction is unethical.

If you aren't sure why, here are the problems I immediately think of:

- If you are misleading people, it's likely that you're worried they will
perceive a harm (real or imagined) if they were fully informed.

- If you are seeking out people who are socially isolated because they are
unlikely to speak to others, it suggests you are concerned about the outcome
of them telling others about your actions. You can also see this in abusive
personal relationships where the abusive party will socially isolate their
victim.

- Both of these conditions (knowledge and social capital imbalances) make
understanding the impact of a relationship (positive or negative) difficult to
determine. It might be that no harm has been done, but the account in the
article suggests that the contractor went out of their way to create
conditions where, if the participants were harmed, they would not be aware of
it or would not be able to communicate it.

Lest you think this is all bleeding heart hand wringing, you can see these
same principles encoded in economics. Contract law has the notion of material
misrepresentation and there's lots of economic theory around the harms of
information asymmetry (you could also look into companies that are convicted
for material misrepresentation in advertising).

All of which I find gross.

[0] [https://www.merriam-webster.com/dictionary/ethic](https://www.merriam-webster.com/dictionary/ethic)

~~~
kapuasuite
Is it really so hard to imagine they lied because people wouldn’t want to
participate otherwise, regardless of any actual harm? There are plenty of
people out there who think it’s illegal to take their picture or record them
in public without their consent. “I didn’t consent to this” isn’t a trump
card.

Material misrepresentations are a big part of contract law because they almost
always, ya know, cause harm. Not too many people out there cooking the books
to make their company cheaper to buy, for example.

------
ehsankia
At least the Verge has the less clickbaity headline, mentioning that it was
contractors. The original source mentions Google in the headline, but the rest
of the article only refers to Randstad.

One part that is a bit confusing to me: the original source makes no
reference whatsoever to any consent form. Usually you can't collect this sort
of data without signed consent, and previous reports [0] do mention such a
form. I know most people don't read the form, but I'm curious how you can get
away with telling someone you're just playing a game and lying so much when
the form should clearly state what you're collecting.

Still, there should definitely be better vetting of contractors and stories
like this definitely look very bad, even if the intentions were actually to
help reduce ML bias.

[0] [https://www.engadget.com/2019/07/29/google-paid-for-face-sca...](https://www.engadget.com/2019/07/29/google-paid-for-face-scans-to-improve-pixel-4/)

EDIT: The original article does indeed mention and show a picture of the
"agreement".

~~~
xenocyon
IMHO an entity is ethically responsible for anything its contractor does,
unless there is reasonable clarity that the contractor acted in opposition to
the client's wishes. Having someone else do your dirty work doesn't make it
less dirty.

~~~
Kylekramer
If I order a pizza and the stoned teenager from the local joint runs over a
pedestrian, am I ethically responsible? I didn't tell them to not hit people
when getting my meat lovers to me and did say I wanted it ASAP.

~~~
CaptainZapp
The pizza joint hired the stoned teen, he's not contracted by you.

It's a pretty rotten analogy.

~~~
Kylekramer
The pizza joint hired the teen, I contracted the pizza joint to get me a
pizza. The teen did it in a shitty way that was hard to expect.

Randstad hired the people, Google contracted Randstad to do a job. Randstad's
people did it in a shitty way that was hard to expect.

It's a pretty fresh analogy.

~~~
JohnFen
I think the analogy is a poor one, but let's run with it anyway.

No, you are not responsible because the teen was not operating within the
terms of the contract. You did not authorize or ask him to get stoned or run
someone over.

~~~
privateSFacct
Did google authorize (or even think) that their staff would contract with a
company that would hire folks who would scan homeless people's faces in a way
that was improper?

~~~
JohnFen
I don't know. I do know that in these kinds of situations, the contract
typically spells out clearly what, exactly, the contractor is going to do and
how, though.

~~~
bduerst
Even though it is usually reviewed, the methods of data gathering are usually
not spelled out in the contracts themselves. Instead you have a distribution
of liability where the data broker agency assumes the risk of having gathered
and/or sold the data incorrectly to the purchaser.

------
TallGuyShort
A while ago Google got bad press because some image-tagging service identified
some people as "gorillas", and IIRC it was blamed on not having enough
diversity of skin color in the training data. So... it sounds like at least
the "instructed them to target people of color" part of this is them trying to
correct that. But in isolation that sounds even worse than the first instance.
I guess you're damned if you do, damned if you don't.

~~~
JohnFen
> I guess you're damned if you do, damned if you don't.

This is not a case of that.

Google (or its contractor) could easily have done this in a way that was not
objectionable. They simply decided not to.

~~~
kapuasuite
What would make it “not objectionable?”

~~~
kevingadd
For starters, telling the people what you're actually doing and maybe paying
them more than 5 bucks.

~~~
kapuasuite
You can argue that lying would be objectionable, but we’re talking about
something with essentially no consequences here. As for the fee, how much are
their faces worth beyond a fee they were obviously already willing to accept?

~~~
autoexec
> but we’re talking about something with essentially no consequences here.

Just because it's difficult to identify the harms caused by someone stealing
your biometric data that doesn't mean there are no harms. Gaining access to
someone's biometric data clearly opens them up to certain types of risks
ranging from identity theft to surveillance. Fraudulently gaining access to
someone's biometric data is wrong even if the data is never abused or
exploited.

------
kpx11
Next news: Google engineer sneezes in subway, google trying to infect people.

I mean there's no need to have Google's name in there, other than to
clickbait-trick people into viewing their subpar journalism with ads.

But it's kinda shitty to cheat people no matter what. Obviously you cannot
say "hey, the Pixel 4 is gonna have face unlock and I want your face scanned
for that," but the contractor should have done a better job.

------
DoreenMichele
There's a long, long history of leaving women, people of color, poor people
and other groups out of data sets. For example, I've read articles that
indicate we can't create good photos of people of color because film standards
were normalized to white skin.

So, try to fix that and... there's hell to pay?

File under: "No good deed goes unpunished."

~~~
mattnewton
I don't think there is anything wrong with the memo saying "we need more
people of color in our data set." I hope everyone agrees with that.

What seems to have been bad is the contractor misinforming people about what
data would be collected (and for what use), and it's not clear what Google had
in their contract to prevent that kind of unethical behavior. It is also very
questionable IMO to target the homeless "because they won't talk to the media"
which was allegedly in the instructions the contracting firm Randstad gave to
its workers.

Disclaimer: While I work as a low level employee at an unrelated team in
Google, my opinions are my own and do not represent those of my employer, and
this is the first I am hearing of this.

~~~
DoreenMichele
_It also seems bad to target the homeless "because they won't talk to the
media" or whatever the quote was._

To me, this just parses as "We have some new Politically Correct excuse to
exclude poor people from our dataset."

Being so unimportant that the world wants you to remain invisible isn't
generally a good thing.

There is always some excuse. There is no condition under which it is
sufficiently respectful, politely handled, blah blah blah to be A Good Idea.

------
gerash
This seems like fake outrage to me. If I were homeless I'd be more than
happy for someone to take a photo of me for $5. In fact, I'd find it pretty
hypocritical if someone was spending their energy fighting against possible
infringement of my rights in such a scenario instead of actually providing me
with money or food.

~~~
autoexec
> If I were a homeless I'd be more than happy for someone to take a photo of
> me for $5.

The problem isn't that they were offering money in exchange for photos of
homeless people; it's that they were tricking homeless people into giving up
their biometric data by telling them they'd pay them $5 just to play with a
phone for a few minutes.

If they were honest about what they were taking and why I wouldn't have a
problem with it.

------
l_t
> _Google and Randstad didn’t immediately reply to requests for comment._

The content of the article is interesting enough, but this line at the end
caught my attention.

Is it reasonable to expect someone to "immediately reply" before you publish
the article? Because that doesn't sound like ethical journalism to me, unless
I'm misunderstanding the meaning of "immediately" in this context.

~~~
ry_ry
Doesn't matter, I suspect - It works within the narrative and implies they
have something to hide. It's good /tabloid/ journalism, and poor investigative
journalism.

Randstad are very much the former.

~~~
kevingadd
Historically if you wait multiple days / weeks to give them a chance to reply
they just do an end-run and publish puff pieces in major outlets to try and
defuse your article. There are multiple cases of this in the past year.

It's either investigative journalism or it's not. How long you wait for
comment has nothing to do with that. Do you really think this is equivalent to
tabloids posting faked photos of some movie star's belly?

~~~
ry_ry
That's a disingenuous comparison.

How long did they wait before publishing? We don't know, and it doesn't
matter. Simply stating that they didn't respond had the desired effect, and,
as you rightly pointed out, does nothing to defuse the story. The implication
of the story is that not only is Google potentially taking advantage of
vulnerable people to further an unspoken, morally grey agenda, but it may
also have a racially questionable angle.

Alternately, they wanted to train their facial recognition dataset with
certain characteristics on the cheap.

That in itself is interesting, but probably wouldn't get as many clicks. It's
bottom drawer "I leave you, dear reader, to draw your own conclusions" stuff.

------
jonas_kgomo
This is too desperate. It seems AI will help in the criminalization and
biased profiling of people of color; as a person of color, I find it really
hard to imagine a future where justice is done with due diligence.

Joy at Media Lab has been looking at this issue for a while and advocating for
balance. [https://www.technologyreview.com/s/612775/algorithms-crimina...](https://www.technologyreview.com/s/612775/algorithms-criminal-justice-ai/)

Also, I find it weird that Nvidia was able to simulate realistic-looking
people last year while Google is struggling to find humans. Can't they use
that as ground truth?

------
willwashburn
The first thing they teach you about research is: do not do it on a
vulnerable population. It's not like people going public would matter; the
Pixel 4's features and hardware all leaked way before the announcement.

------
andrerm
> “They [Google contractor] said to target homeless people because they’re the
> least likely to say anything to the media,” the ex-staffer said. “The
> homeless people didn’t know what was going on at all.”

------
curiousgal
How dare they give $5 to homeless people instead of college students walking
around campus with airpods.

~~~
journalctl
Yeah, usually you have to pay at least minimum wage to exploit college
students.

------
microtherion
Life imitates video games:
[https://www.youtube.com/watch?v=BZ6TuxmgJN0](https://www.youtube.com/watch?v=BZ6TuxmgJN0)

------
kapuasuite
This really counts as “extreme and unsavory” these days?

------
rootsudo
Same thing Panasonic does in Japan to improve facial recognition, except they
just pay foreigners 5000 JPY and tell you what they're doing.

------
jdkee
Google seems to be outsourcing their ethics.

------
olalonde
Does anyone know why they were giving gift cards instead of cash? I suppose
there's a legal reason.

~~~
Notorious_BLT
I can think of several reasons that homeless activism groups would suggest as
reasons to use gift cards instead of cash. Most of them are pretty evident if
you consider what kinds of goods are only available through cash-only
purchases.

------
sarcasmatwork
Selling one's privacy for a $5 Starbucks G/C. They know their gullible
audience!

------
NN88
This is blatantly unethical.

Expanding your database? Great

Forgetting situational ethics? Disgusting

------
acdc4life
Why doesn’t the model trained on one race generalize to different races? That
sounds inferior to human vision.

~~~
kllrnohj
"All X's look the same" where X is any of [Asians, Black people, white people,
etc..] is a very common refrain.

Human eyes also need to be trained on diverse data. It's the cross-race
effect: [https://en.wikipedia.org/wiki/Cross-race_effect](https://en.wikipedia.org/wiki/Cross-race_effect)

~~~
acdc4life
The wiki post mentioned that people are right 45% of the time. How do DL and
ML stack up? Also not mentioned: how long does it take for humans to start
recognizing new races, vs. ML and DL?

------
mgraczyk
This strikes me as the ethically best possible way to collect this data.
Google is paying people who need the money for something simple and completely
harmless.

The main counterargument appears to be that those who sold data "didn't
understand what was going on". It's hard to imagine moral convictions in which
someone could consistently argue that the homeless don't understand money in
exchange for photos, but it's acceptable to leave them to fend for themselves
on the street.

Google is, at worst, helping people who need help.

~~~
rmsaksida
> This strikes me as the ethically best possible way to collect this data

"a contracting agency named Randstad sent teams to Atlanta explicitly to
target homeless people and those with dark skin, often without saying they
were working for Google, and without letting on that they were actually
recording people’s faces"

How can this be the most ethical way to collect data?

The problem isn't in acquiring facial recognition data from homeless people,
but in mischaracterising the nature of the experiment when doing so. If the
reporting is accurate, they lied to vulnerable people and tricked them into
selling their data for cheap.

Companies can't go around hustling people into giving away their private
information. It doesn't matter if you think this is "for their own good", a
homeless person may want to refuse being catalogued by Google for a variety of
reasons.

