
Google Has a Striking History of Bias Against Black Girls - bryanrasmussen
http://time.com/5209144/google-search-engine-algorithm-bias-racism/
======
sctb
All: we're aware that this article may have an ideological slant that is
either positive or negative depending on your own views, but it also has
interesting content. Please react to the factual information and steer well
clear of any ideological yay-nay combat.

------
labster
I think the thesis is wrong here. Google has a striking history of removing
bias against black women.

People make content for the web. People are racist, and make lots of racist
assumptions in their writing. Society is racist. This results in Google's
algorithms reflecting the corpus they scanned, and the searches that are made.
But the search is just a reflection of us, and how terrible we all are.

Google gets embarrassed by the results, and exercises editorial control over
search -- which leads to them actively removing racist systemic biases. Sure
it's reactive and not proactive, but it's movement in a positive direction.
Google likely spends more money on editorial control for issues of racism than
for any other thing that doesn't make money.

~~~
danso
I agree that there's too much blame-the-algorithm sentiment here, but if I had
to guess, I'd think the author would argue that these racist biases should and
could have been caught before going into production -- i.e. it doesn't have to
be an inevitable cycle of ad hoc public shaming and repentance. These biases
could be identified just as other SERP problems are identified before each
algorithm update.

Just like the HP "racist" webcams [0] might have been caught had the CV
trainers been more aware of what is/isn't in the training data, and/or if HP
had a few more testers of darker complexion.

And perhaps Facebook's "Year in Review" rare tendency to be "cruel" [1] would
have been mitigated with a broader group of devs and testers.

[0]
[http://www.cnn.com/2009/TECH/12/22/hp.webcams/index.html](http://www.cnn.com/2009/TECH/12/22/hp.webcams/index.html)

[1] [https://meyerweb.com/eric/thoughts/2014/12/24/inadvertent-al...](https://meyerweb.com/eric/thoughts/2014/12/24/inadvertent-algorithmic-cruelty/)

~~~
naasking
> I'd think the author would argue that these racist biases should and could
> have been caught before going into production

I think people who believe this significantly underestimate the difficulty of
solving these problems.

~~~
danso
I think that's likely. But the people tasked to solve these problems aren't
always given the right priorities. Detecting/classifying the faces of dark-
skinned humans is inherently more complicated than it is with light-skinned
humans. It requires significantly more training data, more computational
power, and more testing time to reach the same accuracy as a light-skin-only
camera and classifier.

We would deride Uber's self-driving vehicles if it turned out their systems
can't tell the difference between harmless (plastic bags) and dangerous
(jaywalkers, wildlife) objects on the road. Likewise, we can judge a company
for releasing a general consumer webcam that fails to fully function for
10-15% of the American population.

~~~
pixl97
>Likewise, we can judge a company for releasing a general consumer webcam that
fails to fully function for 10-15% of the American population.

But then you get into an even deeper question 'are minorities/blacks 10% of
said consumers of products?'. Poverty, for example, could mean that the
product doesn't work for 5% of the buyers, which then raises the question 'how
much effort do you put in fixing the problem for a small percentage of the
buyers?', especially when the amount of effort is going to be very large/time
consuming/expensive. If you are a person looking for racism, you will judge
google as racist. If you are looking at it from a position of a business
attempting to make a profit on a product, you will not see it as racist.

~~~
sangnoir
> But then you get into an even deeper question 'are minorities/blacks 10% of
> said consumers of products?'.

Maybe run that question by legal first! Protected classes exist for a good
reason (IMO). What percentage of a restaurant's business is from people in
wheelchairs? Yet the law mandates ramp access.

------
yongjik
This is a much harder problem than it looks to a casual observer. Just imagine
what kind of people would type "black girls" to Google search. Not "black
girls outfit", not "black girls social bias", just "black girls".

I'll bet at least 50% of the searchers are looking for porn.

So even if Google had infinite resources, its choices are: (1) act as a moral
authority and deny pornographic contents to those who are looking for them, or
(2) show pornographic results to people searching for innocent stuff.

...Or maybe (3) build a perfect profile of every user and just show them what
they want. But do we want to go there?

(You might be thinking that "black girls" is such an innocent term that Google
should be able to determine that it's non-porn, but then the question merely
shifts to less popular queries.)

~~~
na85
I wish the article had included screen shots of search results for "white
girls" and "Asian girls".

Would there be any less porn? I strongly doubt it.

~~~
treis
"white girls" returns stuff about the movie white girl

"Girls" is the HBO show

"Asian girls" is porn lite

"Black girls" has no porn for me. Various sites about black girls (black girls
code for example)

~~~
brighteyes
It depends on your search history, actually.

I tried "black girls", "white girls", "asian girls" - and also "asian boys"
etc. - in various accounts, some of which were used when browsing porn (i.e.,
home accounts vs accounts used at work), and in both google and bing.

In accounts where porn was browsed in the past, porn results were over 50% for
all of the above. In others, porn was absent.

~~~
ScottBurson
Excellent research! You should post this as a root comment so it has a chance
of rising to the top.

------
btilly
Machine learning systems are good at finding the unconscious biases of large
populations. Any change to force notions of political correctness on top of
that is very complicated, and makes the algorithm work worse for its intended
purpose. Literally, failing to accept the discrimination inherent in the data
loses you money directly or indirectly.

This is well-known, and not Google's fault. Getting offended at Google for a
reality they didn't create and can't control seems silly to me. If you don't
like the result, work to change the incentives. Or work to change people's
biases.

This kind of problem is not new. For example insurance companies have long
known that where you live affects how much they are likely to pay out. If they
base insurance rates on the data, the result is that your zip code becomes a
bigger determinant of insurance rates than your driving record. Which results
in very large effective racial discrimination. There are laws limiting that,
for example Prop 103 in California. However it is an eternal struggle because,
in fact, competing insurance companies have a good motivation to have the cost
of insurance reflect their best estimate of the expected cost of insuring you,
and their cost really is a lot higher if you're in a "black neighborhood".
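To make the zip-code effect concrete, here's a toy multiplicative rating sketch (every factor name and number is invented for illustration): when the zip-code multiplier has a wider range than the driving-record multiplier, where you live dominates the premium.

```python
# Toy premium model (all factors and numbers invented): premium is the
# base rate times a zip-code multiplier times a driving-record multiplier.
def premium(base, zip_factor, record_factor):
    return base * zip_factor * record_factor

base = 1000.0
zip_factors = {"low-claims zip": 0.8, "high-claims zip": 2.0}   # wide range
record_factors = {"clean record": 0.9, "one accident": 1.2}     # narrow range

# A clean-record driver in a high-claims zip pays more than a driver
# with an accident on their record in a low-claims zip:
clean_bad_zip = premium(base, zip_factors["high-claims zip"],
                        record_factors["clean record"])
accident_good_zip = premium(base, zip_factors["low-claims zip"],
                            record_factors["one accident"])
print(clean_bad_zip, accident_good_zip)
```

The inversion happens purely because the zip multipliers span a wider range than the record multipliers, which is exactly the situation the data-driven approach produces.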

------
mlazos
I’m always amazed how often the “surprise” that digital technology is in fact
not unbiased and robotic in nature keeps getting written about. Algorithms
that optimize for click through rate for ads and search relevancy will
inevitably be a reflection of ourselves and all of the shitty things various
people believe.

>> What we know about Google’s responses to racial stereotyping in its
products is that it typically denies responsibility or intent to harm, but
then it is able to “tweak” or “fix” these aberrations or “glitches” in its
systems.

Only Google knows the fate of the code base that has workarounds for all of
these edge cases. I think the author is wrong that all of these edge cases
could’ve been caught beforehand. Some sure, but I think others would always
slip through.

~~~
Hyva
> I’m always amazed how often the “surprise” that digital technology is in
> fact not unbiased and robotic in nature keeps getting written about.
> Algorithms that optimize for click through rate for ads and search relevancy
> will inevitably be a reflection of ourselves and all of the shitty things
> various people believe.

What you just observed and described is that technology _is_ unbiased and
robotic. The bias is in the humans. You can't remove it with a software
bandage; the software is working as intended.

~~~
jancsika
> What you just observed and described is that technology _is_ unbiased and
> robotic.

Just to get calibrated: what is an example-- real or imagined-- of technology
that _is_ biased?

~~~
PeterisP
For example, a search algorithm with an override that removes all links of
sexual nature for certain hardcoded queries (say, "black girls") could be an
example of a biased technology.

If I had to make a definition, then it would probably be a technology that
somehow encodes an assumption that doesn't match reality. In this context,
saying that "black girls" is an innocent search term and users searching for
it don't expect to get sexual results is such an assumption - it would be
polite, politically correct and possibly socially desirable; but it seems
likely this is simply not true in the reality we live in.

~~~
sokoloff
I had a person on my team preparing a presentation and for whatever
(reasonable) reason was looking for a stock image of a rabbit in winter with
frozen precipitation around.

Just as our (very straight laced) CIO walked up behind, employee hit image
search on "snow bunny". Hilarity ensued...

------
taurath
It seems like the crux of the argument is that being reflective of the
culture also reinforces it - the fact that, say, there aren't as many stock
photos of 3 black teens while there are a lot more "mug shot" style photos is
probably reflective of the usual portrayal in media, but most people generally
agree that the culture shouldn't be that way.

The question is, should search algorithms show what currently is (and
therefore reinforce what is), or what the culture aspires to (which in fact
more accurately matches the culture in terms of desire and movement)? I think
this is a pretty smart example of how the technology we use every day can be
passively malevolent - creating hurdles for change rather than enabling better
information transfer.

~~~
cat199
> should search algorithms show

radical thought (and goes for social media as well) -

how about making the results tunable based on user preferences & filters?

oooh wow.. user control! what a concept..

~~~
taurath
I think that might make for a more powerful and useful search engine for sure.
A Universal perspective as we've seen over the past few years simply doesn't
exist - a search engine equivalent of stating a perspective would be
interesting to think about. We already have location-based services, but we
don't have cultural perspective based services. News media (Huffpost, Gawker
etc) seem to be filling this role but only for news.

------
caconym_
> Images of white Americans are persistently held up in Google’s images and in
> its results to reinforce the superiority and mainstream acceptability of
> whiteness as the default “good” to which all others are made invisible.

I think that implying intent on Google's part is going way too far.

Obviously search results, to a large extent, are going to reflect the society
and culture of search engine users. If our society and culture are shitty,
should search results pretend otherwise? I'm pretty sure that our society and
culture are rife with institutionalized racism and sexism and a whole lot of
other badness, and badness of search results seems to me to be a symptom
rather than a cause.

------
komali2
I like the underlying question that the author brings up - what's google's
algorithm, and who's responsible for it?

In the old days it was "most linked page with those keywords," right? So if
you searched "gay man" and got a bunch of gay porn, is it necessarily google's
fault that the internet is more gay porn than it is resources for gay men to
discuss LGBT issues? Is it google's fault that news agencies more typically
report on crime by black people than white (or that black people are more
likely to be convicted/arrested in the first place)?

I think Google _is_ responsible in 2018, now that they offer extremely
tailored results. Duckduckgo, maybe not. But if Google is going to say "we're
going to show you listings for restaurants in SF in your simple 'restaurants'
search because we know you live in SF," I think they should also say "we're
going to show you results for LGBT resources when you search for 'gay men'
because nothing about your query indicates a desire for pornographic content."

~~~
ryanmonroe
>I think Google is responsible in 2018, now that they offer extremely tailored
results. Duckduckgo, maybe not. But if Google is going to say "we're going to
show you listings for restaurants in SF in your simple 'restaurants' search
because we know you live in SF," I think they should also say "we're going to
show you results for LGBT resources when you search for 'gay men' because
nothing about your query indicates a desire for pornographic content."

Are you saying that, in a scenario where there are more results for one
sub-query than for another, Google has an obligation to assume someone is
searching for the sub-query it finds more politically acceptable? This seems
like a bad idea to me. We should want to _reduce_ the political/social
influence of Google, rather than think of ways to give them more tools of
influence and assuming they will have a positive effect.

These two things are completely different:

"Because you live in SF, if you search restaurants you probably want results
pertaining to restaurants close to SF"

"Because subject X-A is socially unacceptable, if you search for subject X we
will only show results pertaining to subject X-B"

~~~
narrator
Political correctness and affirmative action are based on the idea that
pretending the world is an ideal version of itself will eventually make it
that way.

In reality though, the world is a complex place and the ideal pretend version
can only be implemented in a carefully controlled environment such as a movie,
a theatrical presentation, a video game or a situation where all participants
agree to or are pressured into behaving in a way in which that pretend world
is real. When the rest of the world leaks in, it is impossible to maintain
that pressure on everyone.

About the best we have so far is "safe search" since pornography and obscenity
follow fairly regular patterns. Political correctness though is constantly
evolving and requires a trained academic to determine if each piece of
information should be censored or allowed. Ideally in the future we'll have
google glass and advanced AI classifiers constantly scanning our vision and
providing us with tape delayed audio of the outside world in order to present
the world to us and filter any politically incorrect content 24/7.

For now we have to do it ourselves and ignore any inconvenient facts that may
make maintaining the illusion difficult. This constant struggle to deal with
cognitive dissonance can be tiresome and one should restrict oneself to
heavily moderated news feeds and not using search features for now. To create
the carefully controlled virtual world in an interface to the rest of the
world such as Google is an overwhelming task because you would need a highly
sophisticated trained AI agent to maintain that illusion and to even invent
information, such as lists of Nobel award winners from disadvantaged peoples,
in order to maintain the illusion.

~~~
jacobreg
>Political correctness and affirmative action are based on the idea that
pretending the world is an ideal version of itself will eventually make it
that way.

That is the opposite of how it works. Affirmative action is accepting that the
world has bias against certain groups, and attempting to take that into
account. Political correctness is about realizing that certain groups have
been marginalized or traumatized and some words perpetuate that. In a perfect
world neither are necessary.

~~~
narrator
The proof that a group has been marginalized is that their numbers are not
representative of their presence in the population. Thus, we pretend that the
ideal world exists by adjusting those numbers in situations where merit
selection is the criterion, in the hopes that the numbers will converge by
themselves. We don't interview each participant to ask them how they were
traumatized, we only look at the statistics. People from strong well adjusted
minority families that value education and have protected their kids from most
of the harsh realities that poor minority families are subject to receive the
same special treatment with affirmative action. That the reason the inequity
exists is immaterial means that we don't even have to check whether a bias
exists; we just assume that since the world is not ideal, the numbers need to
be fixed first and then the rest of it will follow.

At least that's the way it works in reality. Maybe this reality is politically
incorrect too and you are saying that we need to implement meta-political
correctness in which the means of creating an ideal world must be hidden to
instead pretend that the implementers know the exact and particular
circumstance of each person they are selecting based on facts other than merit
and are weighing all of them appropriately in correcting injustice. By
pretending that this ideal world, in which knowledgeable administrators
skillfully correct injustices in each individual case, already exists, it will
somehow become a reality.

~~~
tanilama
[https://www.theatlantic.com/education/archive/2017/08/why-me...](https://www.theatlantic.com/education/archive/2017/08/why-men-are-the-new-college-minority/536103/)

Statistics show that men are the new minority on campus. But it would surely
be frowned upon by some AA activists if such a program were extended to cover
men. If AA does what it is said to be trying to accomplish, then that
shouldn't be controversial at all.

~~~
komali2
>But it would surely be frowned upon

I think this is presumptuous. We can try to find out, why not?

------
politician
Google search results reflect what's in use on the web via PageRank and modern
PageRank-like approximations. Google and its human staff do not apply human
intellect to the problem of sorting, picking, and ranking each search result.
Moreover, Google has included "Safe Search" features for several years that
would remove these types of results from search.

Before we condemn Google for the collective actions of the broader Internet
community of websites, we should take a moment to understand this point.

Now, it is possible to create a search engine curated by humans or filters
that remove objectionable content, and perhaps that approach might be
desirable to some - a "MyGoogle" experience with search results customized
based on Google's interpretation (or explicit settings) of the visitor's
demographics, political beliefs, trigger words, and purchase history.

~~~
jacquesm
> Google search results reflect what's in use on the web via PageRank and
> modern PageRank-like approximations.

PageRank is only a fraction of the total inputs to Google search. Once
PageRank was described it was subject to being gamed at such a scale that it
essentially destroyed the web, ironically making Google that much more a
necessity.

> Google and its human staff does not apply human intellect to the problem of
> sorting, picking, and ranking each search result.

Yes they do. By tweaking the weights of certain output categories (such as
evidenced in the article, a clear drop in the number of pornographic or semi-
pornographic results as a result of such a tweak) there is a large amount of
influence exerted on the results.

> Moreover, Google has included "Safe Search" features for several years that
> would remove these types of results from search.

Safe search always was a weird one: You'd expect the opposite, a 'smut search'
(Tom Lehrer would have a field day with that one).

> Before we condemn Google for the collective actions of the broader Internet
> community of websites (and the author's step of disabling Safe Search), we
> should take a moment to understand this point.

Before we invalidate the author's point by handwaving and purposefully
injecting chaff into the conversation, let's try to understand the actual
point they are trying to make: that a query for an innocent term such as
'black girls', even with safe search off, should not result in a bunch of
porn.

> Now, it is possible to create a search engine curated by humans or filters
> that remove objectionable content, and perhaps that approach might be
> desirable to some - a "MyGoogle" experience with search results customized
> based on Google's interpretation (or explicit settings) of the author's
> demographics, political beliefs, trigger words, and purchase history.

That has nothing to do with the author's point.

~~~
politician
> That a query for an innocent term...

I take strong issue with the "obvious" conclusion that Google knows which
terms are innocent. Disclaimer: In this specific case, I agree with you that
"black girls" has an innocent connotation and am not suggesting otherwise.
Please do not misinterpret my thoughts below.

Shall we require Google to be ever vigilant about the meaning of words in use
at various times by the various communities of the world, and to entrust them
with the responsibility for determining the innocence or guiltiness of words
or phrases for all mankind?

If we shall require such effort of Google, ought we not elevate Google to the
role of judge over other matters of humankind given their good stewardship
over matters of innocence and guilt?

Taken to the limit the argument advocates the establishment of an echo chamber
to wrap tightly around each one's subjective interpretation of their reality.
Isn't this the exact opposite of what we should be trying to achieve?

Have we not had enough of this Brave New World?

~~~
jacquesm
> Shall we require Google to be ever vigilant about the meaning of words in
> use at various times by the various communities of the world, and to entrust
> them with the responsibility for determining the innocence or guiltiness of
> words or phrases for all mankind?

Google aims to 'organize the world's information'; how it does that concerns
all of us, so yes, this makes them responsible for determining whether a
certain set of words has an innocent or negative interpretation.

It's not the world I want to live in but I live in it nonetheless. That Google
should not have this power to begin with is obvious but here we are.

> If we shall require such effort of Google, ought we not elevate Google to
> the role of judge over other matters of humankind given their good
> stewardship over matters of innocence and guilt?

Definitely not, why make things even worse? They already have _too much_
power; we will not make things better by compounding the problem.

> Taken to the limit the argument advocates the establishment of an echo
> chamber to wrap tightly around each one's subjective interpretation of their
> reality. Isn't this the exact opposite of what we should be trying to
> achieve?

It is, but again, the filter bubble is real and the more customized Google
search results are for an individual the worse this gets.

> Have we not had enough of this Brave New World?

I have, and so does the author, but clearly you are either missing the point
or you have not had enough yet.

~~~
politician
You're right! I reread the article, and the author aligns ideologically with
most of the commenters in this thread, including myself.

------
jimrandomh
I tried replicating the searches described in this article, in an incognito
window. Most of the searches described in the article are quite dated; they're
also spread over a significant amount of time. None of the searches I tried
came out the way the article described. There was no porn in the results for
"black girls"; the third result for "professor style" was a black man, and
there was a black woman further down on the first page. Search results for
"three white teenagers", "three black teenagers", "professional hairstyles for
work" and "unprofessional hairstyles for work" were dominated by links to the
very discussion this article is reporting on, which sort of suggests they
weren't queries anyone was making to begin with.

~~~
SkyMarshal
How do you know the algo/search results weren't tweaked after these critiques
went public but before your tests?

~~~
ggggtez
Obviously they were. The point the parent post is making is that some of these
results are old (like, 10 years old). Of course the algo would have changed,
and the article even says that it did.

But if no one is searching it, then of course it's just going to be semi-
random results.

[https://trends.google.com/trends/explore?geo=US&q=%22three%2...](https://trends.google.com/trends/explore?geo=US&q=%22three%20black%20teenagers%22)

The top related query? "Three white teenagers". It's obvious almost no one
searches this term unless they've read the article.

------
fmitchell0
What I took from the article were a few points:

* Taking a neutral position when building something (it's just a tool, it's making decisions based on an aggregate) IS in fact making a decision. State can be -1, 0, or 1, and neutrality yields a certain result.

* The local optima of these algorithms are likely a product of local optima to the people and teams who built them. The neutral position seemed like the correct answer.

* These articles serve to reinforce the idea that the next frontier is context within algorithmic results. This global optimum can only be achieved through diversity of perspective. Perspectives are a product of one's experience, so the value of diversity can be directly connected to the goal of finding an optimal solution to a hard problem.

FWIW, I am a black heterosexual engineer, and my experiences inform my
perspective on these types of problems and the blindspots affecting people not
like me, especially women.

For those wanting to go deeper on the technical, scientific, and mathematical
basis for what I'm referring to, google 'diversity local optima'.

~~~
dahdum
My takeaway was that algorithmic solutions alone will never be enough, since
at their core they operate on user intent.

Males searching for “FOOBAR girls” have, on average, different intent than
women. If they represent the majority of searches, algorithms will naturally
weigh the results they click more highly.
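A toy sketch of that feedback loop (the query log and result names are entirely invented): ranking purely by observed clicks means whichever intent dominates the log dominates the ranking for every user.

```python
from collections import Counter

# Hypothetical click log: (query, result the user clicked).
click_log = [
    ("FOOBAR girls", "majority-intent.example"),
    ("FOOBAR girls", "majority-intent.example"),
    ("FOOBAR girls", "majority-intent.example"),
    ("FOOBAR girls", "minority-intent.example"),
]

def rank_results(query, log):
    """Order results for `query` by raw click counts -- with no notion of
    which users are doing the clicking."""
    counts = Counter(result for q, result in log if q == query)
    return [result for result, _ in counts.most_common()]

print(rank_results("FOOBAR girls", click_log))
# The majority intent comes first for everyone, including the users
# who wanted the minority result.
```

Any per-user reweighting to fix this is exactly the hyper-personalization (and its bubble problem) described above.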

Hyper personalization is seen as an answer, but comes with the downside of a
reinforcing bubble and drift to extremes. Human intervention and editing are
only a partial solution, bringing up further questions about censorship and
which ideology is correct.

No easy answers to all of this.

------
MichaelGG
What did she expect for "professor style"? If I were hand-delivering results
and knew it was from a black, asian, or white person in the US, why would I
return different results? Someone entering a general query like that probably
wants what the common interpretation is. Which is not going to be a black
professor. Searching "professor style for black women" returns the results she
presumably thinks should be the default for her.

As much as I hate Google, they seem to be delivering exactly what they should.
Their job isn't to try to change people's minds, it is to find what they are
looking for. What's next? Search for "geek guy" and it's all white. Should we
be upset? Someone searching for "geek guy" is probably looking for exactly
what Google returns in this case and all the others she highlights.

As far as "black girls"... What kind of results are people expecting? Who is
actually searching for black girls? Probably people looking for porn... Why
should Google return suboptimal results because you got offended?

Same for "three black teens". It's not Google's fault that so much of the
news containing that phrase is reporting on crime. Did she actually check
crime statistics? Maybe "three black teens" get mugshots and commit crimes
together at a vastly higher rate than "three white teens". Outside of porn,
who is searching for this phrase? If you want "clean" and friendly results,
try searching "stock photo three black teens".

If anything, this article shows that Google's personalization should detect
people that want to search things (that they wouldn't really search for in
earnest) in order to be offended, then display poor quality results to avoid
this "issue".

In truth though, I would imagine that if she and other black women often
searched "professor style" then didn't click any results and immediately
searched for "professor style for black women" then Google would pick up on
their results being suboptimal and change. My guess is that no one actually
makes these searches outside of making a political point, so personalization
cannot fix it.

------
danso
It's always bothered me when people, (justifiably) frustrated with the bubble
effect that personalization can effect upon search results, want to see Google
return to "objective" results. But PageRank/BackRub was never an objective
metric, it was always affected by what webmasters chose to link to and the
semantic markup they wrote. It seemed to be a good metric for a majority of
these things, but assuming few webmasters in the early years were black women
or black, then the "objective" metric of PageRank/BackRub is going to be
dismal.
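To make that concrete, here's a toy power-iteration sketch of the original PageRank idea (the four-page link graph is entirely invented): the ranking falls out of nothing but which pages the hypothetical webmasters chose to link to.

```python
# Toy PageRank via power iteration (link graph invented for illustration).
links = {
    "a": ["b", "c"],  # page "a" links to "b" and "c"
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],       # "d" links out, but nobody links to "d"
}

n = len(links)
d = 0.85  # the usual damping factor
rank = {page: 1.0 / n for page in links}

for _ in range(100):
    # Each page keeps a (1-d)/n "teleport" share, plus d times the rank
    # flowing in from pages that link to it.
    new = {page: (1 - d) / n for page in links}
    for src, outs in links.items():
        for dst in outs:
            new[dst] += d * rank[src] / len(outs)
    rank = new

# "c", the page everyone links to, comes out on top; "d", which nothing
# links to, comes out last.
print(max(rank, key=rank.get), min(rank, key=rank.get))
```

Swap in a graph where no early webmaster links to a page and that page sinks to the bottom, regardless of its content, which is the "objectivity" problem in miniature.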

So yes, I agree that more pressure needs to be put on the algorithm-makers, so
that there isn't complacency about accepting the results of "pure"
computation. But I feel too much weight is given toward the sentiment of
"_If Google isn’t responsible for its algorithm, then who is?_", as if the
problems could be wholly or even substantially fixed through "modifications to
its algorithm". The other side of the equation is the _data_ -- and diversity
is as important to have in media as it is to tech, even if tech feels like the
stronger, more immediate lever in changing things.

Now that it's been a long time (in tech time) since 2010/2012, it'd be
interesting to hear exactly what modifications/improvements Google made to its
algorithm to return such different results for "black girls". Was it a de-
emphasizing of negative/outdated sources? Or an arbitrary boost for such
entities as "Black Girls Code"? Or was BGC not given an arbitrary boost, but
found to have been unjustly ignored/under-weighted when it came to determining
SERP? I understand not wanting to reveal details of search engine tweaks the
day/month/year after they've been implemented. But half a decade later,
hopefully learning more details of these "tweaks" won't leave Google
vulnerable to black-hat SEO.

------
ucaetano
Try searching Google Images for "brazilian".

Her point is quite valid, and also well recognized in at least some part of
the tech industry: when you "blindly" optimize for a certain audience, you
risk picking up the biases of that audience.

So, if you want to appeal to a broader audience than your current one, you
might need to manually tune your system to remove some of the biases from your
original audience.

This is particularly important, if not from a human perspective then at least
from a business perspective, when your current audience is biased about the
larger audience you are trying to reach.

~~~
sosuke
Wow, those results are so different from Bing Images for the same search.

~~~
Consultant32452
That's really amazing. There's a similar situation when you search "American
scientists." Interestingly the regular search results are very different, but
the image search is similar.

------
SCHiM
I think the problem is that we're turning the 'learner' around in these
situations. It used to be that computer interfaces were static; any
shortcuts/quirks they had were learned by the human, who then used them to
their advantage.

These days we have systems that are not as smart as us trying to figure us
out, with predictable results. Tricks and quirks are ever shifting. A query
that produced certain results yesterday might not behave the same two weeks
from now. Any skill gained in navigating the system needs to be constantly
retrained and updated.

I believe something similar happened to Hawking, where he had a keyboard that
was not smart, but was predictable so that he was the one learning to use the
system. He disliked a new version of the software that would not behave in a
predictable way because it 'learned'. I can't find the link anymore sadly.

------
wrs
Isn't this basically the same phenomenon (and result) as instant retweeting
and the like button?

If you simply reflect and amplify the behavior of the general population, you
can get some pretty ugly results — which is because the general population has
some actual ugly tendencies that are being accurately depicted. So the
conversation needs to be about how we should actively _bias_ the algorithm to
damp out the ugliness, which is already being done by Twitter, FB, Google,
etc.

I think we've done the experiment and proven pretty conclusively that a
totally unbiased algorithm or information-sharing network is not a good thing
for society, unless you want society to be like 4chan. Of course, that
immediately opens the can of worms — who gets to decide how to bias the
algorithm?

------
TangoTrotFox
Search for 'white man and white woman.' The reason I mention this is that the
article implies this is some sort of result of social bias and stereotyping.
As the result of that search shows (it's entirely interracial couples), it
most certainly is not. No, it's a result of the fact that Google's search is
not particularly smart. It's arguably the best there is, but it still comes
down to correlating words to pages and that correlation is far from
intelligent.

And as an aside, I think the term 'bias' is misused for algorithms when they
are operating as intended. You'd call a poll biased if it implied it was
polling group 'x' but did not actually do so due to some inadvertent skew.
You wouldn't call it biased if it polled the group it stated it was going to
poll and then gave the results - even if those results were not what you
personally would want to see. If people are mostly searching for interracial
couples when they search for 'white man and white woman' or they are mostly
searching for porn when they search for 'black girls' then a biased algorithm
would be one that actually returned white couples or non-pornographic results.

------
andkon
I really appreciate the point: as much as the content of what's on google has
nothing to do with google, the prioritization of search results is on them.
Whether it's hard or not to avoid perpetuating stereotypes doesn't seem to
obviate their responsibility to not perpetuate stereotypes, any more than the
fact that Facebook's having so much content flow through their feeds could
remove Facebook's responsibility to not help the spread of fake news.

Google's search results are really the oldest analog we have for the "machine
learning learns racism" quandary, too. This question is only going to get
bigger, and its effects much more pernicious.

------
krona
Several of the searches given as examples of racial bias are already in a
feedback loop; if everyone on the Internet starts writing articles saying
'OMG, search for _unprofessional hairstyles for work_ and look, racism!' then
it shouldn't come as a surprise that the first results for otherwise innocuous
phrases become skewed, and removing this so-called bias actually requires
direct intervention.

Although if one actually compares the results for _[un]professional hairstyles
for work_ there doesn't seem to be any negative bias; black people are well-
represented in both result sets.

------
squarefoot
I'm usually very critical of Google for other reasons (privacy, Android, etc.)
but I don't see them at fault in this case, at least not as being deliberately
racist. Their algorithms' "evolution" is based on what people search for and
find useful: if, say, someone clicks on a link and stops searching for a
while, it could mean the user found what he/she was looking for. Now if
thousands of people search for "black girls", the search returns, say, some
porn links among others, and most users click on them, could we expect the
algorithms to ignore porn, or rather to slowly adapt to what most users
considered the right result?

The other points also look valid to me, but again, I don't see why Google
should be labeled racist when their engine adapts to what news sources
publish. Somewhere in the code there could be an association like 3 teens ->
gang, and a piece of code skimming news sources that reacts to all articles
containing the word "gang"; now just mix the two and what do you get? Just try
searching for "gang" and look at the predominant colors. Search engines
aren't easy to write. If I were an Eiffel language expert, I would write a
book titled "A tour on the Eiffel language" just to prove that. Try searching
for that phrase without getting submerged by images of the famous tower.
Incidentally, a fun search for "a tour on the Eiffel schnitzel kraut" yielded
about 90% food-related results and only about 5% tower images, although it
still contained 100% of the terms for both; but having the word Eiffel not
immediately following "tour" could mislead the search engine, so I changed it
to "the tour Eiffel on a schnitzel kraut" and the results slightly changed,
with more towers but also more unrelated stuff like Stonehenge.

To me a search engine should return results based on harsh reality if
necessary, not church-choir dreams; policing search engine results in pursuit
of political correctness would be very dangerous. We won't see that day, nor
will our great-great-great-great-grandkids, but one day racism will eventually
start vanishing from society by itself and search engines will follow, and/or
we'll be socially mature enough to ignore anything that could resemble racism
to today's biased mind, giving it even less exposure.

------
curtis
Back in the 90s, search engines were notorious for serving back lots of porn
links interspersed with whatever thing you were looking for. I was unusual
(apparently) in that my searches tended to be really specific and I rarely got
a lot of porn results. Except one time. I was looking for information on gun
violence in Europe and I was getting back not just a few porn results, but
lots and lots of them. I was using really simple queries, so I was able to
quickly isolate the offending term.

It was: "Sweden".

(Most likely this was on Alta Vista, not Google, but I don't remember for
sure.)

~~~
gaius
This is because marketing weasels had just discovered the hottest new tech of
the day: meta tags. And immediately set about polluting them. At one point in
the mid 90s about half the web that existed at the time was tagged “Anna
Kournikova naked”.

(Probably many on HN are too young to remember her... imagine Taylor Swift
x10, that was her popularity back then)

------
fipple
The headline should be "Google Search has a striking history of bias against
black girls," to clarify between Google the search engine and Google the
company that makes the search engine.

------
aje403
"What we need now, more than ever, is public policy that advocates protections
from the effects of unregulated and unethical artificial intelligence."

Artificial Intelligence has become an alias for "shit I have no knowledge of
but would like to pretend to have expertise on that is, at least tangentially,
related to a computer"

------
jacquesm
So Google now gives her different results for the same queries. Makes me
wonder if this is when she is a logged-in user, and what differences exist
between one logged-in user and another, and between logged-in and
non-logged-in users.

When I try it in both logged in and non-logged in mode I get reasonably
similar results, the differences are mostly in the ordering. When I do the
same on Bing I get about 20% or so pornographic images for both 'black girls'
and 'white girls'.

~~~
politician
I only get wholesome results for "black girls", so maybe Google already puts
us in our own echo chambers regardless of the SafeSearch settings.

~~~
jacquesm
No, Google has adjusted the weights of that particular output category, see
the article linked.

------
imh
There are some excellent ways of framing this problem already covered in the
comments. I'd like to add another. As time goes on, people seem to be treating
google searches less as database queries and more as a question/answer system.
Google is moving this direction too (e.g. "how tall is the eiffel tower" gives
an answer above the web pages).

When google was just an excellent database index and relevance ranker, then it
was ok for its results to simply reflect the underlying data. If the internet
is racist, then looking up things on the internet will find racist things.

As time goes on and google purports to give answers and facts and gets more
information-ey, they're kinda taking an editorial stance that the results
aren't simply relevant webpages, but correct answers in some loose sense. When
google tells me the Eiffel tower is 984' tall, and 1063' to the tip, they are
endorsing that as The Right Answer. Endorsements come with responsibility.

As Google pushes to give correct answers, and people increasingly expect it,
it's an endorsement of search results that comes with responsibility to not
just reflect the huge corpus that is the internet.

------
visarga
Incidentally, I think this is how you debias an algorithm: you marginalise on
all the discrimination criteria and check to see whether the output is
balanced and respectful on each subdomain. Google should have a set of
automated tests for that, and it's not that hard -- a team of AI and sociology
experts could do it. The same type of problem is the capital sin of FB: they
amplify bias intentionally, selling us to the highest bidder.
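Such an automated check can be sketched in a few lines. Everything below is
hypothetical (the query template, the NSFW labels, the 0.3 threshold); a real
test would pull live results from the search engine instead of canned data.

```python
# Hypothetical sketch of an automated bias test: run the same query template
# across demographic groups and compare how often results are flagged NSFW.

def nsfw_rate(results):
    """Fraction of results flagged NSFW."""
    return sum(r["nsfw"] for r in results) / len(results)

def max_disparity(results_by_group):
    """Largest pairwise gap in NSFW rate across groups."""
    rates = [nsfw_rate(res) for res in results_by_group.values()]
    return max(rates) - min(rates)

# Invented search results for the template "<group> girls".
results_by_group = {
    "black": [{"nsfw": True}, {"nsfw": True}, {"nsfw": False}, {"nsfw": False}],
    "white": [{"nsfw": False}, {"nsfw": False}, {"nsfw": False}, {"nsfw": True}],
    "asian": [{"nsfw": True}, {"nsfw": False}, {"nsfw": False}, {"nsfw": False}],
}

disparity = max_disparity(results_by_group)
print(f"max NSFW-rate disparity: {disparity:.2f}")  # prints 0.25 (0.50 - 0.25)
assert disparity <= 0.3, "query template fails the balance check"
```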

------
themgt
A Google search for "American inventors":
[https://i.imgur.com/qw5kfQp.png](https://i.imgur.com/qw5kfQp.png)

A Google search for "American women scientists":
[https://i.imgur.com/o6mxG56.png](https://i.imgur.com/o6mxG56.png)

Can the media stop doing this thing where they p-hack reality for proof of
racist/sexist conspiracies?

~~~
jacquesm
You are missing the point: there is no reason why an innocent query such as
'black girls' should bring up a bunch of porn. And Google seems to agree with
this because they have since tweaked things so that that no longer happens.

~~~
ithilglin909
No, the point is that the author selectively presents information that
bolsters the point she wants to make.

I'm pretty certain that if she searched for "<blank> girls" -- white girls,
asian girls, etc. -- with safe search turned off, a good number of the results
would be NSFW, because the internet is 75% porn (and 25% cat videos).

~~~
jacquesm
See the paragraphs titled 'beyond black girls' in TFA.

~~~
ithilglin909
Yes, I read the article. If anything, those paragraphs support what I just
wrote.

~~~
jacquesm
You wrote "No, the point is that the author selectively presents information
that bolsters the point she wants to make." and then claim the author supports
_your_ point?

At best you are supporting hers.

She made the case for the query 'black girls' and, to pre-empt comments such
as yours, bolstered her argument by extending it to the general case. So in no
way did she 'selectively present information to bolster her point'; she was as
even-handed as she could have been (but apparently not even-handed enough for
some).

~~~
ithilglin909
No, I'm saying that in the paragraph in question, she's half-admitting that
she is selectively presenting information. That is all.

------
chmln
Google is absolutely at fault here. They are much more involved in tweaking
search results than they admit.

They are especially slanted on an ideological basis. Try searching "American
inventors" -- nearly all the people listed are African Americans.

I doubt all of this is inadvertent, and it's definitely dangerous, given the
power of Google. Today they are undoubtedly influenced by "left-wing"
ideology, but what about tomorrow?

------
crispyporkbites
This title is a bit misleading. The article opens with:

> My first encounter with racism in search was in 2009 when I was talking to a
> friend who casually mentioned one day, “You should see what happens when you
> Google ‘black girls.’” I did and was stunned.

but then later on:

> Although I focus mainly on the example of black girls to talk about search
> bias and stereotyping, black girls are not the only girls and women
> marginalized in search. The results retrieved two years into this study, in
> 2013, representing Asian girls, Asian Indian girls, Latina girls, white
> girls, and so forth reveal the ways in which girls’ identities are
> commercialized, sexualized or made curiosities within the gaze of the search
> engine. Women and girls do not fare well in Google Search — that is evident.

So it's not racism, it's sexism. Or is it? What would have happened if you
searched for black boys, white boys etc. in 2009? I bet it wouldn't have been
PG friendly...

------
Asdfbla
It's a difficult discussion because if you look at the literature regarding
algorithmic fairness, there really is no measure of equality that satisfies
all notions of fairness. If you force representation of seemingly
underrepresented patterns in the data then you implicitly engage in social
engineering.

This might be legitimate and reasonable, but you probably should be
transparent about the fact that you "unbiased" your data by externally
imposing a certain view of what the "fair" data should look like.

Ultimately it's a discussion for social scientists or lawyers concerned with
discrimination; it's not really in the realm of being fixable by engineering
or computer science, imho.

------
pishpash
Debiasing signals is a common procedure that signal processors are interested
in. Not saying it's an easy or well defined problem, or that the solution will
be legal, but dispassionately there is an engineering problem there to be
solved.
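For what it's worth, the most basic form of signal debiasing, removing a
constant (DC) offset by subtracting the sample mean, really is a one-liner;
this is only the signal-processing sense of the word, and the analogy to
debiasing search data is loose.

```python
# Minimal signal-debiasing example: estimate the constant offset (DC bias)
# and subtract it out, leaving a zero-centered signal.

def remove_bias(signal):
    """Subtract the sample mean so the signal is zero-centered."""
    mean = sum(signal) / len(signal)
    return [x - mean for x in signal]

samples = [2.0, 3.0, 2.0, 3.0]      # oscillates around a bias of 2.5
centered = remove_bias(samples)
print(centered)                      # [-0.5, 0.5, -0.5, 0.5]
assert abs(sum(centered)) < 1e-9     # mean is now (numerically) zero
```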

------
jpm_sd
Um, it's not "Google", it's "the content of the internet" that's biased -
right?

~~~
parrellel
If Google was still concerning itself with representing the content of the
internet, this would be the case, but increasingly Google is tailoring its
results for whatever other varied ends.

If it is curating the data, then how it curates matters. In this case, they've
curated most of the porn out of the first page of results, from the looks of
it.

Not sure how that makes me feel.

------
whiddershins
Without delving too deeply into it, it seems Google is reflecting existing
biases society-wide.

This is problematic in the same way as Facebook's echo chamber effect, in that
it can cause a feedback loop that reinforces and heightens bias and division.

Beyond that though, I think it gets gnarly how much we want Google to _fix_
the world by tweaking their algorithms. We obviously want something
thoughtful. At the same time I don’t think we should lay all the world’s
problems at google’s feet.

------
fipple
I don't think there's a clear answer as to what Google SHOULD show for
"unprofessional hairstyles." The article shows a screenshot of it showing a
lot of normal-looking black women with natural hair, which of course shouldn't
be considered unprofessional. But it's definitely true that society has a
racist approach to black women's hair, and doesn't consider it "beautiful"
unless it's straightened. Hence nearly all famous black women straighten their
hair.

Should Google show black women's natural hair in the results for
"unprofessional hairstyles" to accurately reflect society's unfair attitudes
towards black women's hair? Or should they show actual unprofessional
hairstyles like giant mohawks etc.? Or should they show a message saying "No
hairstyle is unprofessional. Be who you want to be."?

------
beat
I want an algorithm to filter out any responses to this that talk about
"political correctness".

------
amrrs
There's an interesting concept -- _machine bias_ -- discussed in one episode
of You Are Not So Smart:
[https://youarenotsosmart.com/2017/11/20/yanss-115-how-we-transferred-our-biases-into-our-machines-and-what-we-can-do-about-it/](https://youarenotsosmart.com/2017/11/20/yanss-115-how-we-transferred-our-biases-into-our-machines-and-what-we-can-do-about-it/)

Developers who train these models or algorithms should be very well aware that
the training data is nothing but a download of a sexist, capitalist, racist
society, and if that bias is carried forward, the future isn't going to be
clean!

------
evo_9
DuckDuckGo:

[https://duckduckgo.com/?q=black+girls&t=h_&ia=web](https://duckduckgo.com/?q=black+girls&t=h_&ia=web)

------
bcheung
Seems like the algorithms just surface already existing patterns in society.
They have no inherent bias of their own; they are merely reporting the reality
they see.

Is introducing an anti-bias at the search engine level the appropriate
solution? It seems a bit too much like Western medicine alleviating the
symptoms without addressing the root cause. An anti-bias is just masking it.

Also, if the author is a feminist, why is she writing for a magazine called
"Bitch"?

------
troupe
If you do a search for "three black teenagers" you now see all the top results
are screenshots of the mug shots that people saw when they previously searched
for the term. So basically the discussion around it makes it more prominent
not less. Not saying this is right or wrong, just pointing out the results are
basically being reinforced.

Now when you do a search for "three white teenagers" you mostly get mug shots
of black teens.

------
nkkollaw
This is ridiculous.

This just means that people on the internet who write about black teenagers
write about the ones who get arrested. People who create pages about "black
girls" create porn content, and most people who search for "black girls" most
likely are looking for porn and click on links that talk about porn.

It's amazing how people can see racism everywhere.

------
fipple
The world is an unfair place with a striking history of bias against black
girls, and Google is built with trillions of data points from this world.
Society thinks that black women's hair is ugly and white women's hair is
beautiful, and Google, with the power of their machine learning, reflects this
unpleasant societal truth back to the user.

------
tyingq
Searching for "Asian girls" or "Mexican girls", etc, produces similar results.

I can see the rationale for altering the algorithm for words like "girls".

It's also interesting that, for example, the results for "swedish girls" are
much tamer, even though that's a long-standing phrase with very sexual
connotations.

------
pmarreck
With all due compassion, don't "black girls" generally have a hard enough time
with broader acceptance, and might this be what is reflected in Google search
results?

In other words, this is not Google's fault and is mostly just an ugly
reflection of our collective problematic values.

------
monktastic1
I see a lot of comments about whether Google results should be reflective of
society or unbiased, but little discussion of who gets to define what is
unbiased. Since that itself is a judgement call, I don't see how they won't be
accused of introducing their own bias.

------
viburnum
"We're only reproducing society's most harmful biases" is not much of a
defense.

------
venuur
Apologies for being slightly off topic.

While trying to read to article, my browser (Safari on iPhone) was redirected
to a scammy looking amazon gift card website. Did anyone else have this
problem? I am trying to figure out if it is a problem with Time.com or my
phone.

~~~
disgruntledphd2
Likely your phone, given that no-one else has complained.

Sometimes I miss the HN of 2011-12, where literally 80% of the comments were
about the design of the site, and how it didn't work on
Safari/IE/Firefox/Chrome/Lynx.

Not often, though.

------
lr4444lr
I'm often reminded of the Oscar Wilde quote:

"There is no such thing as a moral or an immoral book. Books are well written,
or badly written. That is all."

Replace "book" with "algorithm", and you have my sentiments on the issue.

------
danso
I thought it'd be worthwhile to see how DuckDuckGo handles things since it's
ostensibly the search engine _without_ personalization. And it has a different
scope and expectations from Google. One thing that comes to mind is how it
dealt with the "did the holocaust happen"-type problems by hard coding the
instant answers [a], something which Google doesn't admit to doing outright,
and wouldn't be scalable anyway.

DDG's results for "black girls" are pretty different from Google's
present-day results, though definitely a lot cleaner compared to what the
author saw in 2009:

[https://duckduckgo.com/?q=black+girls&t=hb&ia=web](https://duckduckgo.com/?q=black+girls&t=hb&ia=web)

(sidenote: it's not clear if the author disabled SafeSearch in her 2009 search
or not. SafeSearch was definitely part of Google by then, and I can't imagine
"sugaryblackpussy.com" getting past that filter. If you turn DDG's Safe Search
to "Moderate" -- "Off" returns the same results as "Strict" -- you will get
a lot of very NSFW results. I guess this sidenote raises a new set of issues
about the author's methodology, but I will ignore it for the sake of brevity
here.)

The first result is an article titled "Black Girls Only" in Ebony magazine
[0]. Which isn't a bad article (or publication). Maybe people would object
that the first result is actually about sexualizing black women (albeit
positively). The 2nd result is a lot less promising: "Hot Black Girls (45
pics)" at acidcow.com, which has a higher Alexa ranking [1] than Ebony.com
(22K vs 63K), but basically looks to be a clunky imageboard.
blackgirlscode.com is #6, followed by blackgirlsrun.com (a running club). The
rest of the top results are black girl image sites (photobucket, a Facebook
group for Big Beautiful Black Girls). The most notable difference between DDG
and Google Results, besides what's #1, is that Google results have a lot more
news articles in which "black girls" are in the headline (via NPR, nytimes,
and theroot). DDG is a lot more sporadic in comparison.

DDG for "white girls" is not terribly better in terms of being female-
friendly:

[https://duckduckgo.com/?q=white+girls](https://duckduckgo.com/?q=white+girls)

The first result is to the Amazon listing of a book titled "White Girls", by a
well regarded New Yorker critic [2]. But the #2 result goes to
urbandictionary.com, and #3 hilariously goes to the Wayans Brothers' classic,
"White Chicks". There's a bunch of results relating to "White Girl", the
mediocre 2016 film (Wikipedia, rogerebert.com, rottentomatoes.com, etc). And
then a bunch of results about white females with men of other races.

[a] [https://github.com/duckduckgo/zeroclickinfo-
goodies/blob/b7a...](https://github.com/duckduckgo/zeroclickinfo-
goodies/blob/b7a73754210185594d945716aedde5cc8486fcba/share/goodie/historical_events/events.yml)

[0] [http://www.ebony.com/news-views/black-girls-
only-503](http://www.ebony.com/news-views/black-girls-only-503)

[1]
[https://www.alexa.com/siteinfo/acidcow.com](https://www.alexa.com/siteinfo/acidcow.com)

[2] [https://www.amazon.com/White-Girls-Hilton-
Als/dp/194045025X](https://www.amazon.com/White-Girls-Hilton-
Als/dp/194045025X)

------
oneplane
It's interesting how the article tries to 'blame' an algorithm while it's
basically just working fine and simply reflecting the output of
culture/country/people. It's not that the software decided that one thing
should be the results, and others should not, it just takes what we feed it,
and gives it back to us.

~~~
Kequc
I find it fascinating that anyone would think an algorithm is "racist". It
more or less begs for human interference with the algorithm to make it less
racist, after which anything anyone does can itself be viewed as racist.

~~~
danso
"Racism" involves prejudice and discrimination predicated on the belief that
one race is superior/inferior to another. If a face-detection algorithm fails
to easily recognize black or Asian faces, such that people of that
ethnicity/race have to go through a more manual/friction-laden process (think
manual pat-downs and searches from TSA) to be verified as "human" or
"citizen", then how is that algorithm _not_ responsible for discriminatory
treatment?

I do agree, though, that it is wrong to infer that the algorithms _themselves_
are evil or malicious -- it's possible to be racist without having negative
intentions.

But what does it say about a society if it continues to optimize algorithms
(and related infrastructure) for one group over another?

~~~
oneplane
Of course those systems 'discriminate'; that is what they are designed for.
The problems aren't with the systems themselves, but with how they were set
up, implemented or tested.

Say you make a sensor that is supposed to tell the difference between blue,
green and purple, but you only test with one shade of blue and maybe two
shades of green, you are going to have trouble actually matching your design
goals.

In the case of the face detection system: they didn't specify and/or test it
well enough, which can be due to a number of factors, but most likely lies
with the employees of the company that did the development. If they only have
the classic 'pasty white guys' to work with, then it's going to be crap at
actually doing face detection for all humans. On one hand you could set up a
proper test protocol; on the other hand they shouldn't have used broad or
vague terms when developing/presenting the technology. If you don't have a
broad selection of faces to test with, you shouldn't claim you have 'face
detection', since you merely have 'detection of faces of the people that work
on the project plus anyone who looks like them'.
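The sensor analogy can be made concrete with a toy classifier; the hue ranges
and test shades below are entirely invented for illustration.

```python
# Toy illustration of narrow test coverage: a "classifier" tuned on one shade
# of blue and two shades of green looks perfect on its own test set, yet
# misfires on shades it never saw.

def classify(hue):
    """Hypothetical hue classifier tuned only on the shades below."""
    if 230 <= hue <= 240:
        return "blue"
    if 110 <= hue <= 130:
        return "green"
    return "unknown"

tuned_on = {235: "blue", 115: "green", 125: "green"}
assert all(classify(h) == label for h, label in tuned_on.items())  # looks perfect

# But a darker blue and a purple outside the tuned ranges both fail:
print(classify(215))   # prints "unknown" -- a real blue the sensor never saw
print(classify(280))   # prints "unknown" -- purple was never handled at all
```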

This would be a completely different story if someone writing the face
matching code specifically programmed code or wrote configuration data that
targets skin tone or geometry of specific groups of people.

Some people would like to extend this type of technological issue into the
area of HRM and race/gender-bias in society in general, but that is not a
technology-only discussion and hardly something the people involved are
qualified to argue about.

Also: >But what does it say about a society if it continues to optimize
algorithms (and related infrastructure) for one group over another?

It says that society is imperfect, and that certain levels of xenophobia, bias
and true racism exist. Doesn't say much about technology though.

~~~
danso
> _If you don't have a broad selection of faces to test with, you shouldn't
> claim you have 'face detection', since you merely have 'detection of faces
> of the people that work on the project plus anyone who looks like them'._

Yeah, but that's what happened here with HP in 2009. I'm not a huge fan of
their products these days but I don't think they would intentionally be
deceptive here, i.e. I think there's a lot of room to blame incompetence
before malice. If HP is a company with very few black employees, this kind of
consideration may be completely off their radar. It's super unfortunate, but I
don't see the company as evil or maliciously racist, per se (I think we can
skip retreading the hiring for diversity debates for now).

> _This would be a completely different story if someone writing the face
> matching code specifically programmed code or wrote configuration data that
> targets skin tone or geometry of specific groups of people._

Why does it matter? What's the difference between an algorithm that fails to
perform because of programmer incompetence or programmer malice? What's the
difference to the end-user if the programmer was plainly ignorant of good
testing coverage, vs. a programmer who thought _"Fuck it, minorities are a
minor part of our user base. Not worth the extra engineering effort!"_?

Technology is an unavoidable part of the problem. Because it is the technology
that allows us the power and freedom to create and apply scalable algorithms
to machinery and computers. This automation allows for efficient and reliable
decision-making, and we as a society decide where that automation is
appropriate and worthwhile, i.e. where human agency is no longer needed.

But technology and its fundamentals are still a key factor. Creating a multi-
racial face classifier is fundamentally more work and difficulty than one
trained for just one race. The math and physics are unavoidable. And every
engineered system and product has to make tradeoffs between production cost
and feature set.

In the case of the light-skin-optimized HP webcam, I think it's important,
and fine, to call it "racist" -- a black HP customer will have an inferior
experience fundamentally because he is racially black. But this isn't just a
way to quickly assign blame. Recognizing that tech is fundamentally limited is
the first step in understanding that systemic racism (e.g., all the decisions
that led to the "racist" camera) could be a contributing factor to the
camera's substandard performance.

Much harder to get to that thinking if we have a mentality of, "how could the
computer be wrong/flawed?"

------
johnnyOnTheSpot
They have a lot fewer crayons in their box than they like to project.

------
interfixus
_Why are white men stockpiling guns?

Why are so many white men so angry?

White men must be stopped: The very future of mankind depends [...] The future
of life on the planet depends on bringing the 500-year rampage of the white
man to a halt. For five centuries his ever more destructive weaponry ..._

My first three DDG-results on searching ' _white men_ '.

I shall now proceed to get over it. Suggest others do likewise.

------
ghufran_syed
I'm not sure how the fact that many web users, many of whom are white, find
black girls attractive, is somehow construed to be "bias against black girls"?
Unless your job is to see bias everywhere...?

Oh.

------
daenz
This article feels like bait for smart people to ruin their careers with
honest responses.

~~~
erikig
I certainly hope not, there are a lot of great comments above that read as
both honest and smart.

------
komali2
> “p-ssy,” as a noun

You are allowed to say fuck on the internet.

Also, does it really count if you leave out _one_ letter? I never understood
this. You're still all but saying the word. Nobody is being protected here,
lol. A kid can cycle through the five vowels in about half a second to figure
out what the word is, and one of them is also kinda a swear: pissy.

EDIT: "In 2012, I wrote an article for Bitch Magazine" -- now I'm very
confused. It's ok if it's a proper noun, I guess?

~~~
grzm
Different publications have different style books. This is Time, which, given
its history as a mainstream magazine, likely has a more conservative
editorial style than more recent, online-only publications.

~~~
komali2
I believe you, I'm just curious why the style guide ever said "swear words are
fine as long as you blank out a single letter."

Meh, likely the answer is unknown; mostly I'm just commenting.

------
SolaceQuantum
Do we know if Google or tech in general hires black women less than
proportional to the US population?

~~~
ceejayoz
We do, yes.

[https://www.usatoday.com/story/tech/news/2016/06/30/google-d...](https://www.usatoday.com/story/tech/news/2016/06/30/google-
diversity-numbers-2016/86562004/)

> Women made up 31% of Google employees in 2015, up one percentage point since
> 2014, according to statistics released by the Internet giant on Thursday.
> One in five technical hires were women in 2015, raising the number of women
> in technical roles to 19% from 18% in 2014 and 17% in 2013. In 2015, women
> held nearly a quarter of leadership posts at Google, up from 22% in 2014 and
> 21% in 2013.

> Google says it's also hiring more black and Hispanic workers: 4% of hires in
> 2015 were black and 5% were Hispanic. Hispanic employees in technical roles
> increased to 3% from 2%. But the increased hiring did not budge the overall
> percentage of underrepresented minorities in the Google workforce as total
> hiring rose, with Hispanics making up 3% of the work force and African
> Americans 2%.

Given that women are half the population, and African Americans and Hispanics
each make up about 12%, we can state categorically that Google's hiring
doesn't match the general US population.

~~~
UncleEntity
> Given that women are half the population, and African Americans and
> Hispanics each make up about 12%, we can state categorically that Google's
> hiring doesn't match the general US population.

But if you account for demographics _within the industry_ do the numbers
remain biased?

For example: men account for less than 10% of nurses so if you were to go to a
hospital with 50% male nurses they would be _way_ overrepresented.
Statistically speaking of course, I'm in no way trying to give the impression
we need to do something about the lack of diversity within the nursing
profession.
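The over/under-representation arithmetic here reduces to a simple ratio. The
~10% nurse baseline is this comment's own figure, and the Google shares come
from the parent comment; the baselines are rough US population numbers.

```python
# Representation ratio: a group's observed share divided by its baseline
# share. Above 1 means overrepresented relative to the baseline, below 1
# means underrepresented.

def representation_ratio(observed_share, baseline_share):
    return observed_share / baseline_share

# The nurse example: 50% male nurses vs. a ~10% industry baseline.
print(representation_ratio(0.50, 0.10))             # ~5x overrepresented

# Google's workforce shares vs. ~12% US population baselines:
print(round(representation_ratio(0.02, 0.12), 2))   # African Americans: 0.17
print(round(representation_ratio(0.03, 0.12), 2))   # Hispanics: 0.25
```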

--edit--

...or maybe we need to do something about the lack of diversity in the nursing
profession? Don't actually know how this ball rolls?

~~~
SolaceQuantum
From what I know there is a push for more male nurses, and some issues with
discriminatory attitudes towards male nurses.

------
sean_anandale
Now do "white couple", or "american inventors".

------
tiredwired
> In 2012, I wrote an article for Bitch Magazine

Seems ironic.

