
Harvard Professor Finds Racial Bias in Google AdSense Results - jamesbritt
http://www.geekosystem.com/google-results-reflect-racism/
======
lutusp
A quote: "She says there is only a one percent possibility her findings are
based on chance, and would like to see technology used to prevent these kinds
of biased search results."

Translated into scientific jargon, this is a p < .01 value: the claim that,
if there were no real effect, a result at least this extreme would arise by
chance less than 1% of the time.
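To make that concrete, here is a quick simulation (my own sketch, not from the article) of what a two-sided p-value means for a toy example: observing 62 heads in 100 flips of a supposedly fair coin. The numbers (62 heads, 100 flips) are illustrative assumptions.

```python
import random

random.seed(0)

# Null hypothesis: the coin is fair. Suppose we observed 62 heads in 100
# flips. The p-value is the probability, under the null, of a result at
# least this extreme (two-sided: <= 38 or >= 62 heads).
observed = 62
trials = 20_000
extreme = 0
for _ in range(trials):
    heads = sum(random.random() < 0.5 for _ in range(100))
    if heads >= observed or heads <= 100 - observed:
        extreme += 1

p_value = extreme / trials
print(f"estimated two-sided p-value: {p_value:.3f}")  # roughly 0.02
```

Note that p ≈ .02 says only how surprising the data would be under the null; it does not say the finding has a 98% chance of being real.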

The problem with this kind of p-value is that it often offers false or
dubious assurance that the result is actually meaningful, for reasons given
here:

<http://www.ics.uci.edu/~sternh/courses/210/cohen94_pval.pdf>

Quote: "After 4 decades of severe criticism, the ritual of null hypothesis
significance testing—mechanical dichotomous decisions around a sacred .05
criterion—still persists."

One obvious criticism of this sort of "discovery" is that Google's search
software tailors AdSense results to what a person has searched for in the
past -- therefore, even if no unscrupulous advertiser is deliberately
targeting people with specific names, the unbiased core algorithms may well
produce associations that seem racist but that arise solely from past search
entries.

Another criticism of a study like this, more classical, is that, to achieve a
1% p-value, one need only test 100 random associations with no real effect in
any of them -- by chance alone, roughly one of the 100 is expected to clear
the 1% threshold. Pick that one and publish it. This is called "data mining"
(the multiple-comparisons problem), and by using it, a diligent worker can
almost always come up with something:
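The multiple-comparisons effect is easy to demonstrate. The sketch below (my own illustration, with an exact binomial p-value as the assumed test) runs 100 null "associations" -- pure coin flips, no real effect anywhere -- and then reports the smallest p-value and the count of "significant" results:

```python
import math
import random

random.seed(1)

def binom_two_sided_p(heads, n=100):
    """Exact two-sided p-value for 'heads' out of n fair-coin flips."""
    k = max(heads, n - heads)
    tail = sum(math.comb(n, i) for i in range(k, n + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# 100 purely random "associations": no real effect exists in any of them.
p_values = []
for _ in range(100):
    heads = sum(random.random() < 0.5 for _ in range(100))
    p_values.append(binom_two_sided_p(heads))

best = min(p_values)
hits = sum(p < 0.05 for p in p_values)
print(f"smallest p-value out of 100 null tests: {best:.4f}")
print(f"tests 'significant' at p < .05: {hits}")
```

Report only the single best of the 100 and it looks like a finding; report all 100 and it looks like exactly the noise it is.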

<http://boingboing.net/2010/12/20/creating-a-phony-hea.html>

<http://rationalwiki.org/wiki/Correlation_does_not_imply_causation>

