
Facebook Ads Can Still Discriminate, Despite a Civil Rights Settlement - sapski
https://www.propublica.org/article/facebook-ads-can-still-discriminate-against-women-and-older-workers-despite-a-civil-rights-settlement
======
Excel_Wizard
> Nevertheless, the composition of audiences can still tilt toward demographic
> groups such as men or younger workers, according to a study published today
> by researchers at Northeastern University and Upturn, a nonprofit group that
> focuses on digital inequities.

> One reason for the persistent bias is that Facebook’s modified algorithm
> appears to rely on proxy characteristics that correlate with age or gender,
> said Alan Mislove, a Northeastern University professor of computer science and
> one of the study’s co-authors.

Hypothetically, let's say that the trucking company in the article used
"people interested in cars" as a targeted group. It would come as no surprise
to me if this group was > 80% male.
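The arithmetic behind that hunch is easy to sketch. Assuming some invented base rates (60% of men vs. 15% of women listing a car interest — hypothetical numbers, not anything from Facebook), a completely gender-blind filter on that interest still produces a heavily male audience:

```python
import random

random.seed(0)

# Toy population: 50/50 gender split, but "interested in cars"
# correlates with being male (invented rates for illustration).
population = []
for _ in range(10_000):
    male = random.random() < 0.5
    likes_cars = random.random() < (0.60 if male else 0.15)
    population.append((male, likes_cars))

# Advertiser targets only "people interested in cars" -- gender is
# never consulted, yet the resulting audience skews male.
audience = [p for p in population if p[1]]
male_share = sum(m for m, _ in audience) / len(audience)
print(f"male share of targeted audience: {male_share:.0%}")
```

With those made-up rates the expected share is 0.30 / 0.375 = 80% male, which is the whole "proxy" problem in one line of arithmetic.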

It may even be by another mechanism: Facebook's algorithm may look at the
profiles of individuals who clicked through on ads in order to determine who
to show the ads to in the future. This is a good way to provide cost-effective
advertising. It may also be done in a way such that a small fraction of
these ads are still shown to $membersOfProtectedClassX, even in cases where
said class is statistically unlikely to click on the ad. What small fraction
is necessary to be legally unproblematic?

If Joe Schmoe creates a facebook ad to hire for a bricklaying job (a job which
is 98% male), what percentage of those ads must be served to women to be
legally compliant?

~~~
ajross
> facebook ad to hire for a bricklaying job

That's a straw man. The situation of concern is failing to advertise
high-value or high-status products and opportunities to people who would be
disadvantaged. So to stay with your analogy, if you have a job to hire a React
developer (a job which is right around 90% male), but you _don't show it_ to
someone because their browsing history includes Pinterest but not Reddit, then
you're discriminating. And that's a problem worth worrying about and trying to
address. Likewise ads for vacation timeshares that go to Taylor Swift fans but
not to Diddy aficionados.

Yes, it's possible that there may be some collateral damage in the bricklaying
recruiting industry. And, sure, maybe that's something that needs some
regulatory relief. But mostly I think you're just looking for an example here.

~~~
throwaway1777
Who's to say today's bricklaying job is tomorrow's "high status" coding job?

------
cm2012
This is beyond stupid. What they're saying is that even if you target people
who write "I love programming" on their fb page, it's still discriminatory
towards women because men are more likely to write those words.

~~~
ajross
> What they're saying is that even if you target people who write "I love
> programming" on their fb page, it's still discriminatory

That is... not what they're saying. In fact the article doesn't claim to know
the targeting mechanism at all (though it turns out that they found a way for
some advertisements to circumvent the new restrictions on gender and race
targeting). They're just claiming to have found ads that empirically _are_
targeting demographics in ways that Facebook has already agreed not to.

The "I love programming" bit seems to be something you've invented.

~~~
cm2012
"Dolese’s ad, for example, could have reached a predominantly male audience
because it featured a man, or because an interest in trucking acts as a proxy
for maleness, or both. (A Dolese spokeswoman said the ad targeted categories
“that would appeal to someone in this line of work.”) The settlement did not
resolve the potential bias from proxies and ad content, but said Facebook
would study the issue."

The "I love programming" bit is an example of the kind of proxy they're
talking about. Something correlated with gender but not gender itself. The FB
algorithm uses thousands of data points that might correlate in different
ways. Saying that any of those variables that correlates with gender should be
forbidden is crazy.

~~~
ajross
> Saying that any of those variables that correlates with gender should be
> forbidden is crazy.

Once again, _they are not saying that_. You have applied a maximalist
interpretation to the article that simply isn't present in the text.

The point of the article is that the end result is still discriminatory,
something that Facebook had promised to fix. And they didn't fix it, and
that's newsworthy.

Your point seems to be that solving the problem is really hard, so we
shouldn't try to solve it, nor talk about whether or not it's being solved by
parties who have promised to try to solve it?

------
stickfigure
This is silly. By definition, if you're advertising on Facebook, you're
selecting for a certain demographic. It's a bit broader than the AARP but by
picking any medium you are discriminating against the people who don't follow
that medium.

It's also weird to say you shouldn't be able to target certain demographics.
An ad that resonates well with seniors might resonate poorly with millennials.
You might want to attract both! So you run multiple ads. Or you target
youngsters on Instagram and oldsters in the NYT.

This is not something Facebook is in any position to police.

~~~
wahern
I was curious about the source of Facebook's liability. It turns out their
liability stems from the Fair Housing Act (FHA), 42 U.S. Code § 3604(c)
([https://www.law.cornell.edu/uscode/text/42/3604](https://www.law.cornell.edu/uscode/text/42/3604))
which has been repeatedly interpreted over the years to apply to publishers
directly.

Facebook has a duty to screen housing advertisements for discriminatory
indications or intent. The question of whether this sort of disparate impact
discrimination meets the criteria for 3604(c) is a different matter. (My
uninformed guess is that it does _not_, but Facebook is being attacked on all
sides so will probably be judicious regarding when and how hard it pushes
back.)

------
unlinked_dll
* housing, employment, and credit ads

I think the easiest solution would be to disallow ads of those categories on
their platform. I'd think the risk of "facebook/instagram is racist" damaging
their brand and the cost of federal discrimination lawsuits would outweigh
whatever revenue they project.

As an aside, I know it's a faux pas to bring up any observed (and/or presumed)
differences between the protected classes - but _maybe_ (just _maybe_)
Facebook's targeting is smart enough to correlate "most likely to care" about
things that tend to have skewed demographics without looking at the
demographic data itself. Like the example in the article of truck-driver ads
targeting men: what is Facebook using to determine whom to target? And do
those data points line up with demographics?

I don't know, but these kinds of systems are tough to introspect from the
outside.

~~~
strbean
Your aside is pretty much dead-on the big ethical issue with bias in ML right
now.

For example, ML can do quite a good job of predicting recidivism rates in
convicts, and justice systems have been using this to aid in sentencing and
parole hearings. Obviously, these ML approaches are not supposed to consider
ethnicity. So the factor that ends up having the greatest weight is "did your
father / grandfather spend time in prison", which is an extremely effective
proxy for "are you not white".

Basically, when your training data is based on a reality already heavily
influenced by bias, your models will end up reflecting and perpetuating that
bias.
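That dynamic can be shown with a minimal simulation (all rates here are invented for illustration, not real criminal-justice statistics): a model that never sees the protected attribute, only a proxy shaped by biased history, is still rewarded for using that proxy and ends up flagging the two groups at very different rates.

```python
import random

random.seed(1)

# Hypothetical data: the model never sees `group_b`, only the proxy
# "family member incarcerated", which a biased history has made far
# more common in group B. The recorded labels come from that same
# biased history, so the proxy genuinely predicts them.
rows = []
for _ in range(10_000):
    group_b = random.random() < 0.5
    family_prison = random.random() < (0.40 if group_b else 0.05)
    reoffended = random.random() < (0.50 if family_prison else 0.20)
    rows.append((group_b, family_prison, reoffended))

# A "race-blind" rule an accuracy-seeking learner would latch onto:
# predict high risk iff the proxy is present.
def high_risk(family_prison):
    return family_prison

acc = sum(high_risk(f) == r for _, f, r in rows) / len(rows)
flag_b = sum(f for g, f, _ in rows if g) / sum(1 for g, _, _ in rows if g)
flag_a = sum(f for g, f, _ in rows if not g) / sum(1 for g, _, _ in rows if not g)
print(f"accuracy: {acc:.0%}")
print(f"flagged high-risk: group A {flag_a:.0%}, group B {flag_b:.0%}")
```

The proxy rule beats the base rate on accuracy, so nothing in the training loop discourages it, yet group B gets flagged roughly eight times as often as group A without the model ever seeing group membership.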

~~~
AnthonyMouse
The real problem is that there is an actual racial disparity in recidivism
rates, so an algorithm that makes accurate predictions will predict the racial
disparity that actually exists. There is no way to solve that without
significantly impairing the accuracy of the predictions -- which is to say
releasing convicts who we know have an unreasonably high probability of
recidivism merely because there were too many other convicts with an
unreasonably high probability of recidivism who were the same race.

You can also imagine what happens if you apply this recidivism "adjustment" to
gender, which causes a lot of the people advocating it in the case of race to
become nervous and defensive.

~~~
edmundsauto
Accuracy is not the top objective in these systems, fairness is.

~~~
s1artibartfast
In this example, what is fairness, if not the most accurate prediction
possible?

~~~
unlinked_dll
The effect of its use on policy.

~~~
s1artibartfast
That is incredibly vague. What effect and what policy?

------
whiddershins
I can’t understand, in succinct plain terms, what the desired outcome is here.

What would it look like if the ads were not biased, or discriminating, in any
bad way.

~~~
munk-a
Maybe it'd just look like it looked in the 90's, there wouldn't be targeted
advertising and everyone except the marketers would be happier.

~~~
philwelch
There was targeted advertising in the 90’s. A lot of it used direct mail and
profiled consumers based on their magazine subscriptions. Other things,
like grocery store loyalty cards, were used back then too.
Television, radio, and print advertising was and is targeted by demographic.

~~~
ng12
Literally the entire purpose of the magazine industry is targeted advertising.
The only reason to publish something like Cosmo or Hustler is to sell ads
targeting the types of people who read Cosmo or Hustler.

------
Koremat6666
Targeted ads by definition are discriminatory. In fact, the best ads are the
ones that discriminate perfectly based on the wishes of the advertiser.

~~~
bduerst
The perfect ad is one that gives the consumer exactly what they want from the
supplier. Ads solve the asymmetric-information problem in imperfect markets,
which require information to be efficient.

------
duxup
> proxy characteristics that correlate with age or gender

Isn't that just how humanity is?

Some things appeal to some demographics more than others?

I'm not sure anyone can prevent that.

