
Facial recognition database used by FBI is out of control - nkristoffersen
https://www.theguardian.com/technology/2017/mar/27/us-facial-recognition-database-fbi-drivers-licenses-passports
======
Eyas
> Database contains photos of half of US adults without consent, and algorithm
> is wrong nearly 15% of time and is more likely to misidentify black people

I think technology is often a very effective way to illustrate what "systemic
racism" is.

Racism propagates easily, and race-blind / race-agnostic / race-indifferent
behavior does not combat it or counteract it.

Imagine a majority-white tech team, largely color-blind, yet still (by virtue
of economic and geographic segregation) in a mostly white-centric bubble. The
training set might be largely white. If some preprocessing or feature
detection is taking place on the raw images, it might be tracking features
that have higher variability among white people while overlooking other
features.

If you don't know to go out of your way to check for, test for, and mitigate
imbalances caused by race (because you "see no race"), then you allow this
problem to propagate. In the case of computer vision, you will inherit the
"racism" of every library you import.
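The kind of check the comment argues for can be sketched as a per-group
accuracy breakdown, so an imbalance can't hide behind one aggregate number
(a minimal illustration with invented data, not any particular library's API):

```python
from collections import defaultdict

def per_group_accuracy(labels, preds, groups):
    """Accuracy broken down per subgroup, instead of one aggregate number."""
    totals, hits = defaultdict(int), defaultdict(int)
    for y, p, g in zip(labels, preds, groups):
        totals[g] += 1
        hits[g] += int(y == p)
    return {g: hits[g] / totals[g] for g in totals}

# Toy data: an aggregate accuracy of 75% hides a 50% accuracy for group "b".
labels = [1, 0, 1, 0, 1, 0, 1, 0]
preds  = [1, 0, 1, 0, 1, 1, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(per_group_accuracy(labels, preds, groups))  # {'a': 1.0, 'b': 0.5}
```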

This reminds me of the gender bias found in Word2vec[1]. It is hard to accuse
the dataset itself of being sexist, especially when viewing it as a
reflection of language in news articles. Yet people who use Word2vec treat it
as representative of language in general, not of language in news, with all
the baggage of today's (and yesterday's) culture. As a result, sexism can
propagate: autocomplete and query suggestions might be biased, then spending
habits might be biased, and so on.

[1]: https://www.technologyreview.com/s/602025/how-vector-space-mathematics-reveals-the-hidden-sexism-in-language/
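The analogy arithmetic behind the biased completions in that article can be
shown with toy vectors (hand-made numbers standing in for real Word2vec
output; the words and coordinates are purely illustrative):

```python
import numpy as np

# Toy embedding space: axis 0 ~ a "gender" direction, axis 1 ~ "profession".
emb = {
    "man":        np.array([ 1.0, 0.0]),
    "woman":      np.array([-1.0, 0.0]),
    "programmer": np.array([ 0.9, 1.0]),  # biased: leans toward "man"
    "homemaker":  np.array([-0.9, 1.0]),  # biased: leans toward "woman"
}

def analogy(a, b, c, vocab):
    """Word closest (by cosine) to vec(b) - vec(a) + vec(c), excluding inputs."""
    target = vocab[b] - vocab[a] + vocab[c]
    candidates = {w: v for w, v in vocab.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: (candidates[w] @ target) /
               (np.linalg.norm(candidates[w]) * np.linalg.norm(target) + 1e-9))

# "man is to programmer as woman is to ...?"
print(analogy("man", "programmer", "woman", emb))  # prints "homemaker"
```

Nothing in the arithmetic is sexist; the bias rides in on the geometry of the
training data, which is exactly how it propagates downstream.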

~~~
sundvor
In my naive mind, I can't help but think it might simply be a case of the
contrast ratio being greater in lighter-skinned people, therefore making the
hit ratios higher. Greater contrast ratios make facial features stand out
more.
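A toy way to see the hypothesis: compare the intensity spread, a rough proxy
for the absolute contrast available to a feature detector, in two hypothetical
grayscale patches (all numbers invented for illustration):

```python
import numpy as np

def intensity_range(patch):
    """Max-minus-min pixel intensity: a crude proxy for absolute contrast."""
    return float(patch.max() - patch.min())

# Hypothetical patches around the same facial feature (0..255 grayscale).
lighter = np.array([180, 240, 200, 230], dtype=float)  # wide spread
darker  = np.array([ 35,  55,  40,  50], dtype=float)  # narrow spread

print(intensity_range(lighter), intensity_range(darker))  # 60.0 20.0
```

Under this toy model the darker patch offers a third of the absolute contrast,
so edge- and feature-detection stages have less signal to work with.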

~~~
handedness
Exactly my thinking. I worked with a company that developed a medical device
that took subcutaneous readings extracutaneously, and despite being developed
almost entirely by Asian engineers, it got the best data quality from
fair-skinned Caucasians and the worst from dark-skinned Africans, as it
relied entirely on the transmission of light through skin tissue.

Claiming racism without understanding the problem is both unfair to the
people involved and an unwise undermining of one's own credibility for the
times when people really are acting in bad faith. That may be the case here,
or it may not, and I don't like guilty-until-proven-innocent no matter whom
it's leveled against (even a Federal agency of which I'm not the world's
biggest fan, and a database the likes of which makes me think we need a
constitutional amendment to protect the privacy of citizens).

~~~
mambodog
Even if it was no one's intent that the system be racially biased, allowing
such a system to continue to operate after learning of such a bias is a
racist act.

~~~
handedness
Ignoring known technical deficiencies in the system and assigning the same
confidence levels to its output regardless of race could work out to be
pretty racist, but what are they supposed to do? Manually cripple the
accuracy of the system down to the lowest common denominator?

The medical device I mentioned above was medically useful for Caucasians and
lighter-skinned Hispanics, so-so with Asians, and almost useless for
Africans.

So, what–don't go to market until the technology has parity across all skin
tones?

Not everything is about intent.

~~~
bilbo0s
I hate to be a pedant here...

but I actually wrote a pretty successful 3d radiology/cardiology package. So I
understand medical devices.

I just wanted to hop in here and say that there is NO WAY the FDA would give
you clearance to use the device you describe on African Americans under the
circumstances that your posts have postulated.

It seems a no-brainer that the facial recognition software should not be used
on African Americans either. At least, if it were MEDICAL software... it
would DEFINITELY NOT be cleared with that kind of a failure rate. And any
provider or delivery network using it on African Americans would be on their
way to prison.

But the regulations on law-enforcement software may be more lax than the
regulations on medical software. Getting things exactly right may not be as
important. And the penalties for misuse of medical devices are, without
question, far more draconian.

~~~
handedness
> I just wanted to hop in here and say that there is NO WAY the FDA would give
> you clearance to use the device you describe on African Americans under the
> circumstances that your posts have postulated.

The FDA has approved drugs that are specifically intended to target medical
issues in specific ethnicities, so I'm not sure about your assessment. That
said, this product's market was outside the scope of the FDA, so it's
irrelevant.

> It seems a no brainer that the facial recognition software should not be
> used on African Americans either.

Any data source which produces data that's better than a random guess should
be considered when trying to solve a problem. That it offends some people's
social sensibilities is a separate issue.

Look, I don't like the mere existence of the FBI's system, and I don't think
the public's continued lack of concern for placing constitutional restraints
on systems like it is a wise course forward, but to say it's a "no-brainer"
when it might produce valuable leads is to focus on one aspect of the problem
without considering the rest of it.

Just as it would _also_ be a mistake to consider it more accurate than it is
when trying to identify dark-skinned suspects. Witness accuracy is not a new
problem.

It's also worth noting that the human eye is just as bad at facial recognition
when the light and/or contrast is lower.
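The "valid data point in narrowing a search" framing can be made concrete
with Bayes' rule. The numbers here are purely illustrative (the article's
"wrong nearly 15% of the time" is not a per-comparison false-positive rate),
but the base-rate effect is the point:

```python
def posterior(prior, tpr, fpr):
    """P(true suspect | match) via Bayes' rule."""
    return tpr * prior / (tpr * prior + fpr * (1 - prior))

# One true suspect among 10,000 database entries, with a hypothetical 85%
# hit rate and 15% false-positive rate per comparison:
p = posterior(prior=1 / 10_000, tpr=0.85, fpr=0.15)
print(round(p, 5))  # 0.00057
```

So even a match from an 85%-accurate system mostly narrows the pool rather
than identifying anyone, which cuts both ways in this thread: it can be a
lead, but treating it as an identification overstates it badly.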

~~~
bilbo0s
Not a matter of social sensibilities man...

the system doesn't work. At least not for African Americans. You can't
"socialize" away the fact that it doesn't work.

Use it for the races it works for. DON'T use it for the races it does NOT
work for. That is common sense... also known as a "no-brainer".

~~~
handedness
If it's better than a random guess (which it is, by a wide margin), then it
could constitute a valid data point in narrowing a search. There's nothing
racist about using it to the level of accuracy it has demonstrated. If the
system worked to the same degree of accuracy on Caucasians as it does on
Africans, nobody would be calling it racist; they'd be calling it a tool that
can help build a list of potential suspects to investigate further.

To argue anything else is making it a matter of social sensibilities.

~~~
bilbo0s
Only it's not better than a random guess.

Because in eyewitness identification, you're never truly dealing with
"random" guesses. Again, my experience is on the medical side of this whole
thing. I know and understand the rules, as my company is obliged to play by
them. And there is just no question that we would not be allowed to use that
system on African Americans. Review boards would look on it the same way you
might look on someone using leeches in the Middle Ages.

Only worse... because we KNOW it doesn't work.

------
abandonliberty
I wrote a class paper about the threat to privacy/freedom from data mining and
advanced algorithms in 2003.

Unfortunately the risks they highlight exist without facial recognition. It's
interesting to see facial recognition as a point of debate because the battle
has been largely lost.

The quantity and intimacy of personal data already captured is much more
damaging. Target figured out a teen girl was pregnant in 2012 [0]. IMO face
recognition isn't even the most invasive modern technology. Sure it helps
close the loop on a few Luddites but most of us willingly handed over the keys
to our lives long ago.

[0] https://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/#269d6b256668

~~~
theparanoid
Appropriate username. Ditching credit cards and smartphones is one of the
best ways of increasing privacy.

------
cortesoft
Wow, calling the system 'race-blind' is the stupidest thing I have heard in a
while. Of course the computer has no conception of race, but that has no
bearing on whether the system is accurate for all skin shades.

~~~
nkristoffersen
Curious how advanced the system is too... Does it use deep learning ML like
the current state of the art?

~~~
sushirain
The article says 5 years old, so it's probably not deep learning.

------
joshontheweb
Our privacy is under attack on all fronts by our own government. It seems hard
to imagine a feasible way to defend against this. Even well thought out
legislation will likely just be ignored with impunity. Is privacy of any sort
a lost cause at this point?

Edit: forgot a word

~~~
dukeluke
Pretty much. There are things you can do, but the majority of people won't do
them. Our political system is showing its age: it is completely
gerrymandered, bought by major corporations, and incapable of changing unless
it benefits the rich in some way.

------
nkristoffersen
Do city police departments have access to the FBI facial database or do they
have their own?

~~~
mtreis86
Many have had access since its inception three years ago:
https://www.eff.org/deeplinks/2014/04/fbi-plans-have-52-million-photos-its-ngi-face-recognition-database-next-year#footnote2_eur8305

Section 4.1 of this privacy impact assessment
(https://www.fbi.gov/services/records-management/foipa/privacy-impact-assessments/interstate-photo-system)
shows that state and local authorities are not the only ones with direct
access.

~~~
nkristoffersen
Thank you for these links. Very interesting stuff. Curious how good these
systems are compared to the latest visual machine learning tech...

------
wallace_f
This is absolute madness.

Where this, and other troubling revelations from both the Surveillance State
and the Deep State, leads us is authoritarianism.

Once we're there, nothing will matter anymore. It will be the end of Western
Civilization as we know it.

~~~
Sunset
"Western Civilization" has operated as Imperialist and Totalitarian regimes
for centuries. "Modern" democracy is the exception, not the default mode of
our societies.

~~~
wallace_f
To be fair, that is misleading. Even in the most extreme cases, feudal Europe
arguably struggled to maintain authoritarianism, since government and
military power was often decentralized. Let's be honest, though: I was never
talking about feudal Europe. We're talking about America, and yes, the modern
democracy that built the Western world. Your characterization of it as always
authoritarian is plainly erroneous.

------
jopolous
Why are these representatives so upset about this when they just passed
resolution 34?

~~~
civilian
Maybe because the privacy rules don't make as much sense anymore?
http://thehill.com/blogs/pundits-blog/technology/325708-senate-was-right-to-block-fccs-broadband-privacy-rules

I don't feel strongly either way, I just think that there is more nuance
around the issues of privacy.

~~~
throwaway729
I don't have a choice of ISP.

I can (trivially!) choose not to use Google's services, including their DNS
servers.

------
blazespin
Yeah, imagine your Trumpista ICE agent walking down the street with a camera
that flags people who aren't in the database (i.e., no valid driver's license
or passport). Welcome to 2017.

~~~
jrockway
This doesn't mean much. They can flag you for whatever, but not having ID
isn't a crime, so a flag is the most they can do. Let's say they think that
not having an ID means you're an illegal immigrant. They have to prove that in
court, while you don't have to prove that you aren't.

~~~
wavefunction
Police can do whatever they want. Here's an article about several people who
were arrested (and booked into custody) for "failure to identify"
themselves[0].

[0] http://www.citylab.com/crime/2014/02/yes-police-can-arrest-you-refusing-identify-yourself/8485/

~~~
beezle
As noted in the article, it's only a violation if you fail to identify
yourself after being arrested. What is not clear to me is whether it is
sufficient to just give your name. I often go out without my wallet, and I'm
not sure why there would be any legal presumption that a citizen would
produce some type of government document to serve as "ID".

~~~
DINKDINK
"Prove that you're a citizen!, I suspect you of entering the country without
state approval!"

