
America is turning against facial-recognition software - pseudolus
https://www.economist.com/united-states/2019/05/25/america-is-turning-against-facial-recognition-software
======
briatx
> Earlier this year the Washington Post reported that many small police
> departments were abandoning body-worn-camera programmes because of the cost.

> Although the cameras are cheap, officers can generate 15 gigabytes of video
> per shift; storage costs mount. Police unions often oppose body-worn
> cameras, fearing they imperil their members by giving superior officers
> licence to search them for punishable behaviour.

> Other officers complain about the amount of time required to review and
> redact footage in response to public-information requests.

Sure, the police themselves don't like the cameras turned on them. Suddenly it
is "too expensive".

Then they will turn around and argue for ubiquitous CCTV camera installations.

~~~
PunchTornado
Would you want a body camera at your work? You wouldn't, I'm sure. Don't blame
them for looking out for their interests. It's someone else's job
(politicians') to legislate its use.

~~~
antisthenes
What a poor analogy. My job doesn't have anything to do with being the sole
arbiter of violence and defusing dangerous situations at the trust and expense
of taxpayers and local citizens.

If they don't want accountability as police officers, they can go find another
job.

> Don't blame them for looking for their interest.

I can, I will and I will argue that it is my moral obligation to ensure that
the public servants are acting in the best interest of the public as a
citizen. But thanks for your concern.

~~~
alexryan
"giving superior officers licence to search them for punishable behaviour"
seems like a legitimate concern. You seem to be assuming that the "superior
officers" will have "the best interests of the public" as their sole
motivation. In my experience that is rarely true of ladder climbers. By not
dealing with this concern you could easily end up making things worse.

~~~
a1369209993
If "punishable behaviour" is appropriately limited to things like assault and
evidence planting, then this is irrelevant, and superior officers should even
be _required_ to perform such searches.

If "punishable behaviour" includes things like slacking off to stop by Dunkin'
Donuts, the problem is with what qualifies as punishable behaviour, not the
ability to search for it.

~~~
alexryan
You might be missing the point. Assuming that all the good guys are
supervisors and all the bad guys are the supervised is an unwise assumption to
make. The bad guys can simply become supervisors and weed out the good guys.
Entire police departments have been corrupted in this fashion. If you
oversimplify the problem, the solution is unlikely to yield the desired
result.

~~~
a1369209993
No, I'm saying that even if all the supervisors are bad, the ability to search
for punishable behaviour only helps them if there exists punishable behaviour
for them to find.

I agree that this is an unreasonably high standard to hold people to when
their job doesn't have anything to do with being the sole arbiter of violence
and defusing dangerous situations at the trust and expense of taxpayers and
local citizens.

If they don't want accountability as police officers, they can go find another
job.

Edit: unless you meant not supervisors but the policymakers who define
"punishable behaviour", in which case getting rid of body cams is about as
helpful as rearranging the deck chairs on the Titanic.

~~~
alexryan
No good person wants criminal police officers to escape accountability, but
simply passing laws is unlikely to have the desired effect. Many police
departments in the United States are already highly corrupted at the highest
levels. It's not as bad as in Mexico, where the drug lords have actually taken
over, but it's getting close.

------
xena
Good. This kind of software is a mirror to the data fed into it. People on the
edges of society (that the people making the mirror might either intentionally
or accidentally ignore) will get fucked over tremendously by this technology.
We should put a complete ban on it. Even if it means gimping my photo
organization programs.

~~~
diafygi
Just so I have good arguments when talking to someone about this specific
point, can you or someone else please go into more detail on why this will
happen?

~~~
bryanrasmussen
There have been numerous cases where corpora of people's faces were
insufficiently stocked with a particular racial group; in such cases people's
faces might not register as faces at all, and they are shut out of whatever
the facial recognition is supposed to implement.

For example, facial recognition at a high-priced building: suddenly non-white
people can't get in because the scanner decided they did not show a real face.
Sorry, it's a bug, not a feature.

The same problem would of course apply to anyone with facial deformities etc.

The problem essentially takes two forms: automatic exclusion from a group by
facial feature, or automatic inclusion in a group by facial feature.

Think of the automatic inclusion as advanced phrenology or some other sort of
woo. In this scenario, company X sells the idea that it can do facial
recognition of 'criminal types', based on whatever research it can pull out
showing that criminals often correspond in appearance. It then fills that
corpus with criminals, which makes a nice self-selecting system: find the
black/hispanic person and deny them the job, loan, etc., because their
likelihood of crime is past the threshold set.

The reason this would happen is the same reason banks used to deny loans to
people in certain zip codes: by doing so they could be racially profiling and
excluding while pretending not to.
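
The corpus-imbalance failure mode described above can be illustrated with a toy sketch. All the numbers here, and the saturation model itself, are illustrative assumptions, not measurements of any real system; the point is only that a matcher's miss rate tracks how well-represented each group was in its training corpus:

```python
# Toy illustration (assumed numbers) of corpus imbalance: groups that are
# under-represented in the training data get worse recognition rates.
training_corpus = {"group_a": 90_000, "group_b": 9_000, "group_c": 1_000}

def assumed_miss_rate(examples: int, saturation: int = 50_000) -> float:
    """Hypothetical model: miss rate falls linearly as training examples
    grow, bottoming out at 1% once the group is well represented."""
    return max(0.01, 1 - min(examples, saturation) / saturation)

for group, n in training_corpus.items():
    print(f"{group}: ~{assumed_miss_rate(n):.0%} of faces not recognized")
```

Under these assumed numbers, the well-stocked group is missed about 1% of the time while the smallest group is missed about 98% of the time, which is exactly the "your face didn't register" scenario above.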

~~~
mc32
>”corpora of peoples faces have been insufficiently stocked with a particular
racial group, in such cases people's faces might not register as faces..”

I’m not sure this is a good argument against the technology, because there is
a solution to this objection: they’ll improve it and ensure every face type is
comparably recognized. It should be rejected on more basic principles.

~~~
rmrfrmrf
It's not as simple as "just make sure it works on everyone". Even if it were
physically possible to capture the face of all 7 billion people on Earth,
you'd start running into issues of false positives/negatives, overfitting,
etc., and furthermore it would basically require every person to be
periodically scanned from birth until death to ensure that the algorithm can
recognize new faces.
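
The false-positive problem at population scale comes down to base rates, which a quick back-of-the-envelope calculation makes concrete. All figures below are assumed for illustration (a fairly accurate matcher scanning a city of a million people for a small watchlist):

```python
# Base-rate sketch: even an accurate matcher drowns in false positives
# when scanning a large population for a small watchlist.
# All numbers are illustrative assumptions, not real system figures.

population = 1_000_000       # people scanned
watchlist = 100              # actual targets in the population
false_positive_rate = 0.001  # matcher flags 0.1% of innocent people
true_positive_rate = 0.99    # matcher catches 99% of targets

false_alarms = (population - watchlist) * false_positive_rate
true_hits = watchlist * true_positive_rate

# Of all alerts raised, what fraction are actually targets?
precision = true_hits / (true_hits + false_alarms)
print(f"False alarms: {false_alarms:.0f}")
print(f"True hits: {true_hits:.0f}")
print(f"Precision: {precision:.1%}")
```

With these assumed rates, roughly 1,000 innocent people are flagged for every 99 real hits, so only about 9% of alerts point at an actual target.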

~~~
mc32
This is a different objection from the original one (effective accuracy vs.
specific bias).

With a pervasive system, unless people are hermits, they’ll get scanned
periodically, and with other bits of information the changed face can be
correlated with the same person (I go into a salon looking one way, come out
different, go into a boxing match, come out different, but I’m following my
routine and go to the same subway stop and convenience store and use the same
payment method, etc.)

------
johnrbent
There is a lot of sensationalism (see title of article) and high, anxious
emotions around facial recognition right now, which really is to be expected
at this point. But it is muddying the rational conversations we should be
having about it. Can Amazon lessen Rekognition's bias towards white people
(i.e. bright pixels)? Yes. Get over that. This article completely whiffed on an
important point regarding police use of facial recognition (or maybe I missed
it) and that is _police_ bias. We are talking about one of the most culturally
and racially biased concentrations of power in the country. It's important to
note that they will NOT be training the models or writing the algorithms that
drive their tools. We have a chance to let some brilliant engineers create
deliberately unbiased tools that can only improve the situation in America's
police force. PLEASE do not let your fears win. It will be used for evil and
it will be used for good. That is not something that should be banned. Sorry I
started getting a little emotional.

~~~
Loughla
The problem I have with your statement is that it seems to put 'brilliant
engineers' on some kind of magical pedestal.

Engineers, much like police, are humans. Therefore they hold biases, they can
be short-sighted, and they may not understand many of the long-term
ramifications of their jobs.

All this does is move the power and bias up the chain from police to engineers
while simultaneously making some very, very powerful decisions about what
privacy is and what your rights are in society with 0 oversight from the
actual society. In fact, police are (in theory) beholden to politicians and
the voting public right now. If piss-poor decisions are made based on faulty
software from software conglomerate A, who do we hold accountable?

------
grenoire
This has been debated extensively on HN (and is still being debated actively),
but all I hope is that people realise sooner or later that the level of safety
you can acquire in a society is inversely related to the level of privacy
individuals can have.

What I _personally_ hope for is that people achieve a balance between safety
and privacy.

~~~
lazyvoice
How does this differ from "If you have nothing to hide you have nothing to
worry about."?

Surely it's not difficult for you to imagine fairly common scenarios where
your equation is patently untrue. A battered spouse of an LEO, or a non-
gender-conforming individual in a theocratic society, are only two really easy
examples that come to mind.

If we assume that individuals who control or have access to surveillance
mechanisms are universally magnanimous and actually capable of neutralizing
threats to safety, then your negative relationship between safety and privacy
might hold up to scrutiny. I have more than a bit of trouble seeing how that
assumption is anything other than utopian.

~~~
grenoire
I don't mean to defend surveillance as a gateway to safety by any means; in
fact I believe the exact opposite. Unfortunately, it's going to be integrated
into our society whether we like it or not, and after that point it's only a
matter of time before we come to accept losing our privacy, having ignored all
the exceptional "casualties" and enjoying the added scrutiny it gives law
enforcement.

~~~
lazyvoice
I suspect the integration and matter of time exist in recent history. The
question I struggle with is, what ought a neutral-good aligned individual do
about it?

~~~
grenoire
I think neutral good as an alignment inherently comes with a disadvantage in
making impact. You have to take certain levels of risks at certain extremes to
be able to influence _anything_.

~~~
lazyvoice
I'd like to be neutral good, but recognize that one may need to dip into chaos
depending on the landscape they find themselves in. :-)

------
snowwrestler
I can't read the article without an Economist account, but I think the
headline is a bit too broad. There are a lot of people who happily use facial
recognition to unlock their iPhones every day, and I haven't seen much of a
backlash against that.

Why not? Because it's isolated on the device, and can be turned off whenever
you want. I think the real concern is the _combination_ of facial recognition
with vast centralized databases. Those big centralized databases are just
begging to be exploited and abused, whether for commercial, authoritarian,
criminal, or even creepy "LOVE-INT" stalking by the people running the
databases.

I think facial recognition (and other forms of recognition) can have a bright
future in society, but only if they conform to our expectations of privacy.
Those expectations have been created over thousands of years based on how we
interact with people. The closer that computer interaction can hew to existing
social conventions, the smoother path it will have to adoption.

I don't mind if the waiter in a local restaurant recognizes me when I come in;
in fact I like it! But I don't think I would like it very much if I knew he
was calling 15 different data brokers to report on me every time he sees me.

~~~
tim333
>I can't read the article without an Economist account

try [https://outline.com/BhW2BL](https://outline.com/BhW2BL)

------
forgottenpass
>Jack Marks, who manages Panasonic’s public-safety products, called it “short-
sighted and reactive”. The technology exists, he said; “the best thing you can
do is help shape it.”

Not using it IS "shaping" it. Just not into a shape profitable for this
dishonest huckster.

------
vfc1
It's like nuclear power, technology is what we make of it.

Do we want to use it strictly against crime, effectively making it almost
impossible for someone to commit a violent crime and get away with it, and
making a much safer society?

I'm all for it; there are some nasty people out there: serial killers,
pedophiles, drug cartels. This technology deployed at scale would make these
crimes much harder to get away with.

As long as it's another way to ensure democratic laws are respected, I don't
think that completely forbidding it makes sense.

I'm all for regulation, we don't want this technology to be abused.

~~~
coding123
Ski mask. You might catch/convict a guy with it, but soon after a lot of
people are just going to wear them all the time.

~~~
bpyne
Yup. About half the US needs them for 3 or more months per year. And then
there's the rise of surgical masks to prevent the spread of viruses.

I also foresee a surge in wide-brimmed hats.

~~~
Loughla
I remember stories from 5 years ago or so about artists who were making facial
recognition blocking scarves. The scarves made you look like a lightbulb on
camera, so they couldn't see your face.

I chuckled, because it was ridiculous, and now I'm wondering if they make them
in professional colors. . .

------
alfiedotwtf
America is turning against facial recognition, until it's not. Total
surveillance looks like it's inevitable.

Surveillance will come slowly, then suddenly

------
linuxftw
America isn't turning against anything. There was no shortage of people who
bought the new iPhone with face-scan technology.

I believe we already live in a society where it's possible to track the vast
majority of people to a general area 100% of the time. Technology is
progressing; soon it will be real-time tracking of targeted individuals 100%
of the time. That will slowly expand to all individuals. Between cell-phone
tracking, license-plate scanners, and facial-recognition cameras, the system
knows where you are.

------
alexnewman
Actually, there's a proliferation of private facial-recognition software being
deployed across almost all large buildings in the United States; it's just
behind the scenes and not being used by law enforcement. This is the standard
pattern in America: law enforcement gets barred from high tech, but private
industry is free to use it. Eventually they sell it back to law enforcement.

------
mensetmanusman
This backlash is the silver lining to China's shortsighted deployment of this
technology in its world-renowned police state.

~~~
toephu2
This may be an unpopular opinion here but on the flip side have you been to
China lately?

Street crime (including petty thievery) has dropped immensely.

Any city in China is much safer than Oakland.

~~~
themacguffinman
I don't think safety is the only thing that matters to people.

------
notus
Facial recognition is coming no matter what. There is no way people will
sacrifice the conveniences that come along with it. Sure people will be
against it when their only concept of its applications is to "catch bad guys",
but when it begins to cut time out of their day they will have a change of
heart.

------
EugeneOZ
I'm ready to sacrifice my extremely convenient way of unlocking my iPhone just
to know this technology is banned worldwide.

This technology opens wide doors for surveillance, for finding, identifying
and pursuing political opponents - not just their leaders, but every person
who was brave enough to protest on the streets.

It also opens wide doors for cyberbullying, and that issue is a problem for
every country, not only for dictatorships.

~~~
tobylane
What would you expect to be achieved by banning it? Intelligence services will
continue with it, which is my major concern over its current use.

~~~
walshemj
I think the concern is more the widespread use by LEOs and corporates, who
have far less oversight, and who will affect several orders of magnitude more
people.

~~~
acct1771
You have vastly overestimated the oversight on intelligence agencies.

~~~
walshemj
And how much legal oversight do the FAANGs have? Let alone some local US
police force run by individuals such as Sheriff Joe.

------
dwyerm
Yes, but the thing is that facial recognition IS just video compression.

You take a video stream and compress it down into a timestamped stream of IDs.
It's really lossy, but it's the same as OCR or Speech-to-Text -- it is a tool
that allows us to better handle large streams of data.

As always, the tool isn't the problem. It's the use of the tool.

(But I think we all know that.)
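
The "lossy compression" framing above can be sketched as a plain data transformation: a heavy stream of frames reduces to a tiny timestamped log of identities. The `recognize` function here is a hypothetical stand-in for a real matcher, not any particular product's API:

```python
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

@dataclass
class Frame:
    timestamp: float  # seconds since stream start
    pixels: bytes     # raw image data (megabytes per frame in practice)

def compress_to_ids(
    frames: Iterable[Frame],
    recognize: Callable[[bytes], Optional[str]],
) -> list[tuple[float, str]]:
    """Lossy 'compression': reduce a video stream to (timestamp, person_id).

    Everything except who-was-seen-when is discarded, which is exactly what
    makes the output so much smaller, and so much easier to index and search.
    """
    log = []
    for frame in frames:
        person_id = recognize(frame.pixels)  # hypothetical matcher
        if person_id is not None:
            log.append((frame.timestamp, person_id))
    return log

# Toy usage: a fake matcher that 'identifies' frames by a leading byte.
fake_recognize = lambda px: "alice" if px.startswith(b"A") else None
frames = [Frame(0.0, b"A..."), Frame(0.5, b"B..."), Frame(1.0, b"A...")]
print(compress_to_ids(frames, fake_recognize))
# → [(0.0, 'alice'), (1.0, 'alice')]
```

Like OCR or speech-to-text, the output discards almost all of the input: the tool's danger or value lies entirely in what is done with that compact log.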

------
this2shallPass
Police are only one group who might use facial-recognition software. Body-worn
cameras can be used without collecting face data or using facial-recognition
technology, or their legal uses can be severely restricted and clearly defined
(or outright banned; lots of options). Why conflate all these things? They
overlap sometimes, not all the time.

------
mjparrott
Too late

Photos are everywhere, and the software will only become more accessible. You
can't put this back in the box.

------
povertyworld
It's getting discouraging to see facial recognition and machine learning
conflated in every article's comments.

