
Face-recognition technology is the new norm - ryan_j_naughton
https://www.nytimes.com/2019/10/15/opinion/facial-recognition-police.html
======
mLuby
Let's say you're arrested (so fingerprinted, DNA-swabbed, mug-shotted, etc)
but then not charged or convicted.

Is there a way to get those records expunged?

(IMO the innocent—and maybe those who've served their sentence—ought to be
able to do this.)

~~~
einpoklum
> Is there a way to get those records expunged?

Really?

"Yes sir, we have entirely expunged your data and you have the US government's
solemn promise that we didn't make a copy of the files we're not telling you
about."

Effectively, there is no such thing as expunging digital data. There's
probably always some extra copy somewhere.

~~~
mLuby
Sure, don't trust them, but still make it legally clear what is and isn't
allowed instead of this grey area we have today. Then they'd have to do
parallel construction, because they can't admit there's a still-secret
biometrics database out there. And secrets love to be whistle-blown about, so
that'll happen eventually and reset their progress.

There's no silver bullet, just a long battle of attrition.

~~~
rjf72
How would that reset their progress? The leaks have revealed that our secret
organizations are pretty open (at least among themselves) about parallel
construction. For instance this article [1] reveals some slides of the more
harmless surveillance data on a target. Nonetheless, in a giant disclaimer at
the top the NSA reminds their agents "This information is provided for
intelligence purposes in an effort to develop potential leads. It cannot be
used in affidavits, court proceedings, or subpoenas, or for other legal or
judicial purposes."

And as the leaks have fallen off the news cycle, along with an interesting
rise of misinformation in social media, you gradually see people beginning to
understate what was revealed and to show ever less concern over it. It's
actually a phenomenal demonstration of how a secret agency should handle a
leak to avoid any meaningful consequences: what is ostensibly ground-breaking
information on programs that are, at best, in legally grey areas, all released
to the public en masse. Snowden no doubt felt that his leaks would change our
approach to intelligence, if not our entire nation. In reality, next to
nothing changed.

[1] - [https://theintercept.com/2016/08/14/nsa-gcsb-prism-surveillance-fullman-fiji/](https://theintercept.com/2016/08/14/nsa-gcsb-prism-surveillance-fullman-fiji/)

------
dawg-
I was flying to Germany from Atlanta earlier this year. I enter the
international terminal, start waiting in the security line. As I get to the
front there is this little screen with a camera on top and a TSA logo on it.
There's a sign on it that says something to the effect that it must register
my biometric profile, to prevent terrorism, I guess.

I think there was an option not to consent and still make it through security,
but the instructions seemed to be purposefully unclear. If they make it
unclear, they know that 99.9% of people will not bother to ask questions. They
know how stressful airport security is in the first place and how people just
want to shut up and make it through the line and not miss their flight, and
they use that as an advantage. I consider myself to be more informed and
concerned about digital privacy issues than a vast majority of people, just
like most of you who will be reading this comment.

Even so, in the moment I gave in and let the stupid thing scan my face. And
now my face is somewhere in a server at the Pentagon probably. The social
engineering around it made it very hard to figure out whether or not there was
any real risk of resisting it or asking questions. I was with my wife and we
didn't want to miss our flight.

I applaud the person who in the same situation would have turned around and
walked out of the airport and skipped their expensive, non-refundable trip to
Europe. But when my big moment came to be a heroic activist for digital
privacy rights, I wasn't the same person I am when complaining about Facebook
and Google on the internet. Oh well, I guess travel is how we learn about
ourselves, right?

And that's exactly why facial recognition is going to become completely
ubiquitous without much fuss at all. The risks are too abstract for most
people to really care in the moment, even if they know the issue fairly well.
Real social movements don't gain momentum until people's lives are _directly_
impacted - and the issues are just too nuanced, and the whole infrastructure
of surveillance too byzantine, for people to really want to take action.

~~~
MrMember
It was the same with opting out of the backscatter scanners when they started
rolling them out. They had signs up saying you could opt out and instead
receive an "enhanced" pat down, but TSA would make a big deal about it. They'd
shout "We have an opt out!", pull you aside, and you'd have to wait for
someone to be available to pat you down (which was itself an unpleasant
experience). The vast majority of people don't want to go through that and
just accept the scanner.

These days I don't even know if it's still possible to opt out. I haven't seen
a sign saying you can in ages.

------
xupybd
I don't like this tech in the hands of the state but I'm more worried about
criminal use of this.

I think we're going to see a massive increase in extortion as people are able
to collect large amounts of data on people based on facial recognition. Sure
now it's expensive and slow. But it's going to get cheaper and quicker.

I'd recommend avoiding anything that would publicly link your face to your
identity online. Sextortion is a big thing at the moment; who knows what will
be next when facial recognition becomes available to the same scammers.

~~~
SN76477
I believe that a heavily survived society is doomed to insanity.

The constant pressure to look good, feel good, and make sure information
doesn't get out will become a massive mental health crisis.

We can already see the edges of this with Facebook envy and other similar
mental health issues on the rise.

~~~
jcims
>survived

typo?

~~~
no_one839083
Probably “surveilled”

~~~
SN76477
Yea, thanks... too late to fix it.

------
jedimastert
Question: do we know if police are using "find this face in a crowd" software
or "log all faces and compare them to all people" software? The former is the
same as what was happening before, only faster.

~~~
whateveracct
Sometimes, "what was happening before but faster" alone is what can cause
something to cross the line of legality (philosophically.)

~~~
jedimastert
> Sometimes, "what was happening before but faster" alone is what can cause
> something to cross the line of legality (philosophically.)

Not that I don't believe you, but I can't think of any examples off of the top
of my head. Can you give me one?

~~~
curryst
One that comes to mind is the Supreme Court enforced ban on creating a
firearms registry. Firearms dealers are required to retain sales records with
serial numbers for some amount of time (I don't have the number handy, it's
several years) so technically the data exists currently. However they have
ruled that it would be illegal to centralize that data for easier access.

Another is those automated speeding cameras. In several states they are unable
to write tickets without a police officer present. My understanding is that
this means the machine operates the same way and still writes tickets, the
officer just has to be there as a "witness" and play phone games or whatever
while the machine works. The end effect is the same, but the law requires an
officer to be a witness to the crime in order to generate tickets.

~~~
Animats
_One that comes to mind is the Supreme Court enforced ban on creating a
firearms registry._

No, that's not the Supreme Court. That's the Firearms Owner Protection Act of
1986.[1]

[1]
[https://en.wikipedia.org/wiki/Firearm_Owners_Protection_Act](https://en.wikipedia.org/wiki/Firearm_Owners_Protection_Act)

~~~
curryst
I stand corrected, thanks!

------
lasagnaphil
Data-driven classification systems have made automatic encoding of technical
images into symbolic form possible, and with great efficiency. This will bring
a fundamental change in society unlike anything we have experienced before.
People need to start looking more at the political and philosophical aspects
of images to understand what data-driven AI systems do and how they can be
used in nefarious ways.

If you want a more in-depth analysis of how classification systems applied to
people can have wild consequences, and why that can't be fixed by just
tweaking the algorithms or methodologies, Excavating AI is a great article on
it ([https://www.excavating.ai/](https://www.excavating.ai/)).

------
_iyig
I disagree with the premise that facial recognition technology is inherently
evil, and that there should be a blanket ban on its use by law enforcement.
For example:

Imagine a system where facial recognition technology supplements, but does not
replace, manual recognition. Isn’t it better to guide mugshot lookups with the
best available technology rather than wasting hundreds of hours on manual
searching? Without treating automatic results as final and authoritative,
can’t they still be used to drastically hasten manual look-ups?
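That supplement-not-replace workflow is easy to sketch. A toy illustration, assuming faces have already been reduced to embedding vectors by some face-recognition model; the random embeddings, database size, shortlist size, and mugshot index below are all made-up placeholders:

```python
import numpy as np

# Stand-in "mugshot database": 50,000 identities as unit-length 128-dim
# embedding vectors. A real system would produce these with a
# face-embedding model; random vectors are used here purely as filler.
rng = np.random.default_rng(1)
mugshots = rng.normal(size=(50_000, 128))
mugshots /= np.linalg.norm(mugshots, axis=1, keepdims=True)

def shortlist(probe, db, k=10):
    """Return indices of the k most similar mugshots, best match first.
    The human examiner still makes the actual identification."""
    probe = probe / np.linalg.norm(probe)
    sims = db @ probe                   # cosine similarity to every mugshot
    return np.argsort(sims)[::-1][:k]

# Probe image: a noisy view of mugshot #123. The system only has to put
# the right record in the shortlist, not declare an authoritative match.
probe = mugshots[123] + 0.05 * rng.normal(size=128)
candidates = shortlist(probe, mugshots)
```

The examiner then reviews ten candidates instead of fifty thousand; the speedup comes from narrowing the search, not from trusting the top hit.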

~~~
mjevans
When most people hear "mug shots", they think of photos taken during
incarceration: a database of photos from when someone has been convicted of an
(implied serious) crime and is being jailed for it. (Trivial crimes usually
carry a monetary and/or community-service sentence.)

The issue at hand is about those who should be presumed to be innocent. The
many who are in a crowd, the majority that are probably not at all related to
whatever search is happening.

I've read science fiction where the opposite extreme happens. Where there's an
AI that implicitly interfaces with everyone at all times, but it isn't a
surveillance nor advertising state. If anyone does actually do a crime there's
an actual sentient witness, no need to track down video, and that witness is
also able to get help on the way right away. Further there's no backlog of
unprocessed data, no file of all of the various normal day to day things that
really aren't important.

I could go for a future with benevolent partner AI(s) and no actual
surveillance state even though there are cameras and microphones everywhere. I
can also see some public spaces (schools, courtrooms, other places people are
required to mingle) having observation and storage for a limited (days/weeks)
time for manual review in the case of reported incidents.

However the power as it currently stands is much more likely to be used for
leveraged persecution and partial enforcement against specific people, and
classes of people (IE not rich), rather than for all the people.

~~~
_iyig
>However the power as it currently stands is much more likely to be used for
leveraged persecution and partial enforcement against specific people, and
classes of people (IE not rich), rather than for all the people.

I don't see how this is any more or less true of facial recognition than other
evidence-gathering technology.

>The issue at hand is about those who should be presumed to be innocent. The
many who are in a crowd, the majority that are probably not at all related to
whatever search is happening.

Maybe so, but the laws which are getting passed and proposed as a result of
all this journalism are much more broad:

[https://www.sfchronicle.com/bayarea/article/Oakland-bans-use-of-facial-recognition-14101253.php](https://www.sfchronicle.com/bayarea/article/Oakland-bans-use-of-facial-recognition-14101253.php)

[https://www.theverge.com/2019/8/19/20812032/bernie-sanders-facial-recognition-police-ban-surveillance-reform](https://www.theverge.com/2019/8/19/20812032/bernie-sanders-facial-recognition-police-ban-surveillance-reform)

------
pvaldes
And finally we've found a use for wasp stings: trolling this Sauron's-eye
system for fun and profit.

On the other hand, for some people addicted to lip implants, the system will
need perpetual, constant fine-tuning and upgrading (which will divert
resources just to adjust for eyelid position and nose shape). I don't see
plastic surgery being forbidden in the near future if we keep in mind the
number of rich clients who use it.

------
DanielBMarkham
With most all of these privacy and anonymity stories, we're told of the types
of data collected, then the type of group taking the data, then the reason the
data is collected. Finally, there's some promise of the strict guidelines on
how it is planned to be used.

The tech community needs to start calling bullshit on these kinds of stories.

Once data is collected, it's collected. It, for all intents and purposes,
exists forever. So it's impossible for the group taking the data to promise
what the data might or might not be used for. And that's assuming that the
group is the only one who ever owns the data. In a world of ultra-high speed
internet, that's almost preposterous. It's not _if_ , it's _when_ the data
gets out. Once it gets out, it's everywhere.

While I understand that traditional property fits into a format like this,
data does not. Trying to pretend that it does? In my opinion it's doing a
great disservice to the readers.

------
umvi
> Our privacy, our right to anonymity in public

Who says these are rights? I agree free speech is a right, but who (legally)
grants me the right to privacy and anonymity in public? Does me recognizing
someone in public violate their right to anonymity?

~~~
einpoklum
Privacy is a recognized legal right; anonymity in public isn't. But government
spying on people in public is forbidden in many, if not most, countries.

Specifically in the US, amendment IV to the constitution says:

"The right of the people to be secure in their persons, houses, papers, and
effects, against unreasonable searches and seizures, shall not be violated"

Key phrase: "Unreasonable searches".

~~~
jsgo
That's not my understanding of it. My understanding is that I can't be
frisked, have my house searched, or be forced to hand over documents or other
items for review without a very compelling reason.

A government worker watching me walk from my car to the grocery store, come
out and load groceries into the car, let out an expletive and walk back into
the grocery store to buy the bread that was forgotten, and then head back to
my car isn't something I'm protected against. Not sure the value in it, but
there's nothing preventing them from doing so.

Similarly, suppose someone made a public post on Facebook or some other social
media site threatening someone at the city council, facial recognition picks
them up at some government building, and the match links the face to a name,
which links back to their social media and that post. I don't think the
authorities would be in the wrong, constitutionally, for then searching them.
Does it feel weird? Sure, but everything there is public, and at that point it
creates the basis for a reasonable search, so searching them for weapons
and/or charging them with communicating a threat seems fair.

Now if we're talking about someone pre-emptively scouring a person's cloud
storage and finding something worth investigating, that'd violate it in my
book.

~~~
detaro
There's at least a reasonable argument that the scale plays a role. It could
be somewhat similar to
[https://en.wikipedia.org/wiki/United_States_v._Jones#Justice...](https://en.wikipedia.org/wiki/United_States_v._Jones#Justice_Alito)
\- the expectation of privacy is violated because digital tools allow a larger
breadth of surveillance that crosses a line.

------
Uhuhreally
This is probably flawed logic, but how are things today different from, say,
4000 years ago in a city where the ruler might employ local people known to
have a good memory for faces to provide their services?

~~~
uoaei
1. People were (and are) not able to remember more than maybe 5,000 faces.

2. Each person back then would know their data was being collected, and
exactly which kinds, because they were present and engaged when it happened.

3. We can sort of, kind of trust humans and sort of, kind of predict their
decisions because we're not _that_ different. We are nowhere near that stage
yet with computers. We also know that putting that sort of information into
people's hands, with the economy we have today, incentivizes misuse.

------
mike_hock
> Congress must declare a national moratorium on the use of face-recognition
> technology until legal restrictions limiting its use and scope can be
> developed. America’s future is closer to a Chinese-style surveillance state
> than we’d like to think.

Uh, yeah, whatever. I mean, it's not like I disagree or anything.

But it's not like surveillance is only just now becoming a problem, right? Go
and roll back the surveillance state revealed by Snowden that would have been
a wet dream of the USSR. Roll that back to before Bush II and put a moratorium
on that shit.

America's PRESENT. IS. A Chinese-style surveillance state. China is a
perpetual preview of what will follow in the so-called "free" world in 3-5
years.

~~~
umvi
> America's PRESENT. IS. A Chinese-style surveillance state.

Maybe in cities, but my rural town just barely got stop light cameras, so I
highly doubt this is true of rural America.

Plus, use of the surveillance here is more reactive than proactive. In China
it is proactive: drive around in a car with a Taiwanese flag bumper sticker
and the secret police will disappear you in the night. In America no one will
bother looking at the footage unless a bank robbery happened and your car was
the getaway car.

~~~
z3c0
I think the "no one is looking" argument is valid, but can be damaging in the
long run. The same line of reasoning was applied to the Snowden leaks by those
who wanted to spin them as a "nothingburger". The reason nobody is looking is
that it requires a lot of money to build the systems that can derive facts
from data while "no one is looking". As advances continue to be made in ML,
the visual data most people are currently taking for granted will become more
easily parsed systematically. A system that pre-processes and stores all the
recorded locations of a given list of faces is completely possible by today's
standards. A database of mugshots and a continuous stream of CCTV data is all
you'd really need to get started.
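A minimal sketch of the matching core of such a system, assuming faces in each CCTV frame have already been detected and converted to embedding vectors; the watchlist contents, similarity threshold, and frame data below are all invented for illustration:

```python
import numpy as np

# Stand-in watchlist: 10,000 known faces as unit-length 128-dim
# embeddings (a real pipeline would fill this from mugshot photos).
rng = np.random.default_rng(0)
watchlist = rng.normal(size=(10_000, 128))
watchlist /= np.linalg.norm(watchlist, axis=1, keepdims=True)

def match_faces(frame_embeddings, db, threshold=0.6):
    """Return (face_in_frame, watchlist_entry) index pairs whose cosine
    similarity exceeds the threshold, i.e. probable watchlist hits."""
    frame_embeddings = frame_embeddings / np.linalg.norm(
        frame_embeddings, axis=1, keepdims=True)
    sims = frame_embeddings @ db.T      # all-pairs cosine similarity
    return [(int(f), int(d)) for f, d in np.argwhere(sims > threshold)]

# One CCTV frame with 3 detected faces; face 1 is a noisy view of
# watchlist entry 42. Logging each frame's hits with a timestamp and
# camera ID is what builds up a location history per face.
frame = rng.normal(size=(3, 128))
frame[1] = watchlist[42] + 0.05 * rng.normal(size=128)
matches = match_faces(frame, watchlist)
```

The expensive parts are the detection and embedding models upstream and the storage downstream; the matching itself is just linear algebra, which is why cost is the main thing standing in the way.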

~~~
leggomylibro
"no one is looking" quickly becomes "no one _was_ looking" when an individual
or group becomes problematic in the eyes of an authority. Recording everything
allows a government's executive branch to selectively and successfully
prosecute more or less anyone at will, which defeats the purpose of having an
independent judiciary.

Ideally the legislative process would step in and curb the excessively
restrictive laws and surveillance, but it doesn't seem to have much incentive
to do so. Plus, legislators are subject to the same onerous regulations which
everyone breaks all of the time because "no one is looking".

------
raxxorrax
Just sign up for our newsletter and we'll tell you how you're tracked... I
can't really condemn this attempt.

