
The Era of Automatic Facial Recognition and Surveillance Is Here - dthal
http://www.forbes.com/sites/bruceschneier/2015/09/29/the-era-of-automatic-facial-recognition-and-surveillance-is-here/
======
hackuser
An HN contest: Provide an elevator pitch, aimed at completely non-technical
end users, that will make them aware of this issue and persuade them that it
is important to them. Vote up the best ones. [EDIT: The goal is to persuade
them about its dangers, not sell them something based on it]

That elevator pitch is key. I have extensive knowledge about this issue but
have a hard time coming up with one; I need to be prepared.

Change will only come through political pressure, and that requires public
awareness. I had lunch recently with a journalist from a prominent
publication, someone very well informed, and they told me that widespread
surveillance wasn't real, a sort of urban myth. Organizations like the EFF
have been unsuccessful and I think that's because they don't generate mass
public awareness. It's up to us to create it, and we need a pitch.

EDIT: [deleted]

~~~
macNchz
You've spent years saving up to finally buy a home. You're having a baby in a
few months, and you really need the space. You have the down payment ready,
your credit is great, you've ticked all the boxes on the application, but
you're turned down for the loan. Why?

Maybe you walk home through a bad neighborhood. Maybe you sometimes grab a
beer at a bit of a divey bar near your office. Maybe you've visited an old
friend who's been in some trouble recently. Whatever the case may be, your
face has been in a photo with some questionable folks, whether it's a security
camera or a stranger's Facebook photo. So the bank's computers noticed, and
flagged you as a risk. People who hang around questionable people aren't very
good at repaying their loans.

~~~
shabbyrobe
I've tried this one on lay people before. The problem I have is that they
usually can't get past their firm belief that these things are technologically
impossible, even after you explain to them that they are. "That all sounds
like science fiction, that can't happen in the real world."

Also it can fall down if you don't stick only to directly relatable details:
"but I don't walk through a bad neighbourhood, so this wouldn't affect me and
therefore I don't see how it could be a problem". The instant it ceases to
speak to their experience directly, you've lost them.

~~~
imron
> "That all sounds like science fiction, that can't happen in the real world."

Tell them that the mobile phone they have in their pocket also seemed like
science fiction 20-30 years ago. Now it is science fact. Facial recognition is
heading the same way.

If anything, given all the shoddy movies where they 'zoom in, enhance' photos,
the public should be primed to accept facial recognition quite easily.

~~~
duderific
Ha, when watching those shows, I am always the nerd telling my wife how it's
impossible to zoom in and enhance, because you can't create more information
than existed when the photo was taken. I'm not sure that she gets it.

~~~
meric
If you take a blurry photo of a license plate, it can be possible to enhance
it (e.g. by sharpening) to reveal the plate number. Just because your naked
eye can't extract that information from the blurry photo doesn't mean the
photo doesn't already encode the number precisely.
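
A rough sketch of that point in Python with numpy (the blur kernel, noise
level, and assumed SNR are all invented for illustration): when the blur is
approximately known, Wiener deconvolution recovers much of what the naked eye
can't see.

```python
import numpy as np

rng = np.random.default_rng(0)

# A sharp 1-D "scan line" across the characters of a plate.
original = np.zeros(64)
original[10:14] = 1.0
original[30:35] = 1.0
original[50:52] = 1.0

# Blur it with a known kernel (circularly, to keep the math simple)
# and add a little sensor noise.
kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0])
kernel /= kernel.sum()
kernel_padded = np.roll(np.pad(kernel, (0, original.size - kernel.size)), -2)
H = np.fft.fft(kernel_padded)
blurred = np.real(np.fft.ifft(np.fft.fft(original) * H))
blurred = blurred + rng.normal(0.0, 0.01, blurred.shape)

# Wiener deconvolution: invert the blur where signal dominates noise.
snr = 100.0  # assumed signal-to-noise ratio
G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
restored = np.real(np.fft.ifft(np.fft.fft(blurred) * G))

# The restored line is much closer to the original than the blurred one:
# the blur hid the information from the eye, but didn't destroy it.
err_blurred = np.abs(blurred - original).mean()
err_restored = np.abs(restored - original).mean()
```

The point of the sketch: the blurred array still carries enough information to
get far closer to the original than the eye ever could.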

~~~
yorak
There are also super-resolution methods that make the "enhance" feature
possible in practice. Information from the same objects in successive frames,
e.g. from a security camera feed, is composited into a single frame of higher
resolution than the originals. This can even be done for a single image (see
Glasner et al., "Super-Resolution from a Single Image," ICCV 2009).

If machine learning is brought into play it gets even more interesting. See
e.g. waifu2x, the anime image upscaler based on deep convolutional neural
networks. Basically it is a neural net that has been shown a lot of anime, and
it can therefore creatively(?) fill in the created pixels when upscaling.

I have no idea how advanced this kind of ML-boosted "enhance" technology can
get in the future, and it might even get a little scary: imagine an observer
with the experience of a million lives spent looking at blurry images in poor
lighting. It might get surprisingly good at that...
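
A minimal, idealized sketch of the multi-frame compositing idea in 1-D (the
shifts are known exactly and there is no blur or noise; real super-resolution
must estimate the shifts and undo blur as well):

```python
import numpy as np

# A fine 1-D "scene" the camera never captures at full resolution.
n_hi = 200
x = np.linspace(0.0, 4.0 * np.pi, n_hi)
scene = np.sin(x) + 0.5 * np.sin(3.0 * x)

# Four low-res frames, each sampling the scene at a different subpixel
# offset (as when the camera or subject shifts slightly between frames).
factor = 4
frames = [scene[shift::factor] for shift in range(factor)]

# Compositing: with the shifts known, interleave the frames back onto
# the fine grid.
restored = np.empty(n_hi)
for shift, frame in enumerate(frames):
    restored[shift::factor] = frame

# A single frame naively upsampled loses detail; the composite does not.
single_up = np.repeat(frames[0], factor)
err_single = np.abs(single_up - scene).mean()
err_multi = np.abs(restored - scene).mean()
```

In this toy setup the composite recovers the scene exactly, while any one
low-res frame cannot; that gap is what practical super-resolution exploits.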

------
forgottenpass
Also available on his blog, instead of Forbes' terrible website:

[https://www.schneier.com/blog/archives/2015/10/automatic_fac...](https://www.schneier.com/blog/archives/2015/10/automatic_face_.html)

------
ksenzee
As someone with prosopagnosia (neurological inability to recognize faces) I
find this incredibly frustrating. My everyday life would be immensely easier
with assistive tech that told me the names of everyone I see. And now that the
technology exists to develop such a useful device, what do we get instead?
Mass societal surveillance.

~~~
cantrevealname
That would be the killer app for Google Glass. Not just for your situation,
but for salesmen, customer service reps, security guards, politicians, or
anyone who needs to recognize lots of people, recall names, and bring up an
instant profile about the person standing in front of them.

(Granted, even this application has serious overtones of societal
surveillance.)

EDIT: Anyone care to speculate on why Google Glass doesn't do this? Surely the
processing capability is there.

~~~
ksenzee
Google Glass bans facial recognition apps[0]. I've tweeted them about getting
an exception for assistive tech but I've never pursued it any further[1].

[0]
[https://developers.google.com/glass/policies?hl=en](https://developers.google.com/glass/policies?hl=en)
[1]
[https://twitter.com/googleglass/status/463838878143492096](https://twitter.com/googleglass/status/463838878143492096)

~~~
15155
A great example of "you don't own your hardware," or "you own your hardware
with a hundred strings attached (probably to the software)"

RMS called this one.

------
iamcurious
I no longer know what to do about this problem. I'm convinced surveillance is
here to stay and that making it illegal will only serve to hide the
microphones. I'm still extremely grateful to those who fight for privacy; it's
just that I don't see how we can win.

Not knowing how to treat the cause, I'm at least trying to treat the symptoms.
We might not be able to stop them from seeing our secrets, but we might very
well stop them from making us feel ashamed and from blackmailing us.

~~~
hackuser
> surveillance is here to stay and that making it illegal will only serve to
> hide the microphones.

Certainly it's here to stay, but so are murder, theft, fraud, and many other
things. We want to maximize the leverage and tools people have to fight it.

~~~
cookiecaper
Unfortunately you can't put the genie back in the bottle. I don't think
there's a real way to fight this in the age of networks and cameras in every
pocket.

There are many good uses for the ability to instantly recognize and identify
anybody from a crowd, as well as bad ones.

Probably the easiest way to frustrate these things would also be the simplest:
only go out in public with substantial facial features obscured or altered.

The degree to which tools with diverse uses, some of which may be bad, should
be regulated is always a matter of serious controversy (cf. firearms). Whether
it's legal or not, this tech _will_ be used, so we need to get people to
accept that and learn to recognize and frustrate its usage, instead of trying
to deprive the populace of the tools that provide the function, which is
ultimately futile.

~~~
baobabaobab
[http://cvdazzle.com/](http://cvdazzle.com/)

I don't know how well obscuring facial features will really work. There might
just be too much information: partially obscured face, body dimensions,
clothing, location, time... I think it might be a losing battle.

~~~
anigbrowl
Who wants to do that every day? Also, how long before we can out-calculate it?
Right now we can already match things like gait patterns and multiple
biometrics (e.g. height, relative length of limbs) well enough for many
purposes using probabilistic systems (e.g. your metrics match 3 people, but
the other two are on different continents, so I'm 99% certain it's you).
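
A toy sketch of that narrowing logic (all the candidate data and thresholds
here are invented): each weak biometric prunes the pool, and location finishes
the job.

```python
# Invented candidate records: weak biometrics plus last known location.
candidates = [
    {"id": "A", "height_cm": 178, "gait_score": 0.91, "city": "NYC"},
    {"id": "B", "height_cm": 177, "gait_score": 0.88, "city": "Tokyo"},
    {"id": "C", "height_cm": 179, "gait_score": 0.90, "city": "Berlin"},
    {"id": "D", "height_cm": 150, "gait_score": 0.20, "city": "NYC"},
]

observation = {"height_cm": 178, "gait_score": 0.90, "city": "NYC"}

# Step 1: biometrics alone leave three plausible matches.
biometric = [
    c for c in candidates
    if abs(c["height_cm"] - observation["height_cm"]) <= 2
    and abs(c["gait_score"] - observation["gait_score"]) <= 0.05
]

# Step 2: location rules out the two who are on other continents.
final = [c for c in biometric if c["city"] == observation["city"]]
```

Individually none of these signals identifies anyone; combined, they do, which
is exactly the worry.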

------
msellout
Where's the line that separates generic image recognition from facial
recognition?

Forget corporations for a moment and think about your own rights as a
programmer. I want to tinker with image-similarity scores by downloading
random images from the internet and comparing with a photo I just took in a
public place. Oops, my photo just so happened to have a stranger's face that
matches a photo in my database and that filename contains the person's name.
Should that be illegal?

Consider the effects of a legal requirement to have an "opt-in" for facial
recognition software. That was what the "consumer advocates" thought should be
a minimum standard. Large corporations with trusted brands might be able to
run such an effort, but any startup certainly could not.

Regulating what software I can write is a slippery slope. Compare facial
recognition with encryption. If there's regulation for recognition software,
then regulation for encryption will be coming right after. The worst-case
scenario in my mind is if the federal government has a monopoly on any kind of
software. Our best defense against a government/corporate monopoly on
surveillance is not to ban it, but to democratize it.

~~~
PostOnce
What you can do in your own time is one thing, what you can do commercially is
another.

Encryption is something you do by yourself or with another consenting party,
facial recognition is something you do to someone else without their
permission and possibly use for nefarious purposes.

There's nothing wrong with outlawing certain activities from the commercial
sphere, or any sphere. It's not an assault on your freedom; it's protection of
my privacy. It's illegal to stalk and harass people in person or over the
phone. It's illegal to serve people undercooked food or raw milk or whatever.
It's illegal to breach HIPAA.

Are you next going to say it's terrible that you can't access my medical
records for your startup?

How about security cameras and genitalia recognition in the bathrooms at
restaurants, is that cool too?

~~~
msellout
It's illegal to stalk and harass; it's not illegal to make phone calls. The
issue is not facial recognition, it's the misuse. I'm all for gun control, not
a gun ban.

Cameras in the bathroom? Fine. Whatever. If that's the new normal, then I
guess I'll get over it. Are you saying that will now be legalized? I'm pretty
sure we don't need any new laws to prevent cameras in changing rooms. The
"scary" scenarios that have been described in the other threads are already
illegal for other reasons -- harassment, redlining (racial profiling), spam,
etc.

~~~
PostOnce
You'll get over cameras in the bathroom?

You should be able to collect any data you want, and use it however you want?

Why not start a business face-recognizing people who show up at battered
women's shelters and selling access online for $50?

Or how about completely legally standing on a public street, making notes of
when people are and are not home, and selling access to that to probable
burglars? Should that also be legal?

The use of information can and should be regulated. Everything from insider
trading to medical privacy.

~~~
msellout
People's opinions are remarkably malleable. Read some accounts of how soldiers
change during and after fighting. Watch how the audience acts at a rock show.
Check out the people on a nude beach vs the sidewalk just off the beach.
Normal and acceptable is entirely situational. Walking in the street was
normal. After the ad campaigns of the auto industry, we call it jaywalking.

I'm all for regulating (mis)use of information, but not the collection of
information.

------
jpatokal
So _are_ we actually there yet? I was under the impression that, while face
recognition has evolved to the point that matching against a limited group of
people (say, your Facebook friends) is feasible, we're still a long way off
from being able to put a face-scanner in a crowd in NYC and have it spot the 1
suspect among 10 million without a totally ridiculous rate of false positives.

~~~
rm999
This should be a top comment: this kind of tracking is not possible at scale
and probably won't be for a long time. Identifying a person in a group of
100s/1000s has been a solved problem for decades. Consistently identifying a
single person from a database of 100Ms/billions of people, who are constantly
gaining weight, growing facial hair, etc., is impossible with today's
technology; false positives become a dramatically bigger problem as you
increase the candidate space.
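
A back-of-the-envelope sketch of why false positives scale so badly with the
candidate space (the per-comparison false match rate here is hypothetical):

```python
# Hypothetical per-comparison false match rate for a very good matcher.
false_match_rate = 1e-6

results = {}
for db_size in (1_000, 1_000_000, 300_000_000):
    # Expected number of innocent people who match a single probe face.
    expected_false = db_size * false_match_rate
    # Probability that at least one wrong person in the database matches.
    p_any_false = 1.0 - (1.0 - false_match_rate) ** db_size
    results[db_size] = (expected_false, p_any_false)
```

With a thousand candidates a one-in-a-million matcher almost never errs; with
a country-sized database, every single probe face produces hundreds of wrong
matches, so the false positive is the near-certain outcome.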

Perhaps one day increased camera resolutions will allow trackers to capture
something humans can't, like retinal patterns. That would allow cameras to
basically "fingerprint" people, which will be super creepy.

~~~
IanCal
I think that actually as you increase surveillance the problem gets simpler.
If you're always in view of a camera then I can simply track your motion and I
only need to recognise you once. Given that I've seen you in your car, I can
already narrow things down significantly.

> Consistently identifying a single person from a database of 100Ms/billions
> of people

I don't need to, there aren't billions of people who are likely to be at the
closest ATM to your house.

> constantly gaining weight, growing facial hair, etc

But I see you several times _every single day_.

> false positives will become an exponentially bigger problem as you increase
> the candidate space.

To be honest, I think this is the thing most likely to scare people: that you
won't be allowed to fly because a computer made a mistake and everyone trusts
it or can't override it. That the police get called when you go to pick up
your son because you're the same height and rough build as a child molester
who was released from prison yesterday.

------
cvwright
I agree with the general idea put forth by the article. This is an important
issue, and the future could bring a lot of bad things if we (as a society) are
not careful.

Schneier makes it sound pretty scary. For a general audience, this is probably
good. It takes a lot to make your average Joe start paying attention to
something new.

But on a technical level, I can't help wondering, how realistic is the
scenario he describes? Or, maybe the better question is, how _immediate_ is
the threat of universal face recognition? If we know what time horizon we're
working with, maybe we can be better prepared.

I think Schneier makes some substantial logical leaps in the article that make
it sound like automatic, ubiquitous facial recognition is a more urgent issue
than it really is.

For example, he goes from this

> Today in the US there's a massive but invisible industry that records the
> movements of cars around the country. Cameras mounted on cars and tow trucks
> capture license plates along with date/time/location information ...

to this

> This could easily happen with face recognition.

with no supporting evidence for how this feat of engineering will be
accomplished. Recognizing letters and numbers on a license plate is one thing.
Recognizing faces at random angles with random occlusions, hats, eyeglasses,
makeup, etc., is quite another thing.

I found a couple of papers from CMU and UCF, from a couple of years ago, that
look at the problem of recognizing people in photos at "web scale" [1]. The
recall/precision rates of the algorithms they tested are pretty decent, but
they're still pretty far from the dystopian future described in the article.

[1]
[http://enriquegortiz.com/wordpress/enriquegortiz/research/face-recognition/webscale-face-recognition/](http://enriquegortiz.com/wordpress/enriquegortiz/research/face-recognition/webscale-face-recognition/)

------
joe_the_user
I'm not sure what "good as a person" means in the context of large scale
facial recognition.

I know software is now scoring as well as people on various standard AI tests.
But most people couldn't recognize a million other people correctly. The
situation is kind of odd. It's like engineering a car to drive automatically
in a variety of controlled situations that might represent 90% of normal
driving: it's still nowhere near ready for real roads.

The supposed advantages are perverse: face recognition with a 1% false
positive rate can translate one caught fugitive into 10 or 100 false arrests
of random people.
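
The arithmetic behind that claim, sketched with invented but plausible
numbers: a 1% false positive rate applied to a crowd means the flags are
almost all innocent people.

```python
crowd = 10_000   # faces scanned in a day (invented)
fugitives = 1    # actual targets present
fpr = 0.01       # false positive rate per innocent face (the 1% above)
tpr = 0.99       # chance the real fugitive is flagged

false_alarms = (crowd - fugitives) * fpr   # innocent people flagged
true_hits = fugitives * tpr
# Of all flags raised, the fraction that are actually the fugitive:
precision = true_hits / (true_hits + false_alarms)
```

With these numbers roughly a hundred innocent people get flagged for every
real match, so only about 1% of alerts point at the actual fugitive.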

A store clerk knowing your name when you walk into the store is mostly creepy
and wouldn't help sales; the stores that had their clerks say goodbye by name
to credit card holders a few years back have stopped.

So, benefits? Changes? The article is kind of light on the whole subject.

~~~
hackuser
> store clerk knowing your name when you walk in the store is mostly creepy
> and wouldn't help sales

Saying your name might be creepy (but maybe people will become accustomed to
it). Knowing your identity I think would certainly help sales, the same way it
does when you shop online.

~~~
joe_the_user
_Knowing your identity I think would certainly help sales, the same way it
does when you shop online._

How?

I suppose you could have the store change its stock based on who it knows has
visited. But that sounds cumbersome and speculative. The store might find
itself removing the things people like to buy _from that store_ and stocking
things people prefer to buy from other stores, and so wind up selling less.

Kind of like when things I've searched for on Amazon reappear as Facebook ads.
I don't click through and buy because I've already decided whether or not to
get them.

~~~
hackuser
I'm surprised to find questions like this on HN. These are fundamental
concepts that drive SV; while I don't expect the world to understand them, I
expect it here.

------
on_
There was a great discussion about this recently on HN, which I won't be able
to find, after an article about a zebra/leopard patterned couch. Thwarting an
image ID algorithm is not difficult, and the researcher came on and spoke
about the problems in the space, similar to this xkcd comic[0]. It is a super
complex problem to solve. How far away are we? It is like the Turing test:
every time someone gets close, the term is redefined. It is certainly coming,
maybe with reasonable models publicly acknowledged in ~5 years. What can
someone really do to prevent it? Short of making an account every few months
with a lot of your data but different pictures, what American doesn't have a
photo that has been published on the internet? Even if you don't have
Facebook, someone you know does. The tech will take a while to develop and
refine, and longer to deploy at scale, but I can't think of a scalable
solution to this.

[0][http://xkcd.com/1425/](http://xkcd.com/1425/)

~~~
croddin
Here is the discussion for "Suddenly, a leopard print sofa appears":
[https://news.ycombinator.com/item?id=9749660](https://news.ycombinator.com/item?id=9749660)

------
gkfasdfasdf
Perhaps the niqab (face-veil) will see a rise in popularity...

~~~
tormeh
Nah, it turns out that modern AI techniques are better than the brain at
certain things. In this instance, the brain has a huge amount of volume
dedicated to recognizing faces. AI doesn't have the same preference for faces;
we can make AI recognize people using any feature, including seemingly
unlikely ones such as gait. A driverless car with no windows would probably do
it, but then again the car itself will probably be bugged a la Samsung TVs.

------
devindotcom
There was a comic recently that had a main character who had little emitters
constantly blasting his face with IR radiation, so it would just turn up as an
overexposed blob on surveillance cameras. Of course all they need is an IR
filter to combat this, but the idea is fun.

~~~
jaxb
An episode of Almost Human had 'anti-camera spray' that did this, as well.
Maybe the comic was based on that?

------
electricblue
This article is a bit alarmist. I agree that government entities using this
technology as a tool of oppression is definitely bad and definitely going to
happen somewhere, but I'm not worried about the shop clerk knowing my
political affiliation, because there is zero profit motive there. You're
talking about a truly massive database of customer info joined with facial
info obtained... how? Is the DMV gonna start giving away my license to Home
Depot? Is Facebook going to torpedo their business by selling this info? I
doubt it! Second, why would they want it? To creep me out when I walk into
their store?

~~~
hackuser
> Is the DMV gonna start giving away my license to Home Depot?

IIRC this already happens and businesses can buy DMV databases.

> Is facebook going to torpedo their business by selling this info?

Many businesses collect and sell this information. Where do all these online
businesses, who charge users nothing, get their revenue? [EDIT: More simply
said, that info arguably is the basis of the Internet economy]

> why would they want it?

To discriminate. They may offer different products or prices. They may
discriminate because you are gay or transgender, because of your ethnic
background, or because of your religious (e.g., Muslims) or political
affiliation (e.g., communist, right-wing extremist, etc.). This can happen in
more serious situations too, such as when dealing with government, law
enforcement, or in employment situations.

Certainly such discrimination has been widespread for centuries and is today.
Why would it be any different when they have new tools?

~~~
electricblue
Seems like a massive amount of effort just to be a dick. They get all the info
they could ever want from customer loyalty cards; this is a redundant and
frankly useless waste of money.

~~~
gknoy
I think it's unwise to underestimate your fellow humans' propensity to "be a
dick", as you put it, even if they don't gain anything more than the
satisfaction of treating others poorly.

Facial recognition + a database allows POWERFUL and fine grained
discrimination!

Imagine being able to know ${Things} about people -- via analytics / The
Internet -- whenever any prospective customer comes into your store.

You could refuse to do business with gay people, divorcees, adulterers,
married people, transgender people, politicians, members of the opposing
political party or religion, or even police officers with X number of
complaints. You could exclude people from the wrong side of town, or who
didn't go to the right schools, or whose parents were (or who themselves are)
illegal immigrants.

Many of these things are Probably Legal (some are likely not), but most of
them strike me as unjust treatment of others. The technology allows
frightening things, and there are MANY people who would do them right now if
given the ability. (Religious and political boundaries are one place where
people are quite willing to do so.)

I only touched on business owners. Now imagine that roving bands of jerks (or
criminals) got their hands on the same technology or data, and could tell who
was a stranger in town, or was more vulnerable than others, or fit their
desired target criteria. It's pretty scary, and I'm sure that within ten years
we'll hear stories of such abuse happening.

~~~
AnimalMuppet
> I think it's unwise to underestimate your fellow humans' propensity to "be a
> dick"...

That tendency exists quite strongly in humans, yes. Unfortunate, but true. On
the other hand, businesses also have the tendency to want to make money. If
I've got a business, and I make it hard for group X to buy stuff from me,
they're probably going to try to find someone else who wants more strongly to
make money and/or less strongly to be a jerk. And that business will get their
money, and I won't.

And there may be businesses that are okay with that. But it ought to be
somewhat of a self-defeating behavior.

Note well: I am not claiming that this behavior is moral or legal.

The criminal aspects mentioned in your last paragraph are quite frightening,
and more so because they are also quite probable...

~~~
IanCal
> That tendency exists quite strongly in humans, yes. Unfortunate, but true.
> On the other hand, businesses also have the tendency to want to make money.
> If I've got a business, and I make it hard for group X to buy stuff from me,
> they're probably going to try to find someone else who wants more strongly
> to make money and/or less strongly to be a jerk. And that business will get
> their money, and I won't.

Which is true, although there are a few rubs:

1\. It's not so easy to start a business, and if the group you exclude is
small there's very little incentive to open a new business just to cater to a
small proportion of people. Depends on the business, obviously.

2\. You're ignoring any influence other customers have. If 90% of your
customers threaten to boycott you because you (bake cakes for gay weddings ||
print leaflets for republicans) it'd be good business sense to not serve that
small percentage of your client base.

With 2, consider a "realistic" situation. In the future, all sex offenders
will have their photos published on some public crime database. Cue outrage
that sex offenders are being served in the local bar (or choose the
establishment of your choice), and they are then banned. Are people going to
open up bars with the motto "sex offenders welcome here!"? And to make it
worse, what happens when the system thinks you're a sex offender because of a
false positive?

I'm always really concerned when people argue that the free market will remove
discrimination, and perhaps they're right in the long term but I don't think
it's unreasonable to say that there was a fair amount of time when people in
the US were discriminated against because of the colour of their skin (not to
get into a discussion about _now_ as it's not too relevant).

------
pms
It is a scary vision. However, the scary part seems to come not so much from
the increasing availability of data as from the increased likelihood of
discriminatory uses of such data. Enforcing non-discriminatory use of data
sounds like a hard problem to tackle, especially if we realize how diverse the
space of possible misuses is (e.g., in law enforcement, business, and
politics) and that this problem affects both humans and computer systems. As
hard as this problem sounds, the article provides great motivation for
research that defines computationally what "discriminatory" means, as well as
for research aiming to design systems (both human and computer) that are
privacy-protecting and non-discriminatory.

------
gglitch
My default interpretive framework when I read something like this is, of
course, nearly hysterical paranoia and sociopolitical despair. And, speaking
as objectively as possible, I think those are not invalid responses. However,
for the moment, I'm instead trying to imagine the sociology/psychology of the
first generation to grow up native to this environment, and I admit that it's
fascinating to think about. What will the teenagers be like who grow up
expecting to be identified and known everywhere they go, by friend and foe
alike? How will it affect manners, speech, courtship rituals, expectations for
entertainment/education/employment? Etc. So strange.

------
bitL
Can anyone suggest brands for high-quality fake mustaches and asymmetric
sunglasses? ;-)

------
vorg
When surveillance systems are put in place and the collected information is
stored for later use, irrespective of whether it's a personal, business, or
government system, the people who get themselves in control of them will use
the surveillance and/or information to steal from, preach to, discredit,
and/or wear down their targets, all of which ultimately amounts to benefiting
in some way at the target's expense. There's no such thing as unused
surveillance capacity or unused collected information.

------
domrdy
Watching Minority Report the other day reminded me of the automated retina
scans in the movie. It really creeped me out when Tom Cruise walks through a
mall, gets picked up by the scanners, and they start shooting targeted
advertising at him.

~~~
finance-geek
Tangent: I always chuckle at how prescient Minority Report was on some
aspects, yet they didn't envision network storage or wireless communication.
Do you remember how they would transfer small glass disks all the time... how
burdensome. LOL.

------
philbarr
Walk into a store, and the salesclerks will know your name.

\- they already do once you scan your store card

The store’s cameras and computers will have figured out your identity,

\- store card

and looked you up in both their store database and a commercial marketing
database they’ve subscribed to.

\- store card again.

They’ll know your name,

\- store card

salary,

\- if you entered it and if you didn't lie when you filled out your store card
form

interests,

\- if you entered them and you didn't lie when you filled out your store card
form

what sort of sales pitches you’re most vulnerable to,

\- if the "deep data" / AI is actually correct

and how profitable a customer you are.

\- store card

Maybe they’ll have read a profile based on your tweets and know what sort of
mood you’re in.

\- MAYBE. assuming the AI is anywhere near correct.

Maybe they’ll know your political affiliation or sexual identity,

\- MAYBE, probably wrong.

both predictable by your social media activity.

\- PROVE THIS.

And they’re going to engage with you accordingly, perhaps by making sure
you’re well taken care of or possibly by trying to make you so uncomfortable
that you’ll leave.

~~~
swsieber
Considering that Target can tell whether you're pregnant before you do, based
on your purchases, I'd assume that shopping habits encode your preferences a
lot more strongly than one would like to think.

More info on that weird pregnancy story:
[http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html?pagewanted=all&_r=1](http://www.nytimes.com/2012/02/19/magazine/shopping-habits.html?pagewanted=all&_r=1)

~~~
vonmoltke
They can't tell "before you do". They can tell before you tell people, but the
woman needs to know she's pregnant before her habits change.

~~~
IanCal
That's definitely true in this story, but is this general statement true?

> the woman needs to know she's pregnant before her habits change.

There are always stories of people who don't realise they're pregnant, do
their habits change in a reasonably identifiable way?

This wasn't particularly relevant to the point, I just wondered and thought it
was an interesting question.

------
caskance
You show your face automatically everywhere you go. Expecting other people not
to look is folly.

It's no different from the idiocy around fingerprints.

------
shostack
How poignant given the end of the 3rd episode of Heroes Reborn that I just saw
last night (don't worry, I won't spoil its awesomeness).

