
Banning facial recognition is missing the point - acmegeek
https://www.schneier.com/essays/archives/2020/01/were_banning_facial_.html
======
atoav
While he is right, I think Bruce is also missing the point here himself: he
states that this law is the wrong way to fight surveillance, but that is not
the stated goal of the law.

The goal of the law is to prevent the development of a technical reality in
Europe to which a jurisdiction can only passively react. The technical space
moves so fast at times that reality has been made before the law can even
start to think about what is okay and what isn't.

This time they wanted to say: "Yeah, it is a shiny new thing that would
definitely come if the law stayed as it was, but it is such a hairy ball of
mud that we ban it till we figure out what is allowed." Or phrased
differently: it is so obvious to them that this is prone to abuse that they
ban it first and then figure out how to deal with it in an adequate fashion.

So Bruce's idea that the goal was to fight surveillance is a tad too
optimistic in my eyes.

~~~
fauigerzigerk
_> The technical space moves so fast at times that reality has been made
before the law can even start to think about what is okay and what isn't._

In general I think this is exactly how it should work. You cannot stop all
technological developments early on and make rules for imagined problems that
may or may not arise.

Imagine it's 1995. The Web gets banned for five years so that a government
committee can meet to figure out what could possibly go wrong with that
technology.

There are always groups that stand to lose when new technologies are
introduced. And these groups are inevitably very powerful at the beginning of
any new development. You can be pretty sure that nothing would go ahead
without great resistance.

That said, we have to find a balance. Of course there are technologies that
are so dangerous that they must be regulated fairly early on. Making some
rules for face recognition and similar technologies seems entirely sensible to
me.

But as a general principle I think problems should be solved when they arise.
Solving imagined problems before they arise is impossible and prone to misuse
of power and influence.

~~~
TeMPOraL
> _Imagine it's 1995. The Web gets banned for five years so that a government
> committee can meet to figure out what could possibly go wrong with that
> technology._

Fortunately we already know what can go wrong with facial recognition
technology. We have China, and we have American and Israeli and various Asian
marketing startups inventing abuse after abuse after abuse. So there is
already a test bed in place; it doesn't have to be the entire world.

As a general principle, it may not be a bad idea in itself. Let one part of
the world play with the new toys while the other part observes; after that,
either the tech will be exported to the holdouts (the control group, if you
like), or the regulations will be exported to the experimenters.

~~~
_carl_jung
But the context in those countries is too different from ours to come to any
meaningful conclusion about what should be allowed "here" (I'm assuming UK +
US mainly).

~~~
varjag
The vector of the Anglo world isn't too promising right now either.

~~~
_carl_jung
I don't remember stating otherwise.

------
jammygit
> These efforts are well intentioned, but facial recognition bans are the
> wrong way to fight against modern surveillance. Focusing on one particular
> identification method misconstrues the nature of the surveillance society
> we're in the process of building. Ubiquitous mass surveillance is
> increasingly the norm. In countries like China, a surveillance
> infrastructure is being built by the government for social control. In
> countries like the United States, it's being built by corporations in order
> to influence our buying behavior, and is incidentally used by the
> government.

He’s right, but we should take every winning battle we can, no? Even a few
notable victories could help to change public opinion about whether these
things are inevitable or not.

~~~
BLKNSLVR
> but we should take every winning battle we can, no?

It's treating the symptoms, not the cause, though, and this little victory
will likely serve to dampen the resolve to continue fighting against the root
cause: increased surveillance, invasion of privacy, and the overall devaluing
of privacy.

Facial recognition may be a battle that gets won, but it could be a key moment
that results in the war on ubiquitous surveillance being lost.

Additionally, this is only a 5-year victory. Must we repeat it again every
five years? At least the victory in the war for encryption lasted 20-odd
years.

~~~
Barrin92
How do you devour a whale? One bite at a time. I don't think there is any
purpose to getting caught up in theoretical discussions about privacy. Privacy
is won by extending the real rights of people piece by piece and sustaining
that effort for a long time. There is no one-time, root-cause fix that holds
forever in a democracy.

>Must we repeat it again every five years?

Yes, we must. We live in a society where people negotiate over and over how
they want to live together, which means that if we advocate for something we
have to make our case again and again. That is a good thing. In Mie
Prefecture, Japan, the citizens tear down and rebuild a shrine every 20 years
because it is the only way to preserve the knowledge of exactly how it was
built.

~~~
codegladiator
> how do you devour a whale? One bite at a time.

Missed a step. You gotta kill it first.

~~~
_frkl
Not really, it'll die eventually if you've taken enough bites. Might be a bit
more difficult to take the bites if you didn't kill it beforehand, but still
...

------
randomsearch
I see his point, but practical face recognition is a lot easier to implement
and a lot harder to defend against than all the other types of recognition.
You could, for example, not carry a phone, or turn it off. But your face is
expected to be on show (obvious exceptions aside).

So no, it’s not missing the point. It’s a different point. He’s right, but
action against face recognition is also worthwhile.

~~~
jammygit
In Canada, it’s now illegal to hide your face during a protest (a gift from
the Harper years)

~~~
schlowmo
Unfortunately this has already been illegal in some other Western countries
for quite some time. In Germany, for example, since 1985 (one year late,
though). It's often used as legitimation when the police use violence against
otherwise peaceful protests.

In the aftermath of the G20 summit in Hamburg there were even moves to
tighten the law further, while at the same time the police were using illegal
face recognition software.

~~~
petre
What about fake beards and moustaches? I've heard they're the norm in Korea.
Imagine thousands of Santa Clauses or Paul von Hindenburgs protesting.

~~~
raxxorrax
Illegally deployed surveillance technology warrants illegal masks, in my
opinion. But you should also hide your eyes, ears and nose to the highest
degree possible. Even then, advanced image processing will probably be able to
identify you if you are already on record in some form.

------
dmje
Everyone on HN is no doubt sitting there nodding along sagely but there’s also
a pretty good bet that 90% of us (nerdy, apparently in-the-know tech types)
have done very little to think about cutting down on smartphone use. It’s the
single biggest surveillance vector (a device that you carry everywhere that
knows where you are and reports your habits into a data store we know pretty
much nothing about!), but we all now assume - hilariously, in my opinion -
that we “couldn’t live without our phones”.

Personally I think the battle is totally lost until people start actually
thinking about the negative impact of these devices, both from a surveillance
POV and a wellbeing one.

~~~
fsflover
Once Librem 5 is ready, this problem will be solved for many nerdy people
here. It's just the lack of choice now.

~~~
clay10
What about the Librem 5 would make you less trackable?

~~~
fsflover
At least the lack of built-in tracking. In addition, the (theoretical)
possibility of installing any anti-tracking apps available for GNU/Linux.

------
xupybd
My fear of facial recognition is not just governments, it's criminals. All
they have to do is crawl my LinkedIn or Facebook and bam, they have my name
next to my face.

Worse they know where I work or where I live.

This opens up a world of problems.

Say someone sees a stranger in a situation that the stranger may find
embarrassing. Embarrassing enough that they could be extorted. They open their
phone and find out who the person is. They can now make contact and threaten
to contact the person's employer or spouse with images of them in this
embarrassing situation.

You now have a real life avenue for this sort of thing
[https://en.wikipedia.org/wiki/Sextortion#Webcam_blackmail](https://en.wikipedia.org/wiki/Sextortion#Webcam_blackmail).

I know the greater problem is that we now share our details online; maybe no
LinkedIn and no Facebook is smarter.

------
speedplane
We shouldn't ban any technology. However, we should ban a technology when it
is used for _____. The ban should cover the use, not the technology itself.
It's fine to make building a nuclear bomb illegal, but studying radioactive
elements should not be.

Similarly, the technologies this article describes ("identification,
correlation and discrimination") should not be outright banned, but maybe they
should be when their use conflicts with other important values (e.g., due
process in criminal prosecution, privacy rights).

Recent targeting of facial recognition does not "miss the point", the reason
it has become an issue is because it has started being used in the criminal
justice system. No one really gets upset when it's just used to tag your
photos.

I know developers love abstractions, but we should not try to build
abstractions of technology and regulate those. To borrow from the article,
identifying a person based on their "gait" is not nearly as serious an issue
as facial recognition, even if abstractly they may seem similar to a
developer. Instead, focus on the direct problem at hand, regulate it, and only
after the passage of time when commonalities become clear, build an
abstraction.

~~~
mnl
Why not? Do people trust the potential uses of that technology? If they/we
don't, and we all live in working democracies, we have the right to ban it if
we want.

I don't trust anybody having this technology, and the same goes for having
samples of my DNA tagged with my name. BTW, I have no problem with my
country's compulsory ID; I'm not showing it all the time and it's kind of
reasonable when I have to. But having agents around me tracking what I do or
don't do 24/7 isn't any kind of future I want to live in. And that's what this
technology has been developed for; it's not a science project. You're going to
be classified and put in behavioural bins, your scores will be sold, and
you'll end up living in a corporate version of China. The fact that it won't
be your state's agencies doing it doesn't make it right.

------
singron
Also see the discussion 4 days ago from the same essay published in the
nytimes:
[https://news.ycombinator.com/item?id=22098021](https://news.ycombinator.com/item?id=22098021)

------
cryptoz
Schneier is missing the point.

> The whole purpose of this process is for companies — and governments — to
> treat individuals differently.

This isn't true. It's about population control, not treating individuals
differently. He has an extremely close-minded view of the future here. The
point of this isn't to treat people differently; it's to contain the Overton
window and basically thought-police entire populations. It has nothing to do
with the individual being treated differently; that is merely the
lipstick-on-a-pig moment where it _happens_ but isn't really the core problem.

> The point is that it doesn't matter which technology is used to identify
> people.

It sure does matter! Bugs in the code and hacks of the system itself are all
'implementation details' of the 'technology' used to do the tracking.
Techniques that are especially susceptible to both of these make a
surveillance technology dramatically more dangerous.

The details _do_ matter, because they add up to a larger, unknown future.
Schneier has a bizarrely close-minded view of the dangers of facial
recognition.

I haven't heard any of the proponents of the ban say that it is the only
surveillance that is happening or the only kind that should be banned, yet
that seems to be a central tenet of his point.

Some progress is better than no progress, especially in this case where things
move quickly. I do not think we are at risk of a population thinking that its
problems would be solved if we could just ban this one thing. It's a straw-man
argument that I haven't even heard elsewhere.

~~~
gwbas1c
> This isn't true. It's about population control

Uhh, why do you think that? So far, I only see evidence that mass surveillance
is to catch criminals and for marketing.

"1984" was a work of fiction, and it seems more a warning about perpetual
war, totalitarianism, and the communist concept of a "cult of personality"
than just surveillance.

~~~
cryptoz
I don't see how 1984 is relevant here. The surveillance we have today far
surpasses the simple ideas in that book.

> Uhh, why do you think that?

Because the people building the facial recognition pretty much say so?
"Organize the world's information and make it universally accessible" does not
sound like marketing opportunities or criminal-catching to me! It sounds like
a desire for total corporate control over the physical and digital world.

Not to mention, it is the logical end of using this technology; "power
corrupts, absolute power corrupts absolutely".

Having such a powerful and omnipresent surveillance system in place enables
totalitarian control over populations. This happens through chilling effects,
through terror, through the mental boxing-in of populations ceding their free
will to the government. Knowing that your own face is tracked everywhere you
go, sold off to anyone, and owned by your own government or other governments
is _dehumanizing_. Do this for decades or centuries and you will have a very
dehumanized and defeated population.

> So far, I only see evidence that mass surveillance is to catch criminals and
> for marketing.

I don't see any evidence like you are describing. I feel like we're living in
different worlds.

------
tomlockwood
I think the scariest thing is that in a world of unlimited data retention - a
future powerful bad actor will have access to all our past behaviour - whether
that's liking the wrong political party, or visiting a gay bar.

------
barnabee
He's right to an extent, but face recognition is instant, silent, works from a
distance, allows near perfect identification of a specific individual in many
cases, and is _particularly onerous on the individual to evade_ compared to
other forms of tracking.

That makes it entirely reasonable to single out and ban it, whilst also
thinking about and pushing for further curbs on surveillance.

~~~
3ot
Is it, though? I haven't read about any deployed solution which would fit
"near perfect" identification. I'd be thankful if you could point me to some
"working" solutions.

------
anigbrowl
Strange that Schneier omits the new CA privacy law, which looks to have teeth.
That said, no law will be sufficient unless it's binding on the state as well
as private actors.

------
jillesvangurp
If you accept that ubiquitous surveillance, for which this is one of several
technical enablers, is technically feasible, is being implemented by multiple
parties, and is basically happening, the only logical conclusion is that there
is going to be way more of this stuff in the coming years and decades. This
will become normal, whether we like it or not.

IMHO trying to stop this or slow it down is an exercise in futility. It may
postpone the outcome for a short time, but the outcome is inevitably going to
be multiple parties tracking our every move, either openly or covertly.

I would like to emphasize that it's not just going to be a handful of parties
doing this. This stuff is rapidly becoming a commodity. Just because you are
in the US or Germany (in my case) does not mean Russia, North Korea, Iran,
China, etc. are not tracking you (in addition to your 'friendly' local secret
service). Assume the worst; you are probably already on file in multiple
countries in some form. Also, who says it's just going to be nation states?
Several big corporations now exist that basically have a bigger valuation than
the GDP of most countries.

The fact that the parties that are going to do the tracking are mutually
hostile (or at least not very friendly), also represents an opportunity:
they'll be watching each other. Effectively nobody is excluded from being
under surveillance; including those doing the surveillance. That means anyone
breaking laws has to worry about being observed doing so and has to assume
that he/she is going to be found out in case something inappropriate happens.
IMHO this is a good thing, and effectively the only defense we'll have
against this being abused, as well as the only way of enforcing any privacy
legislation.

~~~
philpem
Mutually assured destruction?

------
zkid18
Meanwhile in China, local governments even install surveillance cameras on
top of a 3,900-metre mountain:
[https://twitter.com/kidrulit/status/1216197264202293253?s=21](https://twitter.com/kidrulit/status/1216197264202293253?s=21)

Apart from the ethical problems, cameras also cause urban planning and
aesthetic problems.

------
squarefoot
A few years from now, cameras and vision software will be advanced enough to
recognize a person just by analyzing the small differences in the colors of
their clothes, how they walk, or by mapping micro-scratches on their car,
without any need to take pictures of either the plate or the face. I'm also
pretty sure we could already build a model of a walking person or animal just
by having it walk or run on a mat equipped with weight sensors, so that, after
due training, recording when someone walks in or out of a place would be
doable without cameras.

The ban should be on the final purpose, that is, pervasive generic
surveillance. Otherwise it keeps being a moving target in which the most
powerful party is constantly one or more steps ahead.
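The "ban the purpose, not the sensor" point can be made concrete: whatever the
signal (a face embedding, gait timings from a weight-sensor mat, a
clothing-color histogram), identification reduces to the same nearest-neighbor
lookup against an enrolled database. A minimal sketch, with purely made-up
names and feature vectors:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, enrolled, threshold=0.95):
    """Return the best-matching enrolled identity, or None below threshold."""
    best_name, best_score = None, -1.0
    for name, signature in enrolled.items():
        score = cosine_similarity(probe, signature)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None

# Hypothetical enrolled "signatures": these could be face embeddings,
# gait timings from a pressure mat, or clothing-color histograms --
# the matching step is identical either way.
enrolled = {
    "alice": [0.9, 0.1, 0.4, 0.7],
    "bob":   [0.2, 0.8, 0.6, 0.1],
}

probe = [0.88, 0.12, 0.41, 0.69]  # a fresh measurement of the same person
print(identify(probe, enrolled))  # prints "alice"
```

Swap the toy vectors for real embeddings from any model and nothing else
changes, which is why regulating one capture method leaves the underlying
capability intact.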

------
mirimir
I see that these bans regulate use by police.

But how could any of them affect use by federal agencies, which would then
share information with local police?

I mean, police lie to courts about StingRay use, or data from the NSA via
SOD,[0,1] and do parallel construction.

I really do think that privacy in meatspace is hopeless.

0) [https://www.reuters.com/article/us-dea-sod-idUSBRE97409R20130805](https://www.reuters.com/article/us-dea-sod-idUSBRE97409R20130805)

1) [https://www.deamuseum.org/wp-content/uploads/2015/08/042215-DEAMuseum-LectureSeries-MLS-SOD-transcript.pdf](https://www.deamuseum.org/wp-content/uploads/2015/08/042215-DEAMuseum-LectureSeries-MLS-SOD-transcript.pdf)

~~~
miek
I agree with you, and I'll add that they will simply (continue to) use
companies like Clearview AI:

[https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html](https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html)

------
Lind5
While most of these systems work well enough to identify a person, there are
a number of well-known ways to defeat them. One is simply to apply newer
technology to cracking the algorithms used inside these devices. Improvements
in processing power from one generation to the next, and a proliferation of
information about where the vulnerabilities are, apply to biometrics as well
as to other technologies.

How Secure Is Your Face? [https://semiengineering.com/how-secure-is-your-face/](https://semiengineering.com/how-secure-is-your-face/)

------
BurningFrog
Sometimes a technology is so cheap and easy to use that banning it becomes
absurd, and you just have to accept the new reality.

Face recognition isn't quite there yet, but in 5 or 10 years every kid with a
phone will be able to do this.

~~~
raxxorrax
Technology is never inevitable and I don't put much faith in people that
cannot shape their future.

If there is political will, it is pretty easy to ban public cameras for
example.

The hardware is already mass produced and as cheap as it gets, but not
everyone became as paranoid as Londoners.

~~~
BurningFrog
Would you ban cell phone cameras, cameras in cars (mine came with 3 cameras)
and Ring cameras?

They're all used in public.

~~~
raxxorrax
No, I would ban cameras for surveillance purposes. This is pretty easy to
legislate and is the basic rule. There would naturally be some exceptions to
it.

------
bsenftner
Schneier's points are dead on; he's one of the few talking sense here.
Biometric and related identification technologies are here and multiplying. We
need to regulate data sharing and third-party data compiling, regardless of
the data. Dirty and incorrect data will haunt people, existing in who knows
what databases. This is what needs to be regulated, not the individual
identification technologies. Stop the abuse of new tracking technologies
before they are even invented, and end the advertiser tracking that is out of
control. (FYI: I write FR software.)

~~~
aeternum
Data compiling is just too easy at this point. Anyone can spin up some
instances and run web scrapers to compile profile pictures, writing samples,
etc. Many data leaks are available as torrents or hosted on pay-for-access
sites, likely from a variety of jurisdictions.

Even with some kind of global cooperation, can regulations really solve the
problem? They might prevent corporations from using the data for advertising,
but I don't see them as actually addressing the privacy issue.

~~~
bsenftner
The regulations would prevent randomly scraped data from being compiled into
official profiles used for official purposes, such as credit checks, security
scans, and anything that can legally and officially be used as verified
information about you. Regulations need to be in place to prevent junk
profiles compiled from dirty sources from being used as official information.

------
zkid18
What is your view on the decentralization of facial recognition?

I believe that surveillance cameras can help decrease physical violence, but
the main concern I have is that the algorithms and embedding databases are out
of my control, even for validation.

Hence, are there any authorities that publicly disclose their FAR metrics? So
far I have only come across the UK case: [https://bigbrotherwatch.org.uk/all-campaigns/face-off-campaign/](https://bigbrotherwatch.org.uk/all-campaigns/face-off-campaign/)
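For context, the FAR mentioned here (false acceptance rate) is simply the
fraction of impostor comparisons that score above the decision threshold; its
companion metric is the false rejection rate. A toy illustration with invented
scores (no relation to any real deployment):

```python
def false_accept_rate(impostor_scores, threshold):
    """Fraction of impostor (different-person) comparisons wrongly accepted."""
    accepted = sum(1 for s in impostor_scores if s >= threshold)
    return accepted / len(impostor_scores)

def false_reject_rate(genuine_scores, threshold):
    """Fraction of genuine (same-person) comparisons wrongly rejected."""
    rejected = sum(1 for s in genuine_scores if s < threshold)
    return rejected / len(genuine_scores)

# Made-up similarity scores, for illustration only
impostor = [0.10, 0.25, 0.40, 0.55, 0.92]   # different people compared
genuine  = [0.70, 0.85, 0.90, 0.95, 0.99]   # same person compared

print(false_accept_rate(impostor, 0.8))  # 1 of 5 impostors accepted -> 0.2
print(false_reject_rate(genuine, 0.8))   # 1 of 5 genuine rejected  -> 0.2
```

Raising the threshold trades FAR for FRR, which is exactly why a deployment's
chosen operating point matters and is worth disclosing publicly.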

------
crazygringo
> _Finally, we need better rules about when and how it is permissible for
> companies to discriminate. Discrimination based on protected characteristics
> like race and gender is already illegal..._

This is the key point of the article, but unfortunately he doesn't give any
solutions or even hint at them. Does anyone know if he does elsewhere?

It also feels a bit disingenuous for him to use the word "discriminate" here.

"Discrimination" is generally understood to mean giving someone an _unfairly_
negative experience due to inherently _irrelevant_ factors, such as denying
someone a job due to the color of their skin.

I think most people would agree that "discrimination" is the wrong word to
use when showing different people different ads due to their browsing history,
giving different people different credit card offers based on their credit
scores, or charging young people more for car insurance based on their age.
All of these are based on what people generally consider to be
non-discriminatory, _evidence_-based distinctions, and so words like
"targeting" or "market segmentation" are more appropriate.

If he wants to argue for a better framework for what is considered legitimate
targeting/segmentation or not, I'd love to hear it. Otherwise there's not
really much to say?

------
devy
Schneier's title makes it seem like we should not ban facial recognition, but
in the blog post itself he spends considerable verbiage arguing that there are
three components of a surveillance state we need to do more to fight against.
In other words, we need to do more to ban/regulate those rather than undo the
facial recognition ban.

------
Ohn0
Interesting that he mentions festivals banning it. There's a startup in LA
that's trying to add facial recognition to the ticket gates of large venues
like stadiums for both security and marketing. I wonder what 50,000 people
think about being facially recognized/tracked just to go see a ball game or
concert?

------
bawana
This proves we are in hell. Some geek ape figured out how to use a rock as a
tool; the rest of the apes started throwing rocks at each other and just took
what the other apes made. Some geeks figure out nuclear power and a way to
provide endless free energy; the rest of us turn it into a weapon. Some other
geeks make a foolproof way to communicate for free; the rest of us weaponize
that too. Can't wait for Skynet to clean out this crap. Our brains just cannot
handle complexity.

------
petermcneeley
It took me a while to grok, but I think that people like Schneier want to
take the ideology of online anonymity and apply it IRL.

~~~
avmich
What is the ideology of online anonymity?

~~~
blazespin
I think there is some expectation of privacy online. No such expectation
exists in a public space.

~~~
Enginerrrd
>No such expectation exists in a public space.

It sure as hell does in any practical sense. I expect that I can walk around
town, and people that I don't know, don't know my name, phone number, purchase
history, etc.

Let's say I want to buy a butt plug or something. I could go to the sex shop
in town, make sure nobody I knew was around before I went in, buy it and
leave. If I wanted to buy a studded electro-shock butt plug, and wanted to
take extra precautions against being discovered, I could drive to another town
and buy it there.

Now suppose I'm a government whistleblower trying to expose some really nasty
government coverup or conspiracy committed by the FBI. I'd definitely take the
same precautions. I could leave my phone, go several cities over, walk up to a
public mail drop, deposit my envelope of documentation in the USPS box, and,
historically, could fully expect to remain anonymous in doing so. That's still
arguably safer than sending an email and hoping I've perfectly configured all
my VPN or Tor settings to ensure there aren't any leaks.

That's NOT the case with facial recognition. Particularly when that info can
be processed and categorized into a form where someone could just query an
identity and get a complete historical tracking of a bunch of places you've
been prior to any sort of investigative interest.

~~~
kenjackson
You walk in and your neighbor is working the cash register. Now he tells
everyone you know.

All you’ve done is try to obtain privacy, but you can’t walk into the shop and
demand that your neighbor shouldn’t be there.

The “expectation of privacy” is really a hope that you’ve masked your
identity well enough, which is really the same thing you do online.

~~~
Enginerrrd
>You walk in and your neighbor is working the cash register.

This is a total fantasy.

1. If I was concerned about it, I probably would have noticed before entering
that he was there.

2. If I knew him, I'd probably know where he worked and could avoid him.

3. If I didn't catch myself in time, I could walk in and buy something
mundane, or nothing at all, or a gift card.

4. If I was really concerned, I'd go to the next town over.

5. There's a massive difference between taking a very low-probability risk of
being identified, multiplied by further low probabilities of it being noted,
remembered, and spread around, and a 100% chance of being categorically
identified and logged by an automated system.

------
candiodari
Bruce is missing the point too. He does see that the problem with facial
recognition is the backend database with data about individuals. The way this
database is indexed is not the problem, and that's all facial recognition is:
an index.

Facial recognition used _exclusively_ to access my hotel room? Fine! Even a
fingerprint. No problem. As long as that data _is not linked to other
databases_, and is erased, at minimum when I request it. And, most
importantly, it needs to be protected against cross-referencing with
government databases.

Because that's where the real problem lies. Cross-referencing. Is a store
allowed to remember data about me? Sure. That's what store clerks do. The
employees at the post office don't ask for my name anymore, they use facial
recognition (the wetware kind) and then go look for the packages with my name
on it. Great!

I go to a psychiatrist and he proposes that if he diagnoses me with something
I can get all the visits reimbursed. Ok, whatcha got? Well, autism seems
somewhat justifiable and is very popular at the moment. Okay. Now this data
gets passed to the government in my medical file, cross-referenced with my
insurance, passed to them, and now I can't renew my car insurance. There's
special cover for that, more expensive, of course. Even worse: it got
cross-referenced within the government itself too, and I now have to get
approval from a psychiatrist every time my driver's licence is renewed.

Okay, so I contact the psychiatrist, and this cannot be removed from my
medical file ("because then I could sue medical professionals and they
wouldn't be able to defend themselves using the data they have"). Okay, fine,
YOU can keep your notes on me if you must, but I want it out of my government
medical file. Nope, that system just doesn't support that. We can add some
additional explanation if you want, but that's all.

So I feel like the needed laws are:

1) Any medical data is off-limits for cross-referencing of any kind, with no
exceptions. It is also off-limits for the government and cannot be used for
traffic, tax, ... purposes. Even law enforcement should not be able to see
this data under any circumstances. If such data is needed or important in a
case, a judge can call my doctor to testify, to answer specific questions, and
that's the absolute limit of government access.

2) For any data you record on me, you need to specify what it will be
cross-referenced with; this goes for companies, but ESPECIALLY if you are the
government. There must not be any consequences for saying "no". And when
asking permission, only explicitly enumerated, named companies/departments and
databases, with a clear listing of what that data is used for and nothing
outside of that.

3) I want the ability to withdraw that permission at any time, which means ANY
system that it was cross-referenced in must delete that reference.

4) I want the ability to delete any data about me that was passed on AFTER THE
FACT, ESPECIALLY in government databases, even if I initially didn't tell them
not to pass it on.

5) I want something like Google's privacy dashboard, but for the entire
government, ideally also including companies' data, with buttons to delete
this data that actually work.

If you follow these rules, feel free to use facial recognition, fingerprints,
heartbeats, ... to index the data you do have on me. Not a problem. I can
always demand you delete your data and start from scratch though.

------
radu_floricica
Because the conversation is so lopsided, I feel like somebody has to ask: what
opportunities are we giving away when banning face recognition tech?

An obvious one would be no more purse snatchers, ever. Almost no rapes, far
fewer violent crimes. No lost kids. Lives and wallets saved.

What else? This feels like the kind of tech that creates its own broad fields
of use where there were none before. It might take time or imagination to see
where it goes.

And if - just saying - if the benefits are good enough to stop and think for a
second, aren't there any better ways of skinning this cat than banning tech?
Might be worth a brainstorm.

I'm not a big fan of the EU's GDPR; I literally just fired a client last week
on an email consulting gig because the threat of unpredictable huge fines is
too big for me. Plus, I freaking hate cookie popups. But I have to admit it's
also doing a lot of good: if companies can't store my IP address without my
consent, I'm pretty sure they can't store my facial profile and daily
movements either. Which leaves the choice in the hands of the consumer, where
it should be. I have Google Location enabled on my phone; it's creepily
accurate, but I like it. Maybe you don't. Let's not choose once for everybody.

~~~
moksly
No street art. No protesting. Social credit. No way to break away from the
stigmas assigned to your face. No speeding. No jaywalking. No pissing outside.
No way to rebel.

I mean, it wouldn’t really stop crime. London has something close to total
surveillance, and they have a lot of crime.

But if you really want that sort of thing, you can actually have a taste of it
in parts of China.

~~~
quotemstr
This comment reinforces the OP's point: opponents of facial recognition seem
to approach the world with a particular value system that includes
disagreement with several existing laws. Instead of working to get these laws
changed, facial recognition opponents advocate bans on technology that would
make these laws easier to enforce. Why should they succeed?

> No street art

Good. Graffiti is a blight.

> No protesting

There's protesting and there's "protesting". Lawful protest will be
unaffected. Unlawful protest involving criminal behavior will become much more
difficult. Good outcome.

> No way to break away from the stigmas assigned to your face

In an age of background checks, your face is the least of your worries. Are
you opposed to background checks?

> No speeding

Speed cameras already exist and don't rely on facial recognition to issue
tickets.

> No jaywalking

> No pissing outside

> No way to rebel

Why should people get to break the law?

~~~
mbel
> Why should people get to break the law?

Because laws are made by people and as such can be flawed.

~~~
quotemstr
Laws can also be unmade by people. What gives random individuals the right to
nullify laws made by everyone? Don't like a law? Get it repealed. In the
meantime, you have to follow it.

~~~
avmich
> In the meantime, you have to follow it.

You also would like to expect that if a law is unjust, it can't do too much
harm while it is being repealed.

And repeals can take a long time. If people could easily organize for things
like repealing a law or a government that suddenly becomes despotic according
to majority opinion, we'd live in a different world. What we're trying to do
here, by banning face recognition, is to make mistakes in laws less costly,
sometimes way less costly.

