Facial recognition system risks 'chilling effect' on freedoms, rights groups say (theguardian.com)
106 points by ricardoreis 10 days ago | 54 comments





Imagine this capability had existed in the US before miscegenation, or same-sex intercourse, or cannabis use, or concealed carry, were legal/broadly accepted. It could be used to harass non-criminal supporters of these measures (caught a SSM proponent jaywalking!) and prosecute those engaged in the targeted criminalized activity, preventing movements to legalize these activities from ever gaining momentum. Social mores would become ossified. Perfect prosecution is not desirable in an imperfect world.

> It could be used to harass non-criminal supporters of these measures (caught a SSM proponent jaywalking!) and prosecute those engaged in the targeted criminalized activity, preventing movements to legalize these activities from ever gaining momentum

That's why cannabis was criminalized in the first place: to target hippies and PoC and get their leaders silenced.


It was actually pushed by big pharma to keep the more lucrative opioid-based patented medicines going versus the CBD- and THC-based equivalents. Look at the new products introduced by Canadian pharma for more details.

In the meantime, pens and phones are used to harass non-criminal supporters of all kinds of measures and people (including a US president). I propose we forbid pens and phones!

Sounds great! And we, the public, will have access to this as well... right? Can we watch a live map of police and other state-employed persons with a positive facial match as they move about? No? Why not? Well, that's how we feel too.

especially politicians.

This stuff is a ratchet. You can win a battle here and there, maybe slowing the progress, but the long term trend is one directional.

The way privacy norms and abstract rules about limiting power come about is *not* through gradualism. It's through watershed events: the American Revolution, the 1989 revolutions. Particular points in time where you can say "Stasi bad. Never again."

The reality is that these choices involve compromising security. Slightly freer, and slightly less safe, is not a winning political argument.

Imagine a system where detectives can plug in a name or a face and retrieve location information from that person's phone, city-wide CCTV video, etc.

Creepy, but more crimes will be solved.

Gradualism favours security over freedom in the long term. "There are 19 rapists walking free in our city today because of limits on police power..." There's only one way this argument goes.


I really don't like the fact that any argument or concern perceived to be outside a commonly-held value system is usually shut down by some meta-argument about the strategy being wrong, without even a chance of being evaluated on its merits. I mean, if you're right and no one will bother to listen to these concerns, then you shouldn't have to say anything at all; just let it be ignored. If you have a specific better way of communicating these concerns, then do that.

But the discussion about any given issue often ends up being primarily about how the issue will be perceived, and everyone's perception of the subject is then colored by what commentators think people will think about it. That leaves no room for actually discussing the issue honestly and trusting that people are generally smart enough to take in both sides and make their own decision: not because one side had better strategy, but because they heard someone actually discussing the issue rather than some meta-concern about how the issue should be discussed. Yes, I realize the irony in this comment.

> If you're right and no one will bother to listen to these concerns then you shouldn't have to say anything at all, just let it be ignored. If you have a specific better way of communicating these concerns then do that.

I agree with your general concern, but I'm not sure I accept this part.

At the moment, a lot of smart, tech-savvy people channel a lot of energy into fighting this sort of thing directly. They look for weaknesses in the systems, analyze potential abuses, go to town hall meetings and Senate hearings. If that's ultimately a doomed delaying tactic, it doesn't mean the entire problem is unsolvable - but it also doesn't mean the people who think it's hopeless have a better answer on hand.

I think there's room to say "fighting this directly isn't going to work, so let's use the time we're buying to look for a better answer". It could perhaps be said with more positivity or focus on the 'elsewhere', but the lack of a proposal doesn't mean the criticism is invalid.

(Of course, we should also directly ask if the pessimism is warranted. The crypto wars never ended, but for all the fatalism over that issue we're in a far better position today than "RSA is a munition, install this Clipper chip". I worry that the common axis is not "activism wins" but "containment fails", which argues crypto and surveillance will both spread. But I think it's an open question.)


> Slightly freer, and slightly less safe is not a winning political argument.

It should be when the reality, despite the publicity of occasional terrorist incidents (9/11 included), is "slightly freer and slightly less safe against an event so vanishingly unlikely you should simply forget about it. You're hundreds of times more likely to die in a car accident or falling off a ladder. We should not let the terrorists win by ruining our way of life and associated freedoms".

Surely that's not that hard a message to sell when it has been a winning message for many years in the past?

No, it's the current fashion. I expect more of politicians. Of course in today's world I am regularly disappointed.


You can argue it's not worth paying the price to catch more murderers, to protect more children from pedophiles, to stop more rapes. But that's a subjective value judgement and society doesn't have to agree with you.

Be disappointed all you want, but try actually taking their position seriously first.


> but try actually taking their position seriously first.

That's a mighty big presumption. Upon what grounds do you say I am not?


> I expect more of politicians.

You might want to update this prior.


>Slightly freer, and slightly less safe is not a winning political argument.

True, however "slightly less prone to abuse" is a very strong argument, especially if you don't really trust the people who hold the power at the moment.


The sad thing is that it probably won't help much against identity crime anyways. Criminals will figure out how to get around it, but ordinary folk won't.

The security industry turns your data into their money by reminding you that it's protecting you from criminals turning your data into their money.

Marijuana used to be illegal over safety concerns. Now, thanks to the ability of people to break incorrect laws, it's slowly becoming more and more legal.

No, thanks to ballot measures in various places and voters supporting them, it's becoming legal.

Would the ballot measures have existed if every single person ever trying marijuana had been arrested within the hour and put in prison for ~5 years?

Maybe, but I imagine the progress would have been significantly slower (than it already has been).


Interestingly, I've seen several police officers or ex-officers raise this argument against always-on body cameras.

In one sense, it's exactly what always-on cameras are meant to do: prevent police from ignoring the law when no one is watching. The goal is to limit brutality and baseless charges, but the mechanism works equally well against neglected charges and officer discretion.

The places that moved first on legal marijuana were the same ones that started with détente; I think there's some real substance to the idea that bans on harmless things erode because we see that nothing bad happens when they go unenforced. (For that matter, it's the same idea as people leaving cults when they break the rules in private and discover "hey, nothing bad happened!") Universal enforcement takes away a natural experiment on the results of non-enforcement.


Good point. The long history of breaking this silly law has likely softened up the public perception of it.

Thank goodness the founding of America was done through ballot measures and voters' support of the secession of the colonies.

Oh wait sorry my bad. It was done by illegal bloody revolt against repressive laws.


You make this sound like an inevitability, but you discount the radical ideas of rights and due process, which evolved over hundreds of years specifically to claw back investigative power. The King used to be able to search anything he wanted because it was all "his" anyway.

In the medieval days of villages, everybody knew everyone's face. There was little in the way of privacy.

Today we expect a certain level of anonymity, but mostly just because we've grown up with it. It's not been normal through history. It may not even be healthy (to live in a society alone and unknown).

I don't think it's the end of the world. It won't even be the end of privacy. Social privacy is mostly pretending in public not to know what you shouldn't know. Nobody 'hears' what goes on behind the bathroom door in their home, because it's the height of rudeness to mention it.

We'll have to come up with social norms to not know what everybody is doing in every public place. Not knowing who went to what private clubs, or who visited whose house after dark when their spouse wasn't home, and so on. Unless we saw it in person, of course!


Those villages were within each person's Dunbar number and had a unified notion of shared interest and tribal identity. I think the argument that we never had privacy or that it is unnatural needs to be put to rest.

Automated facial recognition is explicitly for one interest group to monitor an "other." In most cases, it is a state and its party interests monitoring subjects, or one class monitoring another. It's a weapon and there is nothing benevolent or benign about it.


Confused; that argument seems to be endorsing the idea that there was little privacy, then denying it?

And agreed no doubt misused facial recognition will be an issue. Just not a personal-relationship one; it'll be a government one.


Argument distinguishes whether you're being identified by an "us," or a "them," and in the medieval small town case, it is an "us," where at scale, it is necessarily a "them."

When you add urbanization and modern identity groups, you are going to be a "them," to someone, and providing them with a technological weapon begs to have the balance corrected.


In case A, a village member knows a private fact about you.

In case B, an arbitrary person or collection of people across the globe know private facts about you.

The hypothetical individual in case A has _drastically_ different incentives regarding disclosure or use of that information. I wouldn't even call it superficially similar.

The modern example would be that your roommate knows fact X, vs. some larger group, or the entire world, knows fact X.

The former can, but not necessarily will, lead to the latter.


Hm, disagree. Socially, it's only important if someone you have to deal with knows something about you that they shouldn't. That the set of 'people you have to deal with' is larger now is true, but maybe not critical.

Larger groups don't act the same way as smaller ones.

If your sister dies tomorrow, you'd probably be quite upset.

If I die tomorrow, you wouldn't care even if you found out.

Consider that your roommate, parent, village member, whatever, finds out that you're gay. It's a small group. They have the information to know whether that getting out would be good or bad for you, whether you'd like that information to be disclosed.

On a global scale that discretion just doesn't exist at all. Facebook doesn't _care_. A random worker who sees that information doesn't _care_. The information can make its way to someone who _does_ matter to you (or at any later point suddenly you may matter to them).


You seem to be arguing from the position that people in these villages were good to each other, and that bullying wasn't a thing. You clearly never lived in a small town.

Not at all.

> They have the information to know whether that getting out would be good or bad for you, whether you'd like that information to be disclosed.

Nowhere did I state that they will necessarily act benevolently. The fact is that they have a choice.

At a large scale that decision making process doesn't exist. Your actions become 'data'.

You can move away from bad roommates or out of your village. You can't move out of the world.


Speaking as a lead developer of a leading FR system: in the majority of cases, the FR user is a private organization with large public traffic: department stores, malls, schools, universities, stadiums, and yes, municipalities (cities). The main purposes clients use FR for: 1) commercial users such as stores and malls have serious problems with organized theft; 2) schools, universities, stadiums and municipalities use it for public safety, and the people in their galleries are people who have made actual threats, plus active fugitives. The use of FR by police is very infrequent, as they simply (sad to say) don't seem to have access to the technical competence to actually use the systems. And then there is the glaring fact that, when deployed, the majority of FR systems go unused simply because understanding how to use the FR system requires more time than anyone there has. Some "integrator" installs it with a weak understanding of its purpose, and then it collects dust.

"Active fugitives." Here, the way drivers find out about administrative license suspensions is when ALPR cameras on patrol cars alert cops to pull them over and charge them, leaving them to pay for a tow, among other consequences.

Presumably these "fugitives" include parents with kids at the stadium who are behind on support payments or have outstanding parking fines, or perhaps something else that will initiate a high risk police encounter that will at best disadvantage them in a background check?

I've worked in privacy policy in government and the people building FR techs are equipping monkeys with guns.

Sure, someone else will do it, but at least weapons manufacturers have rules about who they can sell to.


To a degree, I agree it is monkeys being given guns. Guns they can't load, don't know where the trigger is, and that are pointed at blank walls. I see great hesitation from all private/corporate users to allow any access to their FR galleries or systems by outsiders. Partially it is to cover for the fact that they don't know how to use it; using it "never seems to work" (because they don't bother to read the manual, view the videos, or really anything; it's shocking).

I think we'll get regulation of gallery sharing, which will effectively block total surveillance.

Everyone assumes this tech is much better than it actually is. For the media, treating it like the end of freedom is good for their click revenues. But in actuality, FR is a rather weak sensor that needs to be backed by additional identifying technologies to be practical. Defeating FR is child's play. Even the very best...


While pre-moderns lacked anonymity, they had a very real distinction between private and public. For example, John Owen, chaplain to Oliver Cromwell and one of the most influential Oxford academics of his day, lost 11 children, some of whom died in adulthood. Yet despite his voluminous writings, there is absolutely no mention of these personal events.

And while one individual isn't enough to prove a point, it is illustrative of the pre-modern distinction between private and public life. One may even suggest our lack of privacy is the result of our general willingness to erode the boundaries of public and private.


This is a completely false, irresponsible analogy.

A villager knowing a fellow's face is not analogous to a facial recognition surveillance system. A distributed group of people who know each other's faces do not communicate perfectly, meaning one person can only know a small bit of a recognized individual's activity. In order to construct a full(-ish) picture, cooperation amongst the community of peers needs to occur.

Compare this with a camera network covering an entire city. The community's consent is not needed. The effort required to examine a person's activity is trivial. The dynamic and implications are completely different.


> In the medieval days of villages, everybody knew everyone's face. There was little in the way of privacy.

Yeah. And they were horrible places for anyone who didn't perfectly fit into them. The smallest deviation could lead to your death and many people had to hide their true self behind something they projected for others to see.


A big part of it for me isn't that there's a facial recognition system or license plate scanner in place, it's that the records from those systems will live forever and be used for who knows what in the future. It's a bit of a silly example but in William Gibson's newest book there is an AI system the police use that goes through decades of old records of all kinds to find things about people. That's what I'm afraid of, having to trust future police/agents with the data.

> There are also fears about the level of access given to private corporations and the legislation’s loose wording, which could allow it to be used for purposes other than related to terrorism or serious crime.

Truth is, it will be used for purposes other than related to terrorism or serious crime. It’s alarming how democratic countries are invading the privacy of their own citizens, creating mass surveillance machinery that could become uncontrollably detrimental to the very institutions and structures that they claim to value and protect. Among the FIVE EYES countries, Australia seems to be doing a lot more (comparatively) on mass surveillance and having the legislative backing for it.

> The database will be accessible to federal, state and territory governments through a central hub connecting the various photographic identity databases.

This is how it starts, with vague claims of safeguards that belie the actual access and how they can (and will be) misused.

It seems like future generations will have to give up freedom to get some sense of safety (as painted by those in power).


>>Truth is, it will be used for purposes other than related to terrorism or serious crime. It’s alarming how democratic countries are invading the privacy of their own citizens, creating mass surveillance machinery that could become uncontrollably detrimental to the very institutions and structures that they claim to value and protect. Among the FIVE EYES countries, Australia seems to be doing a lot more (comparatively) on mass surveillance and having the legislative backing for it.

It's easy to show an example scenario that almost always causes a system like this to abuse privacy. Take the system, FR in this case, and insert the use case: "Politician XYZ's 3-year-old daughter has been abducted, and the FR system we promised not to use for this purpose could identify her captor!" Works every time. Once you do it that first time, the seal is broken for future abuse.


That's true. It doesn't even have to be a politician's child. "What about the children?" [1] works quite well to rile people up into surrendering their own freedoms and those of future generations.

[1]: https://en.wikipedia.org/wiki/Think_of_the_children


If you want to experience the 'chilling effect' of facial recognition first hand, go to a meaningful political protest in the UK. These are different to the circus-and-bread protests, e.g. complaining about Trump in some collective Punch and Judy show.

At the more useful protest events, numbers are smaller and those who attend aren't there just to post selfies of themselves looking good protesting at the bogeyman-du-jour on Instagram. Protesting against the arms trade or fracking will get you proper attention from the authorities. The police have a tactic called 'the kettle', where they form a police line around all the protesters, then let them out one by one, getting everyone's photo. From then on you know that you will be in some special rogues' gallery, to be denied the ability to work for the three-letter agencies and other places in the civil service as a known troublemaker. They don't need to take your postcode or DNA; just that waltz past the camera will do. As you can imagine, this has a 'chilling effect' and curbs one's enthusiasm for participating in the direct democracy that is so encouraged by our governments, just so long as it is part of the 'Arab Spring' or other such nonsense.

Facial recognition for cows on the farm is a thing, and it also has applications in other areas, e.g. nature conservation. So there is nothing wrong with it for other species: putting them under mass surveillance without needing the physical tags.


> The police have the one tactic called 'the kettle' where they form a police line around all the protesters, then letting them out, one by one, getting everyone's photo

Why don't they just ask for an ID? At least that's what the police do in Russia. If you don't have an ID, they can detain you for several hours; if you have an ID, they can detain you for something else ("we need to check your documents, please come with us").


Carrying ID is not required in the UK. The police can ask for proof of identity and address under some circumstances (e.g. they need a reason more than just "you were protesting"), and I think this would generally take time and paperwork, including the police giving a receipt to the person. Much easier and cleaner to take a photo.

Optics.

> "just that waltz past the camera will do"

Indeed.

Mission Impossible: “Gait recognition can’t be fooled by simply limping, walking with splayed feet or hunching over, because we’re analyzing all the features of an entire body” ...so in practice, even with our faces hidden.

https://www.apnews.com/bf75dd1c26c947b7826d270a16e2658a


A face is like a cookie, so if there are laws against cookies, there should be laws against facial recognition.

Instead they will make laws against facial obfuscation.

In case anyone wonders how likely that sort of law is: France actually has a law against covering your face in public areas: https://en.m.wikipedia.org/wiki/French_ban_on_face_covering

While I dislike the ban on e.g. religious garments, it can still be coherent to ban face covering and ban face recording.

I think this is probably only the tip of the iceberg as well. Facial recognition is one thing, although it can be difficult if your face is obscured somehow. The real fun will begin when they can start doing things like gait recognition + other features for person re-id in real time across CCTV networks. Fortunately, I think we're some way off doing that reliably and at a reasonable cost.

Or tracing DNA evidence instantly (those skin flakes we leave on every surface or that waft into the air as we pass).

Further evidence which suggests privacy will soon be a luxury reserved only for distinguished individuals.


