
Boston bans use of facial recognition technology - psychanarch
https://www.wbur.org/news/2020/06/23/boston-facial-recognition-ban
======
kbos87
So much attention is always paid to accuracy when facial recognition comes up.
Even if it were 100% accurate, it’s a technology that makes mass surveillance
too efficient. Prohibitions need to also extend to the private sector, with
the exception of, say, facial recognition for personal use (e.g., FaceID).

~~~
throwaway0a5e
>So much attention is always paid to accuracy when facial recognition comes
up.

Because many people are ok with mass surveillance as long as it's used
fairly. The problem they see with this tech isn't that it's going to create
the kind of world we don't want to live in, but that it unfairly targets
minorities and/or the poor more than other people. They don't understand that
it can never be fair, because the institutions that run it can never be fair:
there will always be out-groups, because that's how human nature works, and
those groups will get disproportionately screwed unless we limit the ability
of the majority to screw them.

~~~
mc32
Historically speaking, when the world was small towns and rural,
“surveillance” was a fact of life. Everyone knew someone who knew someone
else.

The big difference is that surveillance was localized and didn’t follow the
person.

If someone committed a crime or did something against local mores and got
ostracized, they could skip town and settle somewhere else to begin anew. Of
course, being the new person in a new place, you were under scrutiny, but as
long as you followed local practices your old peccadilloes or even crimes
didn’t follow you.

~~~
brlewis
> The big difference is that surveillance was localized and didn’t follow the
> person

That is _a_ big difference. But _the_ big difference is how one-sided the new
surveillance is. In a small town, they knew everything about you, but you also
knew everything about them.

Today, parking regulations in my town are enforced by cars that circle around
scanning every license plate parked on every street. I could probably find out
who has access to that data if it were a priority for me. But what's stopping
a private entity from doing the same thing? I'd never know it happened. Cell
carriers selling my location data? I could easily have never known about this.
Ad tracking companies having a record of most sites I visit? I don't even know
who they are other than a few major ones.

~~~
xapata
Great point. What if all the data were available for everyone to access? Would
it feel ok then?

(Not sarcasm)

~~~
samfriedman
Even if "all the data" were accessible by all citizens, there would still be a
massive asymmetry present in the government's ability to process that data
into useful information.

~~~
shadowgovt
Government and corporation. The FAANG companies clearly have enough processing
power to churn an entire nation's big data streams.

Although, if one's goal were a town or even an individual city, I bet enough
infrastructure to sift the data is buildable or rentable by individuals or
small groups of individuals.

It becomes an interesting world when non-profits can afford big data
resources.

~~~
xapata
I think that's our best option for avoiding oppression. Case in point, video
of George Floyd. If we had widespread public (by the public) surveillance, I
think people would feel safer. If someone is "disappeared," hopefully there'd
be evidence that the ACLU could pursue.

~~~
shadowgovt
The "extraordinary rendition" program the CIA was using to disappear terror
suspects into places they could be tortured was discovered in part by
airplane-spotting hobbyists. Because planes are extremely hard to hide, and
there are people who watch airports to see what takes off and lands for fun.

When they pooled their data, they were the first group to notice the military
had started running flights to and from locations they didn't normally fly,
and it didn't take much investigative journalism after that to discover those
planes were carrying people.

~~~
fuvkthisguy
Do you have any sources for people who are interested in reading more on that?

~~~
xapata
Extraordinary rendition is a good search phrase.

------
bryanrasmussen
I would really like to see one of those movies now where they identify the
unknown guy by photo as a big-time terrorist, only this time it turns out he's
a janitor in the local high school. Then it can be one of those feel-good
cop/unlikely-partner action-comedy flicks, but only halfway through; the first
part will be trying to catch the terrorist, who keeps getting away from them
by doing janitorial work at unexpected junctures.

~~~
deeblering4
Beverly Hills Mop 2

~~~
bryanrasmussen
the janitor could be played by Ryan Reynolds, when they find out he is a
janitor they don't believe it "you're too attractive" then he gets upset
because being too attractive for being a janitor has kept him from getting
promoted all these years.

Cop played by Samuel Jackson

the terrorist played by Ryan Gosling, "these guys don't even look alike"

response Samuel Jackson: they're both too good looking to be janitors.

Ryan Reynolds: why does everyone keep saying that!

later in the movie - Samuel Jackson - I have had it up to here with all these
good looking white guys f _ing up the m_ f*ing facial recognition!

Idris Elba should be in this too. somehow.

on edit: maybe at the end Idris Elba is brought in as another person the
facial recognition identified as the terrorist.

everyone is "how is this possible!"

Ryan Gosling: I don't know, he's pretty good looking, I'm kinda flattered.

Ryan Reynolds: yeah, the computer thinks I look like this guy, wow maybe I am
too attractive to be a janitor!

Then they ask Idris Elba what he does for a living -

custodial technician.

------
jp_sc
“We really have a tendency (...) to let our technology go ahead of our common
sense about how we want to live together"

Bravo!

~~~
acomjean
Sounds a little Amish. I ended up in Amish country when traveling last
century, and it was a little weird. But I went to a house museum, and it was
interesting.

One of the things that stuck with me as different: they take a little time to
look at a new technology and decide if they want to use it. I wonder if our
embrace of tech as "neutral" is always correct.

"They're more cautious — more suspicious — wondering is this going to be
helpful or is it going to be detrimental? Is it going to bolster our life
together, as a community, or is it going to somehow tear it down?"

[https://www.npr.org/sections/alltechconsidered/2013/09/02/21...](https://www.npr.org/sections/alltechconsidered/2013/09/02/217287028/amish-community-not-anti-technology-just-more-thoughful)

Of course predicting where things end up when new tech disrupts is difficult.

~~~
strgcmc
Tangent here, but after visiting the Pennsylvania Dutch area a few years ago,
I had this mini-epiphany of a vision of an alternate universe where most of
society lived in small communities like the Amish do, that were mostly self-
reliant and farm-based with cottage industries, BUT in which the communities
embraced technology.

Basically, with 3D printers and solar power and local grids and machine shops
to fabricate for local needs, and a loose mesh network to connect these
decentralized nodes/communities together (like Mastodon, instead of the paper-
and-print Amish newspapers that exist). A kind of techno-libertarian-Amish
blend, to form a society that was more resilient and modular, rather than
hyper-centralized, dense, and easily swayed by viral influence/behavior/trends
or literal biological viruses.

EDIT: Of course it would never "scale" since such a system probably could not
support higher population levels at all, but maybe that's okay in the ultra-
long-run or some post-apocalyptic rebuilding future (also dangerously veering
into population control and genocide-ish topics).

~~~
the_pwner224
> Basically, with 3D printers and solar power and local grids and machine
> shops to fabricate for local needs

None of those could exist without the significant science and R&D which is
only possible as a product of people leaving their personal areas to
congregate in cities and colleges where information is shared and built upon.
Along with a lot of money coming from the government and big companies. And
the issue is that when people leave for that purpose, many of them tend to
stay there forever.

~~~
strgcmc
There's an activation threshold, absolutely, which requires large-scale
central investment to cross for the very first time. But if you had one Star
Trek style replicator, and it was able to make more replicators, then once
you've crossed that threshold you don't necessarily need all that massive R&D
infrastructure as much.

It's also a difference in strategic approach: pool resources to create
cutting-edge institutions, or adopt a more decentralized approach where you
might not get as quick a pace of innovation, but each community is more
self-sufficient. So you might not ever get a fancy cure for cancer or CRISPR
technology in this alternate universe, but most communities might have their
own local clinics, more local nurses, and maybe better overall health outcomes
by focusing on common treatable problems rather than pushing for cutting-edge
innovation. Similarly, you might have fewer PhDs, but more high school
graduates or bachelor's-level educations.

Anyways, nothing about this is all that realistic or anything, just some idle
world-building in my head.

------
RcouF1uZ4gsC
There is, however, a huge loophole.

From the ban: [https://assets.documentcloud.org/documents/6956465/Boston-Ci...](https://assets.documentcloud.org/documents/6956465/Boston-City-Council-face-surveillance-ban.pdf)

"Nothing in (b)(1) shall prohibit Boston or any Boston official from:

a. using evidence relating to the investigation of a specific crime that may
have been generated from a face surveillance system, so long as such evidence
was not generated by or at the request of Boston or any Boston official;"

So if a third party, say the FBI or DEA, provides info from face surveillance
systems to Boston without Boston specifically requesting it, Boston could use
that.

~~~
lvs
Or a State agency.

~~~
bsenftner
Or a private firm that sets up to perform FR as a "public service" while
collecting valuable person-traffic marketing data.

Realize, people: FR enables the physical world to be overlaid with website-
like tracking ability. This is hugely valuable to business, and if they figure
that out, they will push FR just like they pushed invasive tracking
advertising all over the web.

~~~
notwhereyouare
Wouldn't that private firm fall under a 3rd party?

From the article: The city council unanimously voted on Wednesday to ban the
use of the technology and prohibit any city official from obtaining facial
surveillance by asking for it through third parties. The measure will now go
to Mayor Marty Walsh with a veto-proof majority. Walsh's office said he would
review the ban.

~~~
bsenftner
What if they "don't ask for it" but it's made available anyway?

------
kgin
It’s not just about the percentage of false positives, it’s about how much
easier it is to generate false positives.

Human face-matching accuracy is worse than software in many scenarios (“is
this the man that robbed you?”), but it requires so much effort that the
absolute number of false positives is low.

On the other hand, facial recognition hooked up to cctv can passively generate
mountains of matches all day long for pennies.

~~~
criddell
This must just be a ban of the technology for government use, right? Can
retailers still use facial recognition as part of their in-store analytics?
Can Apple still sell FaceID products in Boston?

------
zatel
I think I fall with the HN majority in my privacy views. Yesterday, however, I
talked to someone who said they prefer that someone is always watching, so
that they can feel safer.

Interestingly they also said they don't want to know the specifics of anyone
watching.

I wonder if laws like this, that in actuality seem fairly toothless, will
result in more of that. "Safety, and ignorance of where/who the watchers are."

~~~
e8e73ieurj
Despite good intentions and my own discomfort, I can't help but feel like the
anti-surveillance movement is mostly an extension of privilege politics and
virtue signaling. Police brutality is a real problem, but a wildly common
observation of life in the hood/ghetto/LI-housing is the prevalence of crime.
You could make neighborhoods a lot safer with a lot fewer police by using
modern methods like facial recognition cameras and unmanned aerial
surveillance. Break-ins and robberies suddenly become wildly easy to punish
after the fact, and a lot of investigative work, like tracking gang members to
get a sense of their operations, morphs into a trivial affair. We're rapidly
approaching the point where basic physical crime is optional, and while there
should obviously be oversight and moral considerations at every step, I can't
help but feel it's a bit entitled of me to live in an okay neighborhood (some
crime, but it's mostly kids drinking in parks and hobo drama) while telling
people that the risks are too great to use this kind of tech in any
circumstance.

~~~
ch4s3
Places like Baltimore's East/West sides and South Chicago already have aerial
surveillance, ShotSpotter, street corner cameras, nearly limitless police
power, and CompStat operations. These things have been in place for at least a
decade and don't seem to be moving the needle. Sure, they could go full PRC,
but I doubt anyone has the appetite for that level of draconian surveillance.

In my experience living in one of these places, police mostly don't
investigate crime even when there's clear video.

~~~
Siira
Then the problem is with the police, not the tech, right? Safety is a really
big privilege.

~~~
ch4s3
My point is that the tech is of questionable quality, the application is
horrible, and people hate it all the same. Anything that actually "works"
would be so oppressive as to be untenable in the American context. Also, no
one who is already policed in the US trusts the institution of policing in
America not to abuse their most basic rights.

------
an_opabinia
If the basis of banning facial recognition technology is its poor accuracy,
will facial recognition technology be unbanned if it is 99% accurate for
everyone?

~~~
badrabbit
Even 99.999% is not good enough, because you will have at least one person who
is guaranteed a false arrest and prosecution (and over 90% of those prosecuted
take a plea deal). There is inaccuracy with other methods as well, but with
those you have humans being held accountable. When it comes to justice,
mistakes are tolerable so long as adequate compensation exists, but when the
mistake is systemic it becomes intolerable: there is a preexisting guarantee
of a mistake, as opposed to a specific human making an error as a matter of
chance. This is all exacerbated by other systemic cruelties of the US justice
system, where even an arrest and release for no cause means days if not weeks
of imprisonment, and where, if charged, most people accept a plea bargain
regardless of actual guilt. It's better to let actually guilty criminals get
away than to explicitly and systemically accept even one innocent person being
punished incorrectly, because, among other reasons, the justice system has
legitimacy only because its goal is to administer justice; accepting any
amount of injustice invalidates that legitimacy and authority.

A good analogy would be a chef tolerating fecal matter in their food. Yes,
there is always some small chance of that happening, but no one accepts food
from a chef who explicitly uses treated toilet water and claims 99% of the
fecal matter is gone and only one in 100 people will get sick from it.
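To put rough numbers on the first paragraph (the 99.999% figure is from this comment; the city-sized population is just an illustrative assumption):

```python
# Even a 99.999%-accurate matcher is near-certain to produce false
# matches when run against a whole city: the error rate is per
# comparison, and the comparisons number in the hundreds of thousands.
fp_rate = 1 - 0.99999            # 1e-5 false positives per comparison
population = 700_000             # roughly the City of Boston (assumed)

expected_false_matches = fp_rate * population
p_at_least_one = 1 - (1 - fp_rate) ** population

print(f"expected false matches per sweep: {expected_false_matches:.0f}")
print(f"P(at least one false match): {p_at_least_one:.3f}")
```

So a single citywide sweep is expected to flag about seven innocent people, and the chance of at least one false match is effectively certain.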

~~~
bryanrasmussen
99.999% is better than most fingerprints - maybe all of them really
[https://www.ncjrs.gov/pdffiles1/nij/grants/249890.pdf](https://www.ncjrs.gov/pdffiles1/nij/grants/249890.pdf)
[https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3093498/](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3093498/)

~~~
cool_dude85
I'm willing to come out against anyone developing a fingerprint gun that
allows them to take fingerprints from everyone attending a mass gathering,
too.

------
Jonanin
Presumably facial recognition technology is only banned for use by city
officials, but the title and article are unclear. I would hope FaceID is fine.

~~~
mweibel
it's in the second paragraph of the article:

    The city council unanimously voted on Wednesday to ban
    the use of the technology and prohibit any city
    official from obtaining facial surveillance
    by asking for it through third parties.

~~~
bsenftner
My my, the city council of Boston has a brain. When San Francisco banned FR,
they left the door open for private contractors, whom the city just hired
straight away.

~~~
brlewis
It's possible that Boston learned from SF's mistake.

------
stx
Let me start off by saying that I think we need to be careful with this type
of tech. Assuming it never makes mistakes can be deadly.

As someone who has done some work building deep learning models, what is it
that makes this tech unfairly target minorities?

Is it that the people who trained the model did not present enough example
images of minorities during training? Is it because darker (presumably black)
skin does not show up as well on poor quality videos (presumably because the
metering of the camera exposed for the surrounding background which was
bright)? Or is it the law enforcement using it was poorly trained and assumed
the computer was infallible combined with possible prejudice they already had
against minorities?

The first problem I would think could be easily solved. The second problem I
would think would be rather difficult. The last would require extensive
training but I am sure we humans would screw that up also.

~~~
elliekelly
I think you’re starting from the wrong place in your analysis. The first
question we should be asking is why we would _want_ this technology at all.
The potential for bias is a moot point if we as a society decide that we don’t
want this kind of government surveillance.

Even if the systems were perfectly fair and not the least bit biased and were
operated by a perfect utopian police force I _still_ wouldn’t want facial
recognition. I’ve yet to hear a potential benefit of this sort of software
that would justify the huge cost to citizen privacy.

Just because we _can_ train computers to recognize faces doesn’t mean that we
_should_.

~~~
pbhjpbhj
>I’ve yet to hear a potential benefit of this sort of software that would
justify the huge cost to citizen privacy. //

It makes it easy to find suspects and narrow down suspect lists, meaning far
fewer police are needed to catch a greater proportion of known criminals.

Most people consider that a huge benefit.

Let's say you have a DB of all faces in a country of 60M people. You have a
photo/video of a person committing a crime, say a robbery. The false positive
rate is 1:100,000. Your search returns 600 people; an address match finds 60
with connections to the locality; 5 of those have records, one for robbery.
You'd at least sit a person down for an hour to review the matches, consider
the records, and list people for interview.
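That funnel can be sketched in a few lines (all figures here are the hypothetical ones above, not real data):

```python
# Sketch of the hypothetical funnel above: a nationwide search
# against 60M faces with a 1-in-100,000 false positive rate.
population = 60_000_000
fp_rate = 1 / 100_000

flagged = round(population * fp_rate)  # faces the search returns
local = 60           # with connections to the locality (given above)
with_records = 5     # of those, one with a record for robbery (given above)

print(flagged)  # 600
```

Note that all 600 of those hits can be false positives; the funnel only works if the true perpetrator is actually in the database.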

According to UK ONS stats, adults released from prison in Jan-Mar 2018 had a
reoffending rate of 65%.

It seems just tracking known offenders would find the perpetrator in many
cases if visual recognition is possible.

I mean, this is _the_ principal benefit.

~~~
elliekelly
I fully understand and appreciate that supposed benefit but I don’t think that
justifies the privacy cost.

------
mncharity
For those unfamiliar with Boston governance, "Boston" here means the "City of
Boston", population 0.7 Mppl. A Seattle or El Paso. Rather than a "Greater
Boston" aggregation of municipalities of 2 to 8 Mppl.

For a NYC analogy, imagine its historical consolidation was more limited, and
many of its towns and cities remain independent, never having consolidated
into boroughs, and the boroughs into one big city. Flushing, Brooklyn Heights,
Kingsbridge, are still independent towns. Here, the city council of a city
occupying only lower Manhattan, but confusingly named "City of New York", just
voted on face recognition.

~~~
Sandvich
Somerville, Brookline, and Cambridge have similar bans in place as well. That
covers about 250k more people, for what it's worth.

------
AbrahamParangi
Surveillance should be banned instead.

------
shadowgovt
Interesting. So the city's tying its own hands. I assume private companies can
still use their own resources to do their own individual identification,
though.

~~~
x87678r
Yeah fingers crossed I can get our own private building to get facial
recognition set up.

------
throwawaysea
The stated complaint in the article is accuracy. It also says Boston PD
doesn’t use the tech yet; this ban is preemptive. I wonder if they are aware
that tests conducted by the ACLU and the like didn’t use the recommended
configurations for precision. Not to mention that false positives don’t matter
as long as there’s a human in the loop to validate the match, because then it
is no worse than the minor risk of a false match we accept even without facial
recognition.

~~~
mola
False positives matter. Being arrested for something you didn't do is horrific
and life-changing. The human in the loop is human: we unconsciously tend to
trust machines as objective and accurate. This is not as simple as you make it
out to be.

~~~
throwawaysea
We already trust human police to make matches using their eyes to apprehend
suspects. This is fundamentally necessary to enforce laws and ensure a safe
society. Since a human match is required with or without facial recognition, a
false positive from an algorithm doesn’t make the problem any worse.

~~~
mulmen
Yeah, but here’s the problem: you massively increased one of the human's
capabilities but none of the others, like judgment or even perception.

With their superhuman search capability one person with a bias can
discriminate against everyone in the database instead of just whoever is
standing in front of them.

------
EmilioMartinez
"Technology should improve so we can slide into a dystopia in harmony"

------
99_00
Is the American justice system so broken and law enforcement so incompetent
that a computer flagging someone is basically an automatic conviction and the
technology has to be banned?

No evidence, by itself, should be enough to get a conviction. Not your
fingerprints all over the scene of a crime, not a video of someone who looks
like you, not even your DNA matching a rape kit.

The fact that someone was arrested because of a match shows a failure in basic
criminal investigation more than anything else.

~~~
wpdev_63
It's because it can track the whereabouts of everyone. It's a massive privacy
issue. Do you want the gov't to know where you go, what you spend your money
on, and who you associate with? I sure as hell don't, even if they wouldn't
find a problem with it.

It really doesn't matter what the legislation says; no doubt they have already
been doing this and calling it something else. It's going to be one of the
worst things to come out of the 21st century.

~~~
Animats
_It's because it can track the whereabouts of everyone. It's a massive
privacy issue. Do you want the gov't to know where you go, what you spend your
money, and who you associate with?_

The government already knows that, via Google, Facebook, and the telcos.

~~~
lovelyviking
Therefore those data collection activities should be prohibited completely for
those companies or anyone else.

------
Pinegulf
>"Until this technology is 100%, I'm not interested in it," he said.

Umm... I'd like to see any technology that passes that test.

Relevant obligatory xkcd: [https://xkcd.com/2030/](https://xkcd.com/2030/)

~~~
mola
Maybe that's his point? That he assumes this is unusable unless he has strong
indication otherwise. Also, he said _this_ technology, not any technology.

------
x87678r
Seems dumb; facial recognition has a lot of potential. I love entering the UK
with facial recognition and a passport and going straight through the border,
no lines, no talking.

~~~
eranima
Giving up liberty for convenience sounds really dumb.

~~~
x87678r
Give up liberty, lol. Many places basically already have facial recognition,
just with humans doing it.

