
EFF and Eight Other Privacy Orgs Back Out of NTIA Face Recognition Talks - jdp23
https://www.eff.org/deeplinks/2015/06/eff-and-eight-other-privacy-organizations-back-out-ntia-face-recognition-multi
======
ChuckMcM
Hmm, an alternative strategy might be to start an EFF program to photograph
and add to a facial recognition database every law enforcement officer. Have
volunteers take pictures of people showing up at the police academy, and
people in uniform doing their job, and people going into and out of employee
only entrances to law enforcement facilities. They can make this data base for
DefCon 'spot the fed' sessions and for other interested parties.

That will certainly keep the conversation going, although in a less civilized
way. Perhaps it could bring enough pressure to bear on the NTIA to get them to
reconsider basic privacy safeguards.

~~~
coleca
I believe the EFF and the other consumer groups are grandstanding to get some
publicity in this case. I have listened to almost all of these sessions that
the NTIA facilitated, and the company I work for even presented at one of them
last year. I think it's important to get some points out which I haven't seen
in any of the other comments:

1. The process led by the NTIA, unlike the bulk of the EFF's post, has
nothing to do with the government's collection and use of facial recognition
data. This process was designed to come up with a voluntary guideline that
commercial facial recognition companies could choose to adhere to or not. I
agree that limits need to be set for government uses, but this process from
the beginning was never about that. The NTIA did not have jurisdiction for
anything involving the federal government, law enforcement, military, etc.
This is more akin to the voluntary privacy standards that companies on the web
will agree to on their own. Once they do, if they violate their own policy
they are subject to sanction by the FTC. The guidelines coming out of this
process were intended to be voluntary, so companies would not have to comply
if they didn't want to. However, the idea was that the working group would
come up with something that the industry would volunteer for and by doing so
give some level of protection to consumers.

2. The industry representatives were not opposed to opt-in in some scenarios,
but they were opposed to mandatory positive opt-in across the board without
exception. There are a number of scenarios where opt-in doesn't make sense.
For example, imagine a system that used facial recognition in a retail setting
to detect shoplifters and alert management to their presence. If you had been
caught stealing in the store in the past, you would be enrolled into the
system by the retailer. Each person entering the store (as seen on their
already existing surveillance systems) would be checked against that database
of shoplifters. How realistic would it be for the retailer to get written opt-
in consent from the criminal?

3. In previous sessions, the consumer advocates had agreed that face detection
was out of scope, while face recognition and facial analysis were in scope for
the guideline. At the last session, they changed their position and wanted to
bring face detection into scope. The difference: face detection is just
finding a face in an image, like your smartphone does in the camera app,
whereas face recognition is identifying whose face is in the picture (Who is
this? Is this Bob?). Can you imagine getting opt-in consent to draw that
little yellow box around the faces in the camera app?
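To make the distinction concrete, here is a toy sketch. Everything in it is
made up for illustration (the embeddings, names, bounding box, and threshold
are arbitrary): detection only answers "is there a face, and where?", while
recognition compares a face against an enrolled database to answer "who is
this?".

```python
from math import sqrt

# Toy enrolled database: each identity maps to a made-up face embedding.
# A real system would produce these with a trained model.
enrolled = {
    "bob":   [0.9, 0.1, 0.2],
    "alice": [0.1, 0.8, 0.3],
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def detect(image):
    # Detection: locate faces, no identity involved.
    # Stubbed here as one fixed bounding box (x, y, w, h).
    return [(40, 30, 120, 120)]

def recognize(face_embedding, threshold=0.9):
    # Recognition: find the nearest enrolled identity,
    # or None if nothing clears the similarity threshold.
    name, score = max(
        ((n, cosine(face_embedding, e)) for n, e in enrolled.items()),
        key=lambda t: t[1],
    )
    return name if score >= threshold else None

print(detect(None))                   # a box, but no identity
print(recognize([0.88, 0.12, 0.21]))  # very close to "bob"'s embedding
print(recognize([0.5, 0.5, 0.5]))     # ambiguous, below threshold
```

The opt-in question only bites on the `recognize` step, since that is where an
identity (and a database of identities) enters the picture.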

4. The NTIA did an outstanding job of bringing this process together. They
organized the sessions and brought in numerous speakers from
both sides (consumer and industry) as well as from the education and research
sectors. We even had representation from Canada that spoke to a similar
process they went through there. The NTIA wasn't forcing any standards on
anyone, they simply offered their services to organize and facilitate it as
they have done in other similar multi-stakeholder initiatives. The process
went on far longer than the NTIA (and any of the participants) had expected,
but it was making slow progress.

I don't know if this process is now dead or not. I suspect it will go on, only
now the consumer groups won't have a voice in shaping it (by their own
choice), which is a shame. It's also possible that the show of force by the
consumer advocates will push the industry to bend more in their favor. Either
way, it would be good for consumers to get some kind of guideline out there.
It could always evolve and be revisited in the future as the technology
changes.

~~~
belorn
Facial recognition software provides a probability that a record and an image
are of the same person, and that process will without exception create false
positives. An anti-shoplifter system would thus need several safeguards to
avoid turning into a no-fly list.

How would that system work in a fair society governed by law? Would suspects
not found guilty in a court of law be in it? How would false positives (that
is, people the computer incorrectly matches to someone else) get a chance to
clear their name (face) and get compensation when it happens? Should criminals
who have served their time still be locked out for life and denied entry to
stores that share the system?

As with vigilantism, I have a hard time seeing how such a system could be
created without massive abuse. Let's assume that 1% of scanned people will be
false positives, and say 5% of records will be falsely added by clerks (they
are not trained to be cops or judges, and some will abuse the system to
intentionally hurt people they know). How many people would be hurt by this
system if it was shared by the majority of stores in a large town?
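A back-of-envelope calculation using those assumed error rates makes the scale
visible. The visit count and watchlist size below are invented for
illustration; only the 1% and 5% figures come from the paragraph above.

```python
# All figures are assumptions, not measured data.
daily_store_visits = 50_000   # scans/day across participating stores (assumed)
false_positive_rate = 0.01    # 1% of scans misidentified (from the comment)
watchlist_size = 2_000        # enrolled shoplifter records (assumed)
bad_record_rate = 0.05        # 5% of records falsely added (from the comment)

# People wrongly flagged each day simply because the matcher errs:
false_flags_per_day = daily_store_visits * false_positive_rate

# Records that should never have existed at all:
bogus_records = watchlist_size * bad_record_rate

print(false_flags_per_day)  # 500.0 wrongful alerts per day
print(bogus_records)        # 100.0 people enrolled by mistake or malice
```

Even with these modest assumed numbers, hundreds of innocent people would be
flagged every day, which is the base-rate problem the comment is gesturing at.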

------
arca_vorago
This brings up something I've been wondering about for a while: the legality
of masks and other devices for identity protection. Supposedly, due to
protests and Anonymous activity, there has been an increasing push to make
publicly wearing masks illegal, but I think that's a slippery slope. If in a
few years I have to assume that whenever I'm in public my face is being fed
into recognition systems, why should I not have the right to wear a mask or
something similar?

Of course the knee-jerk argument is that wearing masks (both digitally and
offline) creates a different personality that is more willing to engage in
illegal activity, but for me, a staunch Constitutionalist, that still doesn't
justify making it illegal.

This is, at bottom, about the reduction of anonymity in the real world as much
as in the digital one.

One more tool of control in the belt of the oligarchy.

~~~
happyscrappy
Isn't it better to work on having a government you can trust rather than
fooling yourself by building a castle made of sand in the path of an incoming
tide?

~~~
cracell
Why not both?

We should retain the right to conceal our identity in reasonable ways and
governments should enact laws that protect their citizens privacy from
biometric recognition technologies.

~~~
happyscrappy
>Why not both?

Reality. Information leaks all over the place, trying to legislate your way to
privacy is like trying to bail out a boat with a fishing net. Public
information is by definition public, so we could all walk around with wide
brimmed hats and masks or we could be realistic about the challenges ahead.

~~~
cracell
What? With that attitude do you not believe in regulating anything? No laws
are perfect but that's not a valid argument against having them.

There's a big difference between a company legally operating a facial tracking
system and a company operating one illegally.

In the case of the illegal one, they have to hide it; it's much harder to
profit from and thus would be less widespread. In the case of the legal one,
they can freely advertise their services and easily set up their system
anywhere local governments don't create hurdles.

~~~
happyscrappy
You can make all the foolish laws you like and there will still be data and
cameras everywhere in public places. Where the data comes from is irrelevant;
it will be available.

Edit: Are you going to ban cameras in public places? Facebook? Cellphones?
Satellites?

------
mVChr
> Communities such as San Diego, California are using mobile biometric readers
> to take pictures of people on the street or in their homes and immediately
> identify them and enroll them in face recognition databases.

In their homes? I'm pretty sure this is illegal. Or is it not if they are
visible _from_ a public area?

~~~
astrobe_
Doesn't matter. In their homes, or in their gardens, or when they are going
out and have one foot on the street.

Illegal collection can't reliably be prevented. Get a photograph, extract the
biometric data, enter them in your database, delete the photograph. Job done.

Maybe the only way to get control back over this is to require that these
databases be open source (or publicly available) and make it illegal to hide
them. Citizens must also have the right to be removed from those databases.

~~~
Adlai
At what point do we stop trying to prevent unpreventable illegal activity, and
try to put systems in place for dealing with its undeniable existence?

I'm not certain how such systems would look in this case (everybody wears Guy
Fawkes masks in public?), but this debate reminds me of prohibition vs. harm
reduction in the War on [People Using] Drugs.

~~~
jfb
I don't think the comparison holds. It's not as though there have been
thousands of generations of face recognition fighting against a wrong-headed
prohibition; this sort of panopticon garbage is a clear violation of societal
norms. I don't think you should accept that misbehavior from artificial
persons is a given; even as we look for a technical solution, we should also
be pressuring legislatures to amend laws to prevent this.

~~~
Adlai
Does the "Right to be Forgotten" cover the datacenter neurologic?

------
rlvesco7
Who are the companies that are pushing this? The article, surprisingly,
doesn't say.

------
kefka
Yes, and?

With OpenCV, I already have face detection, face recognition, eye detection,
cascade creation (for detection of any feature I wish), a retinal model in
conjunction with retinal scanning via webcam, and other power tools in that
bag.

And it's all open source. It's easy enough, I made it myself:
[https://github.com/jwcrawley/uWHo](https://github.com/jwcrawley/uWHo)

~~~
forgottenpass
Who cares if it's easy or not? Consumer/passerby protection rules are not
going to stop you, a person with nothing to lose. This is about the use of
such technology in a commercial setting.

It's a bit like my pirate radio station. I have it, I broadcast to a very
limited distance, whatever, I don't really worry about the white vans. If I
put a 400 kW TV transmitter on top of a mountain, that's a very different
story.

~~~
kefka
Let's take the same analogy...

So, if I buy/build a 1GP camera array, and drive it around the city, capturing
all the data, "that's a very different story"?

Your example is in violation of the FCC. Mine violates nothing.

~~~
forgottenpass
_Your example is in violation of the FCC. Mine violates nothing._

Well, yeah, that was kinda the idea. My point is that it doesn't matter what
is technically possible if consumer protection rules were to come into place.
The regulatory environment would curtail the largest (ab)uses of the
technology as entities that would use the technology wouldn't want to take the
risk.

edit: So, under hypothetical regulation on facial recognition, a watchdog
agency isn't going to know/care about your driveway camera, but your business
based on using that 1GP array to sell person tracking and demographic data
would raise some eyebrows.

Was your point about the ease of implementation an expression that you'd do it
anyway if the regulations came into place? Put your business on the line
because you personally know how to implement it?

------
stox
Actually, you can change the dimensions of your face. It is very painful, and
I do not recommend doing it voluntarily.

