
Facial recognition: It’s time for action - benryon
https://blogs.microsoft.com/on-the-issues/2018/12/06/facial-recognition-its-time-for-action/
======
geephroh
This is a laudable first step in advocacy for real regulation of a technology
that already has huge impacts on privacy and civil society. I was in the room
for one of the meetings with Microsoft's senior leadership as a representative
of a Seattle-based civil liberties group. While our coalition would like to
see MS go further, it was quite clear that they take their commitment to
corporate responsibility and ethics around the AI issue very seriously. In
particular, they seemed to understand our concerns about how facial
recognition technologies can magnify existing biases in our criminal justice
system.

This stands in stark contrast to the meeting I attended on Tuesday with
Amazon's general counsel regarding their Rekognition service. There was a near
complete rejection of the idea that mass deployment of surveillance
technologies in today's largely unregulated environment posed any danger to
civil society. He also denied that Amazon had any responsibility for the
negative impacts of their AI/ML technologies, or role to play in industry
efforts to self-regulate.

~~~
vorpalhex
> He also denied that Amazon had any responsibility for the negative impacts
> of their AI/ML technologies, or role to play in industry efforts to self-
> regulate.

For a period of time, there was a lot of chatter about all developers taking
some kind of professional oath, like doctors. Many of the approaches that were
proposed have issues (they would preclude working on smart-weapons programs or
lawful surveillance).

I wonder if what we really need is a developers oath along the following:

"Anything I build can and will be abused. I am responsible for my designs, for
my products, and for the data I collect and store. If my technology is used
for evil, I am responsible."

~~~
meesles
I don't know if the oath comparison works out so well for development. Let us
not forget that each physician is directly responsible for a life every time
they see someone. They also have a more direct effect on the success or
failure of a procedure, as they tend to perform a large portion of the work
themselves.

I'd argue few companies ever reach this level of risk, and those that do are
so large that the individual contributor cannot reasonably take on that burden
of responsibility.

In the example of some Amazon surveillance 'big-brother' software: Max the
software dev is just making facial recognition software to the best of their
ability. They aren't privy to the motivations, long-term plans, and potential
consequences of those decisions.

The oath is always a fun topic to discuss, though. In reality it holds no
meaning except to the one who takes it. Correct me if I'm wrong, but I doubt
malpractice cases cite the oath as evidence, since all students are
essentially required to recite it.

~~~
pixl97
No snowflake is responsible for the avalanche, therefore I must pile on as
much snow as possible -- The modern developer.

~~~
vallavaraiyan
But the snowflake doesn't know that it is falling on a mountain slope.

~~~
ben_w
Individual snowflakes are not capable of abstract reasoning.

Can we make it a requirement for ourselves to limit our power to our ability
to keep that power safe?

I think that’s a superset of the problem of incentive alignment in AI safety,
so probably not... but we also shouldn’t let the perfect be the enemy of the
improved.

------
PaulKeeble
Biometrics are creeping into everyday life. One of my local gyms this week
switched to requiring fingerprints; without them you are barred from access.
Another local gym uses facial recognition for entrance, although you can
choose to have a member card instead if you ask for it directly; they don't
list it as an option.

Thankfully in the EU we have GDPR. It treats biometrics with a sensitivity
similar to medical data, so unless you genuinely need them (a hospital,
perhaps), you can only collect them with explicit consent. If consent is not
given, that cannot bar you from service.

So I reported a company to the ICO this week for introducing fingerprint
scanners, and was assured they consider it a breach and will deal with them.
GDPR isn't perfect: I think defaulting to consent is wrong, and alternatives
must be called out. But you can't stop people sleepwalking into this; it is
very convenient.

~~~
seanf
My gym also uses facial recognition, but does it via a person sitting behind a
desk checking that my member ID matches my face. I don't think many people are
uncomfortable with this process, and this biometric method has been used for a
long time.

~~~
Nikaoto
Yes, but that person can't copy your biometric data stored in their brain,
convert it to a standardized format, and distribute it to millions of other
devices.

~~~
seanf
You mean like a photo?

~~~
Nikaoto
They would have to use a camera, which you would have to allow.

~~~
nathancahill
Every gym I've joined snaps your picture when you sign up.

------
xg15
This sounds like a classic pacing-and-leading piece, meant to take the lead in
regulating a field before actual hard regulations are passed. So while the
demands for restrictions on government use are (correctly imo) very strict,
for private entities we get this:

> _From the moment one steps into a shopping mall, it’s possible not only to
> be photographed but to be recognized by a computer wherever one goes. Beyond
> information collected by a single camera in a single session, longer-term
> histories can be pieced together over time from multiple cameras at
> different locations. A mall owner could choose to share this information
> with every store. Stores could know immediately when you visited them last
> and what you looked at or purchased, and by sharing this data with other
> stores, they could predict what you’re looking to buy on your current
> visit._

> _Our point is not that the law should deprive commercial establishments of
> this new technology. To the contrary, we are among the companies working to
> help stores responsibly use this and other digital technology to improve
> shopping and other consumer experiences. We believe that a great many
> shoppers will welcome and benefit from improvements in customer service that
> will result._

> _But people deserve to know when this type of technology is being used, so
> they can ask questions and exercise some choice in the matter if they wish.
> Indeed, we believe this type of transparency is vital for building public
> knowledge and confidence in this technology._

So they don't actually advocate that you should get a right to privacy or a
right not to be profiled once you enter a store.

Instead, you get a right to opt-out of profiling by not ever entering any kind
of store again.

~~~
taf2
Agreed. It’s sales 101: help the customer write the RFP.

------
dontreact
This is the first instance I've seen of the tech industry calling for
regulation. I admire this and the idea that people running a corporation can
understand that despite their best intentions, in the long run the corporation
will act to maximize profit via legal means, even if an action is not in the
best interest of society. And so in some cases we need to make certain things
illegal. I would love to see a company do this sort of thing for a broad set
of tax loopholes as well, for example.

~~~
johannes1234321
There's always a call for regulation when a competitor has a product which is
ahead of you in some way and you, for whatever reason, can't or won't compete.
See for instance Tim Cook's recent comments about GDPR and privacy.

~~~
marcinzm
In this case it's almost the opposite I think. MS has good technology and that
is why they would want regulation. If there's regulation then any potential
competitor will now have a much harder time creating a competing product.

------
qwerty456127
Governments are the first in the line to abuse facial recognition.

Now that CCTV cams are everywhere, everybody should have the right to wear a
mask everywhere without being discriminated against.

~~~
21
> _everybody should have the right to wear a mask everywhere without being
> discriminated against._

It's a losing game. They will track you with your mask on from the moment you
leave your house: your phone, your credit card, your gait, your car... It's
like a supercookie: if you don't delete all ten places it was stored, a single
missed one will be enough to regenerate it.

Total citizen surveillance is coming, everyone's location history will be in a
database and kept for years, just like phone call metadata.

~~~
qwerty456127
Nevertheless, when somebody stole my bike from a bike parking area (where
there were no other bikes and no crowd) right under a security camera near a
fancy shopping centre, the police couldn't find it; miraculously, they
couldn't even find it in the camera footage, as if both the bike and the thief
were invisible.

~~~
marnett
Most shops and private facilities have motion-activated video to reduce
storage volumes. There is massive variance in how good these are at actually
turning on and recording given some movement.

Perhaps some city-owned CCTV cams are always on, but I'd be doubtful.

------
spanktheuser
I'd like to see some real teeth in their implementation. For example:

* Rather than merely forbidding biased uses in their TOS, an internal team should review relevant source code & use cases of anyone implementing MSFT facial recognition, à la Apple's App Store.

* Build apis, libraries and easy-to-use tools that allow consumers to destroy their face data.

* Increase the concentration of pressure on Amazon by refusing to engage in the race to the bottom. Specifically, refuse to license facial recognition technology to law enforcement, military, or intelligence agencies until such time as they have independent civilian oversight, direct neutral-party monitoring, transparency, and demonstrated accountability for misuse.

MSFT (and any corporation) is fundamentally untrustworthy. Principles are
easily changed or ignored. Instead, they should begin creating institutions,
code, and business process that make abuse difficult. Testing tools and APIs
are the right idea - more of this approach please.

------
samstave
Facial recognition should be expressly illegal. Period.

As should license plate surveillance.

And for those who think that license plate surveillance should be legal: what
you may not know is that municipalities are mandating that private
corporations install license-tracking cameras on their facilities and report
back to the municipality who is driving by that address. Menlo Park is just
one such municipality.

~~~
AlexCoventry
It's a computation. Laws against it are unenforceable in general, absent
draconian restrictions on computing devices.

~~~
samstave
This is HN. I fucking get it.

Facial recognition surveillance technology deployed in any public sphere
should be expressly illegal. Period.

~~~
lokidokiro
You clearly don't get it. What if facial recognition technology can be
replaced with body recognition technology? Do you ban all forms of recognition
technology? That would criminalize a lot of legitimate uses for machine
learning.

------
mc32
This is where the ACLU should spend their time and effort rather than their
political meanderings. This is politically neutral, it affects everyone and
Americans should enjoy some basic rights in this realm. Govern how private
entities use this tech and data. Regulate official use so it’s not abused,
etc.

~~~
johannes1234321
Politically neutral doesn't exist. Regulation invades the freedom of
corporations. I support that, but I'm also left of center. (For some
definition of left and some definition of center.)

~~~
mc32
Yes, I suppose. I still think this would have bipartisan support in Congress,
as well as support from the public, regardless of affiliation.

------
tgsovlerkhgsel
"The law should specify that consumers consent to the use of facial
recognition services when they enter premises [...] In effect, this approach
will mean that people will have the opportunity to vote with their feet"

What this will really mean, in effect, is that facial recognition will be
widespread, legitimized, and unavoidable unless you want to live like a
hermit, just like CCTV today. The only way this could potentially be avoided
is targeted protests at the first stores adopting it.

The post does have some laudable positions and arguments against government
surveillance using facial recognition, but I'm not sure how useful this is if
private actors build even more powerful databases and offer them for sale to
the highest bidder.

------
sigacts
This is the same guy who recently vowed to provide any/all of MSFT's AI
technologies to the Department of Defense:
[https://www.google.com/amp/s/www.nytimes.com/2018/10/26/us/p...](https://www.google.com/amp/s/www.nytimes.com/2018/10/26/us/politics/ai-microsoft-pentagon.amp.html)

------
colordrops
I don't understand how this could be realistically regulated. It's passive
technology and its use can't be detected. It's like trying to stop people from
thinking.

~~~
vertexFarm
You could say the same about a credit card skimmer on an ATM. It's passive,
right? It's just a sensor that sits there and absorbs people's data. I don't
think this argument makes a lot of sense.

The point is, there needs to be some law and order in place so that when
people abuse this tech to harm people and society and get caught, there's some
precedent to stop them and punish them. It doesn't matter if the technology is
passive. The intent and action to use it to harm people are not passive and
are not analogous to thought crime at all.

There is dire need for regulation with a lot of emerging technologies right
now. We're building systems with enormous power which can break human society
if misused. I think the intangibility and "passivity" of this tech (or at
least how it is perceived) gives us a very false sense of security. Like how a
few decades ago very few average people could understand how the internet
might have a great impact on society. Obviously they aren't thinking that way
anymore.

Check out Charles Stross's speech at C3 about regulatory lag relative to the
accelerated nature of tech growth:
[https://www.youtube.com/watch?v=RmIgJ64z6Y4](https://www.youtube.com/watch?v=RmIgJ64z6Y4)

~~~
Nasrudith
The sense in which it's passive is that it doesn't do anything /to be
stopped/. With the credit card skimmer, it's the installation that is illegal,
along with the theft involved.

It is like declaring your city nuclear weapons free when the only players are
either above the law by jurisdiction or within detonation range already. Just
having the law on the books makes the city look stupid.

Facial recognition is a process that works on images - that makes it more
passive than even a sensor since there are definite precedents for 'not here'
with sensor recordings.

------
OliverJones
Without teeth--without meaningful consequences for violations--laws and terms
of service are meaningless.

What will MS do when their terms of service are violated?

A regulation forbidding use of this tech for discriminating against certain
people is worthless if it specifies a $10/day fine. Penalties have to be
significant, and life-changing for violators.

For example, HIPAA / HITECH specifies criminal penalties, and pierces the
corporate veil, for intentional violation of patient privacy.

Both are important. 1) the penalties have to be criminal, not civil. 2)
natural persons (not Romney persons) who break the law must not be able to
hide behind the limited liability of corporations.

A third step would be a bounty system for citizens bringing charges. The same
thing made the Clean Water Act enforceable in the 1970s-1980s.

Without enforceability like this, it's all chin music. Or even greenwashing.

------
mLuby
Think about the participants in a facial recognition system:

\- Targets (people)

\- Enablers (tech companies)

\- Stalkers (consumer companies)

\- Big Brother (governments)

Notably the only ones who cannot use the system are the Targets, because they
don't have the necessary scale. Being part of a system you can't use is
typically to your detriment.

~~~
bsenftner
FR is not out of reach for ordinary people. The algorithm of choice of the NSA
is available at a cost of about $1200 per seat, and it needs only an ordinary
PC to operate. This is no longer rocket science...

~~~
notjoemama
"$1200...ordinary people...needs an ordinary PC"

I'm guessing you had the US or Europe in mind rather than other parts of the
world like Argentina, Uganda, China and India. (ignoring the omission of
scale)

------
Nasrudith
Personally I think facial recognition is a symptom of an underlying problem:
evidentiary standards and liability. We have already had people effectively
murdered by bad evidence from win-seeking prosecutors taking any sort of
pseudo-science voodoo they can get, like bite-mark analysis. False positives
of any sort, like those from facial recognition, should be no different.
Similarly, AWS thinks that many members of Congress are already in jail.
Something that bad had no place being sold.

I could see holding the providers liable for bad detections as a good
precedent for calibrating caution in a rational way, although it is a bit 'eye
for an eye' for society's liking. Something that recognizes a face only 60% of
the time and says "Hi John" to Bob isn't a liability to anyone - really just
amusement. Locking someone out of their apartment and making them call a
locksmith because they got a bloody nose and facial recognition no longer
works is low stakes. Having someone potentially jailed for a long time would
tighten the confidence intervals appropriately if, say, the prosecutor were at
risk of death row or 300 consecutive years of sentencing. We would see people
very reluctant to work in forensics or prosecution if that were the case.
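That stakes-dependent calibration could be sketched as a toy rule (all
thresholds and category names below are hypothetical, not from any real FR
system):

```python
# Toy sketch: the acceptable false-match risk depends on the stakes, so the
# score a face match must clear should rise with the cost of being wrong.
# All numbers and category names are made up for illustration.

THRESHOLDS = {
    "greeting": 0.60,    # saying "Hi John" to Bob is merely amusing
    "door_lock": 0.90,   # a mistake means calling a locksmith: annoying, low stakes
    "evidence": 0.9999,  # a false match could help jail someone
}

def accept_match(score: float, stakes: str) -> bool:
    """Accept a face match only if its score clears the bar for these stakes."""
    return score >= THRESHOLDS[stakes]
```

So the same 0.95-confidence match that is fine for a greeting or a door lock
would be rejected as courtroom evidence.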

------
pasbesoin
When someone follows you around constantly, lurking and observing your
actions, that's called stalking. And it's illegal.

We are to the point where commercial entities and governments are full-on
stalkers. And that should be illegal, too.

Further, it is immoral. There is a fundamental hypocrisy when people are
exhorted to autonomy and personal responsibility -- "by your bootstraps",
"entrepreneur", "gig economy", etc. -- at the same time they are, with mass
surveillance, being left with none of this, in truth. Your every action
monitored, measured, standardized, and compelled to _conform._

You are left with no agency, save that granted -- left -- to you by the powers
that be.

And with everything recorded and stored, seemingly indefinitely, you become
self-monitoring, self-constraining. Will this be used against me in a year?
Five years? Does one slip or oversight last a lifetime?

By the way, do you see them taking action to swing the cameras, and the
monitoring, the other way? Even Obama, with ever more secrets and aggressive
prosecution of whistleblowers. The police, who have fought cameras and
monitoring for years. NDAs left and right, disparagement suits. On and on and
on...

And, I've gone on too much, here.

Never mind just the philosophy of the matter; look, too, at how it works in
practice!

Do we all want to spend not just our work hours, but our lives, in virtual
cubicles?

The post-modern panopticon.

------
matchagaucho
There are already several anti-bias and discrimination laws on the books. Why
does facial recognition warrant additional regulation?

Numerous processes already capture gender, race, and age.

Facial recognition seems like a better/faster tool for capturing these data
points. But the requirement to comply with existing laws is unchanged.

Any further regulation will only limit the development of facial recognition
technology to a few large players that can afford compliance and enforcement
measures.

~~~
int_19h
> There are already several anti-bias and discrimination laws on the books.
> Why does facial recognition warrant additional regulation?

Mainly because many companies doing it are arguing that when their models
produce biased results, it's not their fault, it's just "computer thinks that
way". So far as I know, this approach hasn't been properly tested in court,
but it might just fly, if courts decide that you need to have intent to
discriminate (and that training on real-world datasets, that are always
implicitly biased, does not constitute such intent).

------
numair
(Re-posting my comment on this link from yesterday)

This is really interesting. I wonder how much of this is real, versus PR
(although, Brad Smith has an excellent track record in this area). The company
that has the most to lose, were there to be real regulations concerning facial
recognition, is actually a company in Microsoft’s investment portfolio. That
company has built the world’s largest database of face and identity
information. Facebook.

------
vkou
This is a time for action, not just for facial recognition, but for every
algorithm.

Every decision made by an algorithm should have clear inputs, clear criteria
for interpreting those inputs, and a judgement that can be disputed. If you
can't get your black-box neural network to do that, then perhaps it shouldn't
be making life-changing decisions for other people.

There's a startup doing sentiment analysis of social media posts to measure
how 'risky' a babysitter is, and it likely does so in a biased manner. [1]

It's illegal to use such a system for purposes of vetting an employee, yet
their entire business model revolves around families using them to vet
babysitters.

[1] [https://gizmodo.com/predictim-claims-its-ai-can-flag-risky-b...](https://gizmodo.com/predictim-claims-its-ai-can-flag-risky-babysitters-so-1830913997?fbclid=IwAR0k70VnKmz5ziUnm7Pu1RrB0oJB75tXNvgAk8OPUiiCwyE-6uldLYVvi44)

------
mimerme
_Enabling third-party testing and comparisons. New laws should also require
that providers of commercial facial recognition services enable third parties
engaged in independent testing to conduct and publish reasonable tests of
their facial recognition services for accuracy and unfair bias. A sensible
approach is to require tech companies that make their facial recognition
services accessible using the internet also make available an application
programming interface or other technical capability suitable for this
purpose._

Does anyone else find this interesting? Is Microsoft trying to keep their
facial recognition algorithm on top by comparing theirs to others'?

------
simplysimple
A blatantly obvious attempt at regulatory capture. If there's anyone I'd guess
is already doing what they warn against in this "open letter", it's MSFT.

------
freeflight
Maybe I'm a bit too cynical, but I guess MS isn't that heavily
invested/successful in facial recognition technology, which is why it can
posture like this?

Because this feels a bit reminiscent of the "G-man" campaign, which might have
been fun and had a point, but a point that seemingly got lost along the road
of "Windows 10 as a service/storefront".

At least you have to give it to MS PR: they seem to know what's on people's
minds and are rather good at appealing to that.

------
Beefin
So MSFT was actually ranked the most accurate facial recognition algorithm in
the world by NIST. It’s interesting they’re just now talking about this (after
those results were published).

~~~
sib
That's the perfect time to call for regulations to make it harder for your
competitors to catch up.

cf Regulatory Capture
([https://en.wikipedia.org/wiki/Regulatory_capture](https://en.wikipedia.org/wiki/Regulatory_capture))

~~~
Beefin
Brilliantly malicious

------
GarrisonPrime
Microsoft gains the social kudos for speaking up (vaguely) about some "issue"
or another. Meanwhile, actual development of the scenario will occur
regardless.

------
tareqak
I understand there is some overlap and some divergence between the perils of
poor privacy practices and pervasive facial recognition, but doesn't
addressing the former help with the latter somewhat?

It might seem like I'm trying to be on the side of the perfect against the
good, but there is room for both efforts without stifling either. A holistic
approach to privacy in general would help inform the values necessary for the
responsible use of facial recognition technology.

------
cmsonger
The TL;DR: MSFT says "You can't trust any of us, so you'd better regulate us
now."

... or they are just mad that I refuse to give them a LinkedIn photo. :)
Supposed to be funny, but also serious. Every time they ask for one, and then
ask why not, I tell them it's because they are fundamentally untrustworthy.
And they are untrustworthy, as is every public for-profit business whose
officers carry a fiduciary responsibility to shareholders.

IMO this position is well written and these two sentences succinctly
articulate the situation:

"In particular, we don’t believe that the world will be best served by a
commercial race to the bottom, with tech companies forced to choose between
social responsibility and market success. We believe that the only way to
protect against this race to the bottom is to build a floor of responsibility
that supports healthy market competition."

~~~
bsenftner
This is MSFT saying "we can't compete here, and apparently can't buy the
leaders, so we must legislate, because we, MSFT, are not in control."

------
arendtio
Cool! How about Microsoft taking their own advice to heart and building
services (e.g. Cortana) that run on-device?

------
mpolichette
This is really interesting and I agree, but I think _only_ addressing facial
recognition is missing the forest for a tree.

There is an underlying idea here of a person owning information about
themselves, having control over it, and making sure a company cannot use it
inappropriately. I think addressing it as only 'facial recognition' wouldn't
go far enough.

~~~
wmf
They're not missing anything; by regulating hot topics individually they can
prevent sweeping GDPR-style privacy regulation.

~~~
mpolichette
I am not convinced GDPR is a bad thing yet... What makes you think it is?

~~~
wmf
I like GDPR but it's probably bad for Microsoft.

------
jameslevy
Part of the problem with "algorithmic accountability" is that you need a way
to verify that the software that is being used is identical to the one that
has gone through the audit. With open source software you can do this with
checksum verification. Is this type of verification something that any AI or
facial recognition software has provided?
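For open source, the verification step itself is simple; a minimal sketch
(where the published digest would come from the audit report, and the function
names are hypothetical):

```python
import hashlib

def sha256_of(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def matches_audit(path: str, published_digest: str) -> bool:
    """True only if the deployed artifact is byte-identical to the audited one."""
    return sha256_of(path) == published_digest
```

The hard part for AI systems is that "the software" includes model weights and
training data, all of which would need to be covered by the published digests.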

------
jacksproit
A bit odd coming from a company that harvested so much facial data, no?
[https://www.lifewire.com/website-that-can-guess-your-age-348...](https://www.lifewire.com/website-that-can-guess-your-age-3486143)

------
jonthepirate
I would love to have an alarm go off every time someone enters my mailroom
after 10pm wearing a hoodie with the hood drawn over their head. Do we have
that technology yet? I would be your customer.

------
trumped
Facial recognition is probably the worst because it is harder to avoid, unless
you are Muslim, but cellphone tracking is very bad:
[https://www.theregister.co.uk/2018/12/05/mobile_users_can_be...](https://www.theregister.co.uk/2018/12/05/mobile_users_can_be_tracked_with_cheap_kit_aka_protocol/)
(Bluetooth and WiFi are also pretty bad.)

------
jeffrogers
The same should be said of the online advertising industry.

------
Animats
From the article: _" The law should specify that consumers consent to the use
of facial recognition services when they enter premises or proceed to use
online services that have this type of clear notice."_

So that's what Microsoft really wants: allowed everywhere, minimal notice, no
user ownership of data, and no opt-out.

------
mothsonasloth
Whenever this topic comes up I think of Judas Priest's song "Electric Eye".

------
CobrastanJorji
This is great. I congratulate them and thank them for outlining specific
policy objectives.

My immediate next thought, though, is that Microsoft operates its "Cognitive
Services," including facial recognition, in China. That's worrying, even if
Microsoft would loudly prefer that governments generally pass nice privacy
laws.

------
bkmeneguello
"Let's make something new, which not even the scientists know where it will
lead! But first let's create a ton of bureaucracy and regulations so only a
few super-giant tech companies can participate in making these regulations,
with no ulterior intentions..."

------
alexnewman
As someone born in America, it amazes me that America sticks its head in the
sand while China creams them by embracing tech. This is old tech that is
already underused in America.

------
sbhn
You need a catchy slogan: “Your Face, Somebody Else’s Money”

------
mbrumlow
I have to ask: are those crazy wireframes really part of how facial
recognition works? If so, are they real for the people in the photo?

------
jillesvangurp
This document is very thorough, yet it misses a key point. IMHO, the primary
concern here is not the algorithms that produce outcomes, and regulating
those, but what happens to the resulting data streams, as well as the raw
input to the algorithms. Simply storing huge amounts of video footage means
you can later run algorithms as they improve or become available. It's the act
of storing and exchanging data that is problematic, not running algorithms
against that data. You might even capture data before algorithms are
invented/available to process it, or run new algorithms against historical
archives to extract new information.

What are the implications of the right to be forgotten for, e.g., security
camera footage that is stored for weeks or months or even longer? The raw data
contains personally identifiable information, given the right algorithms and
access. Mining and correlating information from raw data that so far is
considered innocent changes the game completely. Somebody with access to ISP
data logs, cell tower logs, and security footage in buildings, on streets,
etc. can start looking for patterns in that data and cross-verify and
correlate events between data sets.

Recent history in China teaches us that this is neither science fiction nor an
imagined risk. China is quite openly implementing mass surveillance using all
means available to them and they are already using it to control their own
citizens.

IMHO the focus for legislation should be on legislating not the right to be
forgotten, because that is inherently hard to enforce and impractical, but
instead on ensuring data capturing parties stick to strict rules with respect
to auditing and securing access to that data, retention, transparency of their
policies regarding all of this, and making sure serious violations have
consequences.

Additionally, any surveillance data used in a court of law needs to have an
impeccable auditing record proving it was captured and handled lawfully. One
critically important aspect here is also ensuring the data has not been
tampered with in any way: algorithms to falsify information are also becoming
a risk. Capturing data that may be used for privacy invasive analysis creates
a duty to adequately handle and protect that data. This should not be
optional.

The key to enforcing this is sousveillance: observing the observers. This
should be both legal and common. In a world of multilateral surveillance by
multiple groups of people, companies, and governments, hiding undesirable
behavior/actions is going to be increasingly hard, and you can never be sure
that nobody is watching, whether you are a citizen, the chief of the secret
police, or a head of state. Surveillance is a double-edged sword, and abusing
it might be observed by others. The 1984-type scenarios always assume a single
evil government doing all the surveillance in secret. So we should make sure
governments themselves are equally exposed to surveillance, in order to ensure
that they do not abuse their level of access to surveillance data.

------
nmcfarl
Are

------
herrosheep
There needs to be a device that you can jam under your chin to disfigure your
face for some period of time. Granted this would not be legal or pain free,
but might be required in special situations like misidentification. I could
absolutely see a device like this being sold on the black market in the
future.

~~~
bsenftner
All one needs to do is wear a hat with a blinking LED, or illuminated
eyeglass frames: the camera tries to compensate for the bright light, which
degrades the color/illumination space of your face. This defeats almost all FR
algorithms because it destroys the image inside the camera before the FR
algorithm can begin.

------
partycoder
Too late for China.

~~~
justicezyx
These concerns also apply to China, at least to a certain degree (or to a
lesser degree from a Western-centric perspective) and with different emphasis.

The point is: what is being asked for here is to steer a technology toward the
greater good of society, and that is the government's responsibility.

It has little to do with China or geopolitical tensions. At the end of the day,
every nation needs to choose the right path for itself. If they believe some
other nations are threatening their ideology, they should build defenses, or
sometimes offenses, to counter that.

In summary, this statement is about as useless as a random chant of "communism
is bad".

~~~
vertexFarm
Kind of squirrely phrasing going on here: "If they believe some other nations
are threatening their ideology, they should build defenses, or sometimes
offenses, to counter that."

They "believe" their ideology is threatened--so it's okay to clamp down on
their own citizens, representing over a billion human beings, censor and
intercept all communications, edit out government atrocities and historical
facts like Tiananmen Square, and spy on every citizen on a profound and
constant panopticon level? This isn't about western or eastern perspective.
This kind of philosophy justifies the actions of any dictator or autocratic
fascist. This whole obsession with a perceived national threat is the prod
that pushes through all sorts of oppressive policies.

We in the west also have severe problems with this. The whole post-9/11
mentality has been all about yielding personal dignity and rights and basic
privacy for the sake of assuaging some vague (and mostly imaginary) threat. We
also are building a massive panopticon, we just aren't as far along as China
is at the moment, and our machinations of totalitarianism are privatized
instead of powered by the state. We should pay very close attention to what's
going on in that culture, because we may need those lessons sooner than we
think in our own future--even if they come in a slightly different form.

~~~
justicezyx
> This isn't about western or eastern perspective. This kind of philosophy
> justifies the actions of any dictator or autocratic fascist.

You read too much into my statement.

What I said is that:

- The US and China will have their own strategies for their own ideologies.

- They might clash if one or both think the other threatens their ideology.

I said zero about which is legitimate, or offered a standard for making
judgments.

As for the complaint about the OP: the statement is contextless. I was merely
extending along the most likely line of thinking based on my experience.

------
trophycase
I could've sworn just a few days ago I saw a story about MS working with the
government for such things.

~~~
cmsonger
That's probably partly why. They understand up close and personal what they
are being asked to do and the money they can make for doing it.

------
laythea
Hang on, hang on. Let's not get too carried away here. The big corps are all
scared after what has happened (and is happening) to Facebook, and this is an
attempt to "go down the unsteady river together" with the authorities.

Now, in an ideal world, where governments are ideal, this is a good thing
for the population, as the government theoretically represents the people.

However, the government is run by people and is not ideal. This is not a good
thing, and will allow all sorts of actual abuses of power to occur in the long
run.

I read this as the tech industry trying to get the government in with them
(even more than it already is), in my opinion.

~~~
porphyrogene
Brad is very specific about the problems with facial recognition and he
generalizes an approach to regulation that seems sound. The argument that the
government is always going to abuse power and should therefore not attempt to
regulate anything is a rallying cry of politicians who were hired to run for
office to stop regulation at all costs. There is skepticism and then there is
cynicism. Skepticism serves the purpose of finding realistic solutions while
cynicism serves the purpose of finding a reason to justify doing nothing. When
someone in government proposes that a problem be solved by doing nothing you
can be sure that they don't really work for the people. We can't fix
environmental problems by shutting down the EPA, we can't fix public schools
by reducing their funding, and when a company reaches out publicly to talk
about regulating its own industry, we can't prepare for the future of that
industry by telling it to stay in its lane.

------
SlowRobotAhead
>We believe it’s important for governments in 2019 to start adopting laws to
regulate this technology.

 _We worked on facial recognition for decades, recently added it to our
console and Windows 10 Hello, and built a decent API into our cloud... but um,
now we need governments to stop us, please?_

Ok, I mean... I guess I don’t understand big-company speak. I didn’t see
anywhere that they are shutting down their own offerings.

The concern about a _“race to the bottom”_ in tech is interesting. A cynical
person might read it as self-interest: MS was plenty OK with it when they had
little competition, but now that just about anyone can do it, they see the
problems.

Not that I disagree that facial recognition can be abused; it will be, if it
isn’t already. They got it right with the genie-in-a-bottle metaphor. I don’t
trust governments to regulate themselves, and I expect the largest abuses to
come from them. I highly suspect my time this year in Singapore was ‘heavily
calculated’ everywhere I went.

~~~
zamalek
Regulation isn't the same thing as prohibition. Biometric sign-in wouldn't be
covered by their suggestions, probably because biometric sign-in is not
inherently an abuse of AI/ML. It certainly can be, but covering that would
require prohibiting AI/ML in its entirety, even at home or in academia.

~~~
SlowRobotAhead
>Regulation isn't the same thing as prohibition

True, but it’s almost always the first step toward prohibition. Which isn’t to
say I think prohibition is possible in this case, just that asking for
government involvement is, at best, a double-edged sword.

As far as I know, the Azure Facial Recognition API goes well beyond signing in
to things.

~~~
geephroh
I would recommend reading Oren Etzioni's piece in this month's Communications
of the ACM, "Should AI Technology Be Regulated?: Yes, and Here's How."[1] He
lays out some very specific, practical ways that regulation can work, while
minimizing impacts on innovation.

[1] [https://cacm.acm.org/magazines/2018/12/232893-point-should-ai-technology-be-regulated/fulltext](https://cacm.acm.org/magazines/2018/12/232893-point-should-ai-technology-be-regulated/fulltext)

