
IBM no longer offers general purpose facial recognition or analysis software - TakakiTohno
https://techcrunch.com/2020/06/08/ibm-ends-all-facial-recognition-work-as-ceo-calls-out-bias-and-inequality/
======
strogonoff
The rumour that IBM is dropping “all facial recognition work” is
unsubstantiated, despite making its way into industry headlines.

Krishna’s letter is here[0]. IBM will cease to sell related products and
services. One might speculate they might resume sales once there are strong
regulations and limitations in place.

[0] [https://www.ibm.com/blogs/policy/wp-content/uploads/2020/06/Letter-from-IBM.pdf](https://www.ibm.com/blogs/policy/wp-content/uploads/2020/06/Letter-from-IBM.pdf)

~~~
saint-loup
Quoting from your link:

>>> IBM no longer offers general purpose IBM facial recognition or analysis
software. IBM firmly opposes and will not condone uses of any technology,
including facial recognition technology offered by other vendors, for mass
surveillance, racial profiling, violations of basic human rights and freedoms,
or any purpose which is not consistent with our values and Principles of Trust
and Transparency. We believe now is the time to begin a national dialogue on
whether and how facial recognition technology should be employed by domestic
law enforcement agencies.

Even if the phrasing is limited to "general purpose" and "software", it still
seems a pretty strong stance to me.

~~~
rurban
I'm starting to like them more and more. Power as only unbackdoored modern
CPU, and now this. Seems to be a big marketing push, in the right direction.

~~~
wrkronmiller
> Power as only unbackdoored modern CPU

Can you provide more details/proof?

~~~
jedieaston
I think they are referring to it not having Intel ME or TrustZone (which is
present in AMD and ARM chips), which could allow the processor manufacturer
to run code that you don't know about. Of course, there are no Power systems
for consumer use, so we're kind of stuck with Intel/AMD (until Apple releases
those ARM Macs, which probably can't boot Linux :( ).

~~~
thinkmassive
Raptor Computing Systems makes the Talos II and Blackbird, both with POWER9
CPUs and available for purchase by consumers today.

------
caymanjim
There are a number of cynical comments here about how they weren't making
money on the technology and are just announcing this for PR reasons. Well,
maybe, but isn't that sort of cynical response even worse?

I'm rabidly against the use of facial recognition on unwilling subjects,
whether it's a government actor (by far the most oppressive use) or a
corporate actor. I'm rabidly against public space cameras. I want to see this
technology die and never return.

We all love to pounce on companies for doing things we don't like. Why don't
we celebrate this as the victory it is? Of _course_ there's a PR component
here. Why wouldn't they make an announcement? Why wouldn't they do it now when
the audience might be more receptive to the idea? The fact that they weren't
making money off it is unlikely to be the only reason they're canceling it.
IBM plays the long game, and there's absolutely a market for this technology.
A huge and profitable market. They could have kept at it and turned a profit.

So, yeah, they're trying to make some hay, but not every corporate action is
purely cynical and evil. Let's appreciate that they've made a positive change,
and let's hope that it increases awareness of a horrible technology, and puts
pressure on the more egregious actors like Amazon and the defense industry. We
don't have to pat IBM on the back, but we can cut them some slack.

~~~
neximo64
It's just software, if an actor wanted to use facial recognition they can
write it themselves. Isn't the correct attack vector regulation vs having a
company pull its products? If IBM isn't doing it someone else will and it'll
just be less competition, meaning a better funded product for the industry
winner.

~~~
caymanjim
The primary customers for this technology are governments. They want it.
They're not going to regulate it away from themselves. At best they'll
regulate the far less dangerous civil use so as to pay lip service to concerns
and amplify their "think of the children" misdirection. They can't write it as
quickly or as effectively themselves. Governments don't write software. They
pay IBM, Amazon, and Lockheed to do it. If most vendors grow a conscience (or
fear consumer retribution), then no one writes it. Or it becomes prohibitively
expensive. Or it doesn't work as well (which isn't necessarily always good,
but is in this case, because it will reduce public support).

~~~
rurban
All governments are already using the best in class, from a German company,
Cognitec. Not much chance for Northrop or IBM to make an impact there.

------
DuskStar
This reads like "we're behind, not catching up, not making money, and really
need an excuse to drop this" to me, but I might just be too cynical.

~~~
coderintherye
I will say that three years ago, when we tested all the major vendors' image
recognition technology, they all failed spectacularly. And this wasn't some
small project: we ran tens of thousands of our images through (out of over a
million) and worked with several PhD students in our efforts.

A picture with a white college kid in the frame would get the output of
"human". Put an African borrower in the frame and you got at best a failure to
recognize a human, and at worst a reference to an animal.

I would hope the situation is much better now, but the bias (and just sheer
inaccuracy) of the tools was readily apparent and we gave up on image
recognition for the time being.

~~~
formercoder
There are real technical challenges here. Lighter skin has more contrast
between light and dark.

~~~
centimeter
White people also have higher facial variance in general. I vaguely remember
in university we had an assignment to generate “eigenfaces” or something and
if you partitioned the faces by race the output of the SVD would be much wider
for white people. This isn’t especially surprising when you consider the fact
that white people have more light/dark contrast, more hair colors, more eye
colors, etc. I think a lot of the “bias” complaints levied against algorithms
like this are not bias at all, but just humans who are unhappy the world
doesn’t quite live up to their idealized model.
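For reference, the "eigenfaces" assignment mentioned above is just principal
component analysis on flattened face images. A minimal sketch of the
computation (with random noise standing in for real face data, so the numbers
here illustrate the mechanics only, not any claim about variance across
groups):

```python
import numpy as np

# 20 fake "images" of 64 pixels each; real eigenfaces would use
# mean-centered rows of actual face photos.
rng = np.random.default_rng(0)
faces = rng.random((20, 64))
centered = faces - faces.mean(axis=0)

# SVD of the centered data matrix: rows of Vt are the eigenfaces,
# and the singular values measure how much variance each captures.
U, s, Vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = Vt[:5]                       # keep the top 5 components
variance_share = (s[:5] ** 2) / (s ** 2).sum()

print(eigenfaces.shape)                   # (5, 64)
```

The "width" of the decomposition the commenter recalls would correspond to
how the variance is spread across those singular values for a given subset of
the data.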

~~~
rockarage
When you sample from a smaller pool you will make uninformed statements like
this. Black/Dark people of the world are not limited to Black people in
Atlanta. I bet many people here do not know that there are Black people in
this world who have naturally blonde hair, and some who have blue eyes;
google it.

Moreover, capturing Black/Dark skin and features requires more accurate light
metering and lighting, because dark skin absorbs more light. There's a lot of
variance in cheekbones, noses, and lips.

Humans' features in general are more complicated than you realize.

~~~
throwaway391003
OK, but that can still be true (blonde hair, blue eyes) while there is still
much more variation in the white population than in the black population.

I'm curious how many white people there are on earth vs. how many black
people, and other races. A couple of Google searches didn't turn up any easy
answers.

~~~
claudeganon
That’s because “white” and “black” are loose, shifting, ideological
constructions with little basis in the scientific reality of human genetic
variation. Many “white people” weren’t considered “white” until fairly
recently and Africa actually has more human genetic variation than anywhere
else.

~~~
centimeter
> little basis in the scientific reality of human genetic variation

This meme dates back to a loose claim made by R. Lewontin back in the 70s. In
fact, you can very precisely and reliably recreate the "intuitive" human
racial categorization using unsupervised algorithms, like doing multi-
dimensional clustering over fixation indices. (It does not work using single-
dimensional clustering, which is what Lewontin was talking about.)

Modern biologists usually talk in terms of clines rather than races, but this
is just using the first derivative instead of the zeroth - you'll get the same
result either way.

> Africa actually has more human genetic variation than anywhere else.

SNP diversity has ~nothing to do with phenotypic variance.

~~~
claudeganon
Of course the question here is recreate whose “intuitive racial
categorization” because all of that is historically and culturally specific.
Saying it’s possible for a computer to recreate these categorizations presumes
that the categorization has some objective reality outside of this when
they’re just a variable heuristic determined by all those inputs.

~~~
centimeter
> all of that is historically and culturally specific

Not really - almost everyone can agree on "middle eastern/north african",
"east asian", "south asian", "black african", "white", etc. If you force
people to pick a single-digit number of major categories, they're probably
going to come up with the same categories that k-means in fixation space
would.

~~~
claudeganon
This is an evidence-free supposition, consistent with your pattern across this
thread of making broad claims without anything to support them. You’ve
provided no proof that k-means on a representative sample of phenotypic
variation in the groups you cite would return this result.

That almost everyone can agree on these categories is also contrary to
reality. For example, many of who you describe as East Asians consider
themselves racially distinct both within their societies and from their nearby
neighbors. Also, what major categories do mixed race people fall inside?

~~~
centimeter
It's not my job to provide detailed proof on every HN post I make; I'm just
pointing out something relevant, and if it interests you, you can go ahead and
find where people have already done this. I think I've been specific enough
that you can find this stuff on your own. This took me about 1 minute to find:
[https://www.discovermagazine.com/health/to-classify-humanity-is-not-that-hard](https://www.discovermagazine.com/health/to-classify-humanity-is-not-that-hard)

> many of who you describe as East Asians consider themselves racially
> distinct

That's why I specifically mentioned the _number_ of racial categories
involved. Obviously as the number increases you can have different clustering
results.

> Also, what major categories do mixed race people fall inside?

Obviously not into any of them, if we're talking about a simple mechanical
classifier with high separation.

~~~
claudeganon
The number of racial categories would itself be an arbitrary limit not
corresponding to actual genetic variance, nor would classification under such
limit capture said variance, and none of it would match up to the folk biology
of racial categorization. This is the general problem with reasoning backwards
from 19th century gobbledygook about human genetic variation instead of
beginning with the genetics themselves.

It may not be “your job” to provide such evidence, but you’ve made a series of
specific claims about things like the rate of phenotypic variance among
different racial groups. If you don’t want to defend them, that’s your
prerogative, but you also can’t expect them to be received as authoritative or
remain free of challenge.

------
zoomablemind
This headline reminded me of the recent "Chicago PD" episode called "False
Positive" (S7E6) [1]. A new ID system is pushed into a high-profile case by a
police chief. The system's merits are touted as being 'strongly condemned by
the ACLU'. Yet, still in beta..., people are shown on screen as just a
collection of dots. Virtual lines of code that affect real lives.

[1]:
[https://m.imdb.com/title/tt10691948/](https://m.imdb.com/title/tt10691948/)

~~~
seltzered_
Somewhat related, there was a recent frontline documentary that delved into
this:
[https://youtu.be/RVVfJVj5z8s?t=5000](https://youtu.be/RVVfJVj5z8s?t=5000)

------
thinkloop
> a product that is similarly just barely good enough to use.

I thought facial recognition was advanced (mainly based on genpop articles),
but isn't China using it massively with success?

~~~
reaperducer
_isn't China using it massively_

Yes

_with success?_

There's no way to know.

~~~
winrid
Based on a month in China, and having relatives there, I'd say very much yes
to the second question.

If they want, they can trace your whole day from start to end in a city.

------
P-NP
A striking contrast with what's happening in China, where facial recognition
software made SenseTime the world's most valuable AI unicorn: "However, facial
recognition does not seem to have been making the company much money, if any.
To be fair the technology is really in its infancy and there are few
applications where an enterprise vendor like IBM makes sense."

~~~
villahousut
I guess China has some lucrative applications for face recognition for which,
luckily, there isn't demand in Western countries.

------
PeterisP
I think the cat is out of the bag. There's enough public datasets and
published methodologies that are relatively simple to implement, that quite
usable facial recognition software is within the bounds of an undergraduate
homework project. Sure, IBM can probably make it more accurate, but
nonetheless, if somebody wants to make a tool that does e.g. ethnic profiling,
then they can do it without the help of IBM, the techniques for solving
similar vision tasks are known and people who can do it are widespread.
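To illustrate how little is left once face embeddings are available: the
identification step of a typical pipeline is plain nearest-neighbor search
under a distance threshold. A sketch with invented toy vectors, assuming the
embeddings come from some off-the-shelf model (the names, vectors, and
threshold here are all hypothetical):

```python
import numpy as np

def identify(query, gallery, threshold=0.6):
    """Return the best-matching name, or None if nothing is close enough.

    `gallery` maps names to embedding vectors; distances are Euclidean.
    """
    best_name, best_dist = None, float("inf")
    for name, emb in gallery.items():
        dist = float(np.linalg.norm(query - emb))
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Toy vectors standing in for real embeddings.
gallery = {"alice": np.array([1.0, 0.0]), "bob": np.array([0.0, 1.0])}
print(identify(np.array([0.9, 0.1]), gallery))   # close to alice
print(identify(np.array([5.0, 5.0]), gallery))   # matches no one
```

The hard part, training the embedding model, is exactly what the public
datasets and published methodologies cover.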

~~~
regularfry
The relevant difference between IBM and the undergrad is that the latter will
have a hard time pitching to governments.

~~~
PeterisP
Any interested government department where a manager wants this can hire a
random graduate to implement it for them; it could literally be a one-man
project with a trivial budget. There's no multimillion purchase pitch
required. A regional niche department can do it in their own kitchen, without
involving the rest of the government, using the spare change they have to
spend before the end of the fiscal year to avoid having next year's budget
reduced.

------
kgin
I understand how ml can replicate existing cultural bias in recommendation
systems or risk scoring systems, but how does bias work in the context of
facial recognition?

~~~
Barrin92
Technically, much lower accuracy on African-American and Asian populations
(in the US)[1]. More importantly, technical issues aside, the primary use
case of facial recognition seems to be the policing of minority or vulnerable
populations and the erosion of privacy more broadly, which tends to hit
minorities the worst.

[1] [https://www.wsj.com/articles/facial-recognition-software-suffers-from-racial-bias-u-s-study-finds-11576807304](https://www.wsj.com/articles/facial-recognition-software-suffers-from-racial-bias-u-s-study-finds-11576807304)

~~~
partiallypro
But isn't the bias mostly just because of the lack of data? Asian facial
recognition works very well (or very badly, depending on your perspective) in
China...where there is a ton of data on Asian facial structures, etc, for
example.

~~~
Traster
It's not just a problem of lack of data, it's a problem of composition of
data. If there's a signal that increases your accuracy for asian faces and
decreases it for caucasian faces, in systems deployed in China the system
weights will be adjusted one way, in systems deployed in Europe you'll get the
other way, _and_ Asians in Europe and Caucasians in China will get bad
performance. Maybe people don't care about that, but wait until that
performance difference is between black and white people in America.
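The group-wise gap described above is easy to surface once predictions are
tallied per group. A toy sketch (the function name, labels, and numbers are
all invented for illustration):

```python
from collections import defaultdict

def accuracy_by_group(y_true, y_pred, groups):
    """Per-group accuracy for a batch of predictions."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p, g in zip(y_true, y_pred, groups):
        total[g] += 1
        correct[g] += int(t == p)
    return {g: correct[g] / total[g] for g in total}

# Invented results: a model that does better on group "A" than "B".
y_true = ["x"] * 10
y_pred = ["x"] * 5 + ["y"] + ["x"] * 3 + ["y"]
groups = ["A"] * 6 + ["B"] * 4
print(accuracy_by_group(y_true, y_pred, groups))  # A: 5/6, B: 3/4
```

Reporting only the aggregate accuracy (8/10 here) is exactly what hides the
composition effect the comment describes.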

------
unnouinceput
Quote: "However, facial recognition does not seem to have been making the
company much money, if any."

Right there is the real reason why. The rest of IBM's blah blah is just dust
in our eyes.

~~~
refurb
Never let a crisis go to waste. Why just cancel a program that's losing money
when you can cancel the program and tell everyone it's because of your new
found belief in social justice.

~~~
smitty1110
"A crisis is a terrible thing to waste." - Paul Romer

I dust this quote off far too often for my liking, but it fits.

------
kohtatsu
Honestly if I were a POC the last thing I would want is accurate facial
recognition.

FaceID will work regardless thanks to depth maps.

I can't see any other use of the technology benefiting me.

------
CivBase
What is he referring to when he mentions "racial profiling" in the context of
facial recognition? Is it just a disparity in training data?

~~~
casefields
Hasn't there been a bunch of goof ups with it having trouble with black faces?

[https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/](https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/)

Google Photos tagged two African-Americans as gorillas:
[https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/#3c90a334713d](https://www.forbes.com/sites/mzhang/2015/07/01/google-photos-tags-two-african-americans-as-gorillas-through-facial-recognition-software/#3c90a334713d)

3 years later they still hadn't solved it:
[https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai](https://www.theverge.com/2018/1/12/16882408/google-racist-gorillas-photo-recognition-algorithm-ai)

------
ngcc_hk
The issue is real. But with one not-so-successful vendor out, what does it
mean?

Also, with China in the picture, as with many technologies, could you really
expect China to give it up as well? Wasn't it around 2016 that a joint US
coronavirus study stopped due to concerns, but not the Chinese side (the
Wuhan lab), giving them a lead? Or the human genetic HIV research on babies?

There's no good solution, but I think simply quitting is not the answer for a
potentially human-rights-related technology research area.

------
MintelIE
Only AFTER they helped China develop the technology to racially ID their
Muslims. And IBM isn't alone, many companies have helped China to efficiently
fill its death camps.

~~~
justusw
Do you have any sources on that?

------
tootahe45
IBM failed to compete.

------
bparsons
Give IBM a break. It isn't as if it would have had some experience in the past
suggesting that its technology could be used for monstrous ends.

------
ngcc_hk
This might be a strange argument, but look at
[https://news.ycombinator.com/item?id=23459963](https://news.ycombinator.com/item?id=23459963)

The key is to have open data and not let China hold the world's data while
closing off its own technology and data. Shutting down an area and letting
China lead is not the human-rights answer. You need to force them to join the
world in a meaningful way. We cannot study photos and the like the way we
studied Soviet Union politics.

It's just too dangerous to step away and let China win. Go all in, and ensure
the technology is used in an open and censurable manner.

------
microdrum
They're just behind. And the ACLU "bias" study was thin and unscientific. Data
sets and weights could have bias, but that bias can also be controlled.
"Facial recognition" does not have bias.

