
Apple Face-Recognition Blamed by New York Teen for False Arrest - computator
https://www.bloomberg.com/news/articles/2019-04-22/apple-face-recognition-blamed-by-new-york-teen-for-false-arrest
======
gregoriol
Some people here are saying that this is bad because the victim is asking for
1bn, that it's a publicity stunt.

Well, it had better be a publicity stunt! You can't change these stupid facial
recognition practices without such bad publicity. I hope they make as much
publicity as possible, and maybe some money (but he won't get 1bn, that's not
the point).

~~~
saagarjha
> he won't get 1bn, that's not the point

That was _my_ point, though. He’s abusing the legal system by asking for
compensation he knows he will not receive to gain publicity. The issue
involved is fine but I’m not a fan of exploiting and clogging up the legal
system in this way. If this is “what it takes” I think we have a bigger
problem we need to solve.

~~~
nullc
How big does the penalty need to be for it to actually make avoiding it in the
future a priority for Apple?

Punitive damages need to scale with the size of the defendant or they're
meaningless.

~~~
saagarjha
As far as I can tell, this is strictly supposed to be compensation for
hardship. The size of the entity that is causing the hardship doesn’t affect
the amount of suffering. So why should the payout be different? (Otherwise I’d
just seek to be wronged by the organization with the biggest pockets?)

~~~
kuzehanka
If we keep letting a corporate entity get away with undesirable practices by
merely slapping them on the wrist, we are actively signalling to them that
they should continue to engage in those practices.

The only way to stop a corporation from doing something undesirable is to make
the punishment great enough that the undesirable act is a net financial loss
to the corporation.

I would personally fully support massive % of yearly revenue fines for
anything to do with privacy.

~~~
saagarjha
Work on making it illegal, then, so that corporations can be properly fined
instead of random citizens having to justify billions in damages.

~~~
salthound
I don't understand this. Work on making what illegal? Facial recognition? Or
breach of duty? Or making mistakes?

It seems to me that we already have the proper way of dealing with this
situation. Have the company pay punitive/exemplary damages.

~~~
saagarjha
Actually, that brings up a good point: what exactly should the remedy in this
case be? I don’t think it’s clear that Apple is even at fault here. It’s a
pretty shaky case and I’d place more blame on the officers who arrested him
without double-checking than an in-store camera.

------
astatine
I know this view is not going to go down well with the HN crowd, but consider
this: The name used by the actual thief was used to track down Bah and arrest
him. The normal course from there would be a long haul process to prove he was
innocent. But he was _exonerated_ by the presence of facial recognition which
showed that the person in the store was not Bah. The facial recognition here
seems to have done neither more nor less than a regular CCTV camera in a
store. If anything the presence of the cameras helped free him!

~~~
thudrs
This article was posted below:
[https://www.insurancejournal.com/news/national/2019/04/23/52...](https://www.insurancejournal.com/news/national/2019/04/23/524414.htm)

The facial recognition prevented Bah from recovering his account. He was
accused, and I believe arrested, multiple times, despite being proven innocent
time and time again.

If I understood this article correctly, the thief did not always sign in/show
an ID. Sometimes the facial recognition just tied the man to the stolen ID.
The CCTV did help prove his innocence, when it wasn't mysteriously missing,
but the facial recognition was in fact a problem.

------
zxcvbn4038
He should ask for a billion dollars - the article says that Apple’s mistake
resulted in his arrest, and even though it was a mistake and he was not
convicted, I don’t think that arrest is ever going to be removed. He will
forever be someone who was arrested and it will show up if someone does a
background check. The police might pull him over for a minor traffic
violation, the computer will pull up that the occupant has been arrested for
theft in the past, suddenly it's a whole different conversation because the cop
is looking for an easy arrest.

~~~
Veen
If Apple had circulated a photocopy of the possible perpetrator to its stores,
to be shown to staff and taped beside the POS, and a member of the staff
recognized him and called the police, would that be a billion-dollar mistake
too?

I think the facial recognition is a red herring. What matters here is that the
perpetrator stole someone's ID and used it during the crime. When the owner of
the ID entered the store, he became a suspect and was arrested. Shitty for him
but perfectly understandable.

~~~
Jolter
He never entered an apple store. He was arrested at home due to Apple giving
his name to the police.

I know it's a nitpick but you come off as not having read/understood the
linked article.

~~~
Veen
The linked article says nothing about where he was arrested.

(The NY Post article that is linked to in the linked article mentions that he
received a summons in the mail and was later arrested, but it doesn't say
where he was arrested. I concede that it wasn't in the Apple store, but there
is no indication of that in the original article.)

~~~
Ajedi32
The linked article clearly says:

> Ousmane Bah, 18, said he was arrested at his home in New York in November
> and charged with stealing from an Apple store

Don't know how you missed that, unless something got edited.

~~~
Veen
I’m confused. Has the linked article been changed from this Engadget article
to the current Bloomberg article, or am I suffering from some sort of brain
fart?

[https://www.engadget.com/2019/04/23/apple-facial-recognition...](https://www.engadget.com/2019/04/23/apple-facial-recognition-false-arrest-lawsuit/)

EDIT: According to this comment, they have switched the link. It’s a bit
misleading to change the link once a substantial conversation has already
happened based on the original link.

[https://news.ycombinator.com/item?id=19726631](https://news.ycombinator.com/item?id=19726631)

~~~
Ajedi32
Yes, mods can edit posts after the fact. Unfortunately there's no edit history
on HN so it's difficult to tell what happened unless someone says something.

------
MarHoff
What's bothering me the most is that he claims facial recognition is involved
while no evidence at all is provided in the body of the article.

Looks like plain old identity theft with the ID card he lost... As long as
they recognized his innocence after "humanly comparing" his face in a police
station with video footage, what would actually be worth $1bn???

A few years ago a coworker of mine was often contacted by police because his
license plate was used on a matching stolen car... It was embarrassing, but he
didn't sue the carmaker as far as I know...

------
sagitariusrex
It's disappointing to see that the 1 billion figure is what really grinds
people's gears in here. That's the thing you should be the least concerned
about.

edit: typo

~~~
TeMPOraL
It's probably because it's the most objectionable part of this, so it's the
main target for nitpicking. The "Apple's responsible for a fuckup of their own
product" isn't controversial, and I don't think "fuckups of ML used for law
enforcement are a serious thing" is controversial either.

------
emiliobumachar
Let me echo onion2k's point:

That kid can no longer truthfully answer "no" to "Have you ever been
arrested?". This alone can change his life completely.

~~~
kowdermeister
Is that bad in the US? Nobody cares if it was because of a mistake?

~~~
doorbellguy
Yes, it changes a lot of dynamics including his future employment.

~~~
OJFord
If you're arrested, but not charged, and have no criminal record?

~~~
justin66
It's complicated in the United States; federal law (FCRA and employment law)
and state law are both involved.

Since this teenager is in New York, this arrest record (which will not be
publicly available forever, and could be sealed immediately by court order)
should not be a factor in their future interviewing in that state.

[https://www.nolo.com/legal-encyclopedia/new-york-law-employe...](https://www.nolo.com/legal-encyclopedia/new-york-law-employer-use-arrest-conviction-records.html)

 _Employers may not ask about or consider arrests or charges that did not
result in conviction, unless they are currently pending, when making hiring
decisions. They also may not ask about or consider records that have been
sealed or youthful offender adjudications._

------
nerdbeere
I get some strange kill decision[1] vibes here.

So they fed a machine flawed data, got a result, and the police acted on that
without reviewing the "evidence" beforehand?

[1] [http://daniel-suarez.com/killdecisionsynopsis.html](http://daniel-suarez.com/killdecisionsynopsis.html)

~~~
Shivetya
Well I blame Apple for this. Why? Because they obviously had video of the real
thief, and there is no excuse for not comparing the two before notifying the
police, let alone for the facial recognition system not immediately
recognizing that they aren't the same person visually.

If anything, I want to know: does the store record anything outside its
physical interior? What is the retention period for this data? What process
and safeguards exist to prevent a situation like this one? Who makes the final
decision? Are your images tagged each time you use your ID for any engagement
within the store? Heck, do they even post notices that they do this?

For a company that touts privacy, they really blew it, and in a spectacular
way.

~~~
nerdbeere
> Well I blame Apple for this.

Sure, they made mistakes there. But I also blame the police here. They
shouldn't blindly act on accusations just because they came from a big
corporation.

> For a company that touts privacy they really blew it and in a spectacular
> way

Exactly! It's a little sad to see that they blew it like this. With their
focus on privacy they really have something that separates them from Google.
I'm curious how they will react to this.

------
trevyn
Possibly a better article, including a reference to the actual filing:
[https://www.bloomberg.com/news/articles/2019-04-22/apple-fac...](https://www.bloomberg.com/news/articles/2019-04-22/apple-face-recognition-blamed-by-new-york-teen-for-false-arrest)

~~~
jfk13
From this account, it's hard to see how Apple is at fault. Someone apparently
used his lost (non-photo) ID to identify themselves in an Apple store, with
the result that Apple associated his name with theft. They reported this to
the police. Is that so unreasonable? It seems like a valid case for further
investigation.

When the police showed up at his home with an "arrest warrant [that] included
a photo that didn’t resemble Bah," they should have immediately figured out
something was wrong: perhaps they had the wrong guy. As such, they probably
shouldn't have followed through with the arrest at all, pending further
investigation, unless they had reason to believe the individual, despite not
matching the warrant, presented an immediate threat. Which sounds unlikely.

So ISTM that the parties at fault here are (a) whoever fraudulently used his
lost ID, thereby falsely associating his identity with criminal activity; and
(b) the arresting police who did not pay attention to whether the subject
being arrested matched the warrant they had in hand.

But while I am still deeply uncomfortable with the widespread deployment of
facial recognition, etc., and would like to see it restrained, I don't see
that it's really the problem in this case as reported.

~~~
zimpenfish
> they should have immediately figured out something was wrong

Even if the warrant photo had looked sufficiently like him to confuse people,
it should have been a quick conversation about whether he'd lost any ID, what
kind, where, and then where he was on certain dates to clarify that he wasn't
the thief.

Do US police have to arrest someone to question them?

~~~
sigzero
No, they don't. They could have questioned him at his house. He ignored a
summons though, so that probably didn't work in his favor.

------
thudrs
So the thief walks into the store with the stolen ID. If facial recognition
were working correctly, this identity theft should have been recognized
immediately, unless the actual guy had never been in an Apple store.

The thief steals a bunch of things, and the software ties the thief's face to
the stolen ID. Without facial recognition, they still would have tied the ID
to the theft.

Apple sends the police the name and address on the ID, as well as maybe an
image of the thief. The police arrest the man matching the ID without
reviewing the footage.

This seems like the police's fault for not following through with a proper
investigation. If anything, the footage from Apple saved this teenager. He
still has an arrest on record, and as others mentioned that could ruin his
life. But he needs to take that up with the police, not Apple.

The only way I see this involving facial recognition is if the thief came in
to case the place, used the stolen ID, then came back later without showing
the ID and stole a bunch of things. Or if they have software that flags
thieves, even when employees don't see them.

Now, all that being said, there are questions to be raised about the
surveillance brought by Apple. Do they notify their customers that they are
being recorded? How long do they store this data? Are they analyzing your
behavior (frequency of visits, suspicious activity, where you walk in the
store, etc.)? Is their facial recognition biased/inaccurate?

Apple prides itself on being privacy focused, but it is clearly infringing on
its customers' privacy.

------
kbutler
Better article:

[https://www.insurancejournal.com/news/national/2019/04/23/52...](https://www.insurancejournal.com/news/national/2019/04/23/524414.htm)

My conclusion: The face-recognition functioned completely correctly - it
associated multiple acts by the same individual - but Apple incorrectly used a
non-photo ID as identification of that individual, and thus incorrectly
identified that individual as Bah.

Sequence of events summary:

Thief used Bah's stolen, non-photo ID.

This led to Bah being arraigned for theft.

Apple then identified video of the thief in multiple thefts at other stores (a
detective asserted the matching was done by face-recognition).

This led to Bah being arrested.

Police then realized that Bah did not resemble thief, he was released, and
various charges have been dropped.

------
ourcat
The most chilling thing about this is that any stores even have facial-
recognition systems in place.

If I'm going to walk into any physical store with that technology inside, I
want to know that BEFORE walking in.

~~~
duxup
Apple says they don't have facial recognition in their stores.

I don't know how true that is, but I think it is worth mentioning.

~~~
sigzero
I think there's a lot of assumption here just because the article says so. I
do not believe they have facial recognition in their stores.

~~~
duxup
Anything is possible, but I also sort of would be surprised if Apple would
have deployed facial recognition.

------
KenanSulayman
... but where does the $1 billion damage come from?

~~~
zimpenfish
One assumes "to make sure it makes it into the news to try and force/shame
Apple into a quick settlement rather than taking it to court where it'll
probably get tossed."

------
pts_
>Boston theft, where $1,200 worth of goods were stolen,

So like 1 Apple phone.

------
cwingrav
Cameras can be used ethically in public and semi-public areas. While I don't
know if we have official/specific guidelines in the US, I'm sure with a little
effort we could create some that would allow for their very effective use
without overreaching, violating personal rights, or leading to a police state.

In this case, it seems a mix-up occurred and a little police work easily led
to clearing the person's name of the charge. Looks like the system worked (but
it was very inconvenient for the accused). The camera is just a single piece
of evidence, not the entire case. So long as courts realize this, it should
actually help deter crime AND make prosecution more accurate in catching the
correct perpetrator. There is a long history of just finding someone to blame.
I hope cameras make catching the correct person easier, to avoid this type of
policing failure.

~~~
maxheadroom
> _...it seems a mix up occurred..._

The "mix-up" occurred because there was an ID with no picture on it. So, Apple
_assumed_ it was him - with no corroborating evidence to that. They then
flagged their AI to look for him, not comparing it to pictures/video of the
actual perpetrator, and called the police when he came in.

> _...and a little police work easily led to clearing the person's name of
> the charge_

True, but there are a couple of problems with this:

1) He can never answer "no" on a questionnaire asking whether or not he's been
arrested before. Depending on how strict employers are, this could prevent him
from even being considered for employment.

2) There's a real possibility he lost his car, job, place to live, etc. while
waiting to be cleared. We don't have a time-frame for this, so it's anyone's
guess, but it's a safe bet that his employment, alone, wouldn't have been safe
(e.g.: no call, no show).

> _I hope cameras make catching the correct person easier, to avoid this type
> of policing failures._

The policing failure, as you refer to it, was actually a failure on Apple's
part and not the part of the police.

~~~
Jolter
> They then flagged their AI to look for him, not comparing it to
> pictures/video of the actual perpetrator, and called the police when he came
> in.

Nope, that's not what happened. The article is poorly written, but read it and
it might make sense.

" However, since the ID didn't have a photo, the lawsuit states Apple
programmed its stores' face recognition system to associate the real thief's
face with Bah's details"

In other words, the only thing Apple used facial ID for was to link the
thief's face to thefts where the thief didn't show ID. That allowed them to
report the wrong person for multiple thefts (presumably instead of just one).
(Yes it's terribly unclear from the article.) But it's fairly clear the
wrongly accused was not arrested at the store, because the face recognition
was primed for the real perp's face.

------
duxup
>the lawsuit claims Apple programmed its stores' face recognition system to
associate the real thief's face with Bah's details. In a statement to
Engadget, an Apple spokesperson said the company does not use facial
recognition in its stores.

So anyone know how likely it is if that thing happened or not?

------
monkin
Facial recognition systems are everywhere; even in Poland small shops
(FreshMarket, etc.) have them built into cash registers.

Is this something that concerns me? Absolutely not, and I can even say I don't
give a fuck. Most government institutions already have your photo, signature,
and much more from your ID/passport or any other government documents you've
made. So why should you be afraid of companies who do it too? I know, I know,
privacy, ads, tracking and stuff. :)

But is this still a fight for privacy? I see many startups use this rhetoric
to acquire clients. But it turns into a terrible joke, like it was with
usability 15 years ago, where companies offered services that weren't needed
in order to collect even more money.

------
lawlessone
>the lawsuit states Apple programmed its stores' face recognition system to
associate the real thief's face with Bah's details.

What moron thought that was a good idea?

~~~
zimpenfish
What else could it do? The thief presented Bah's details. What details should
it have associated with his face?

~~~
ryanlol
Maybe they could’ve looked at the face on the ID?

~~~
zimpenfish
The articles specifically point out that it was non-photo ID presented by the
thief.

~~~
ryanlol
What on earth is a non-photo ID? Like... a credit card?

------
mcculley
Apple has no power to arrest anybody. The local police should be sued. It was
the local police who chose to use Apple's bad evidence.

------
throw03172019
> ...where $1,200 worth of goods were stolen, took place.

Did they only steal ONE iPhone X?

~~~
viraptor
Or 12 USB adapters.

------
diNgUrAndI
Shouldn't Apple put up warning signs at their store to say that you are under
surveillance cameras?

------
rdruxn
I don't understand what role the facial recognition played in his arrest. The
article says the actual thief used his name and other identifying info - it
makes no claim about how the facial recognition software was utilized to make
the arrest...

------
OrgNet
Apple is pulling the swatting prank now? Don't they know that people can get
killed by trigger happy cops?

------
JoeAltmaier
And I'll sue the grocery store bagger for breaking my eggs. Does $2B sound
about right?

------
saagarjha
> the lawsuit argues that Apple's "use of facial recognition software in its
> stores to track individuals suspected of theft is the type of Orwellian
> surveillance that consumers fear, particularly as it can be assumed that the
> majority of consumers are not aware that their faces are secretly being
> analyzed."

This, coupled with the $1 billion being asked, makes this seem like a
publicity stunt :/

~~~
onion2k
Calling this suit a publicity stunt trivializes it and makes it sound like
there's no value beyond getting some publicity. I don't think that's true.
This is using the legal system for privacy activism in a positive way. Making
corporations understand that there's potentially a _very_ real cost to acting
on false positives could be a good thing that actually moves the needle on our
right to privacy.

Just imagine if stores knew a false accusation based on nothing more than
facial recognition could cost them millions. They'd have to put in more effort
than "The computer said you're a criminal". That would be great.

~~~
ezoe
True. Apple's false accusation resulted in a wrongful arrest. Apple is
responsible for it.

~~~
zimpenfish
> Apple's false accusation resulted in a wrongful arrest.

But it was only false in hindsight - someone came in, used his ID, and their
face was associated with it - that's perfectly sensible and normal behaviour
from their systems.

> The arrest warrant included a photo that didn’t resemble Bah

That part is definitely on the police, not Apple.

~~~
logfromblammo
It is not perfectly sensible and normal behavior to demand ID from customers
and then set up a secret tracking profile, based on facial recognition and
linked to that ID, that is apparently linked across all the stores for your
company.

The buried lede is that everyone who has ever walked into any Apple store is
being tracked by Apple in _all_ Apple stores, and possibly also anywhere else
in public. The false arrest allowed that info to leak to the public.

~~~
zimpenfish
> everyone who has ever walked into any Apple store is being tracked by Apple
> in all Apple stores

That hasn't been confirmed yet, has it? I might have missed some extra info
but the original story only mentioned they were tracking that ID because of
the thefts.

> based on facial recognition

Has that been confirmed yet? I may also have missed that but again the
original articles only had that as a claim from the man suing them, nothing
concrete (and I believe Apple also denied it?)

~~~
logfromblammo
Apple can keep all the surveillance footage from all their stores.

Apple can run FaceID on any image they have.

In this instance, Apple must have run FaceID on images from the recordings
that show the thefts. It is not necessary that they did this on-site or in
real time.

A statement by Apple claims that they do not run facial recognition software
in their stores. But the wording was very specific; they did not deny
uploading the in-store video to an offsite server to detect faces and identify
them. They certainly did it for one person. They could do it for more, limited
only by their computing power. And they are a computing hardware company.

~~~
zimpenfish
> In this instance, Apple must have run FaceID on images from the recordings
> that show the thefts.

But again - we have zero proof of that.

> They certainly did it for one person.

They might well have done but nothing in the reporting of this story has any
kind of evidence that this happened!

------
otterley
Source article: [https://www.bloomberg.com/news/articles/2019-04-22/apple-fac...](https://www.bloomberg.com/news/articles/2019-04-22/apple-face-recognition-blamed-by-new-york-teen-for-false-arrest)

------
jfk13
I'd be more likely to look at this article if it wasn't hidden behind the
byzantine and user-hostile Oath "GDPR" gateway.

~~~
r3bl
Then maybe you should check a source that's not owned by Verizon (that also
owns Oath)?

I'm pretty sure that there are other sources posted in this thread.

------
fit2rule
The $1 billion figure is clearly designed to draw attention to this case.
Would it have made it onto HN if he'd been suing for $1,000?

And, rightly so, in my opinion. This is the tip of a totalitarian iceberg -
would any of us be content with being declared criminal by a computer?

