
Facewatch 'thief recognition' CCTV on trial in UK stores - SimplyUseless
http://www.bbc.co.uk/news/technology-35111363
======
GordonS
> We later discover my thick-rimmed glasses are the problem. It's something
> easily fixed by adjusting the system settings, Simon tells me.

Facial recognition is a hard problem; allowing for thick-rimmed glasses is
_not_ 'easily' fixed.

~~~
delinka
Well, you can't admit that to a reporter/the public...

------
falcolas
So, how would a reformed thief go in to buy basic goods once this becomes more
widespread?

Do we no longer believe that someone can pay off their debt to society and
become a useful member of said society?

~~~
TeMPOraL
We're talking here about thieves who haven't been caught by the police.

Maybe the way to handle it could be that if you're a thief and have a change
of heart, you report to the local police station and agree to serve your time
in exchange for getting your face removed from the database.

~~~
falcolas
The database only has convicted thieves in their Database, so in theory
they've already served their time for their crime. The problem is that they
are then put on an apparently un-auditable list maintained by a private
company, and discriminated against.

~~~
DanBC
> The database only has convicted thieves in their Database,

From this article:

> and other potential offenders

And from this article about a different system:

> Police forces are already using face-recognition to identify people, much as
> they might with DNA or fingerprints.

> A national police database contains up to 18 million "custody suite" photos
> - hundreds of thousands could be of innocent people.

Many of those people will be those arrested for an offence, but not cautioned
or tried. (To be clear here: they really didn't do the crime, not just they
weren't convicted of a crime).

~~~
falcolas
Thanks for the clarification. Not a point in favor of the system, though.

------
pavel_lishin
I wonder if the CV Dazzle[1] style will take off in the UK. Maybe we'll
finally get the exact cyberpunk future promised us by Gibson, etc.

[1] [https://cvdazzle.com/](https://cvdazzle.com/)

~~~
bsenftner
No need to go that far: every system I'm aware of fails when there are no
eyebrows. (I'm in the FR industry.)

~~~
fit2rule
Your eyebrows must be [<-------- this -------->] long to enter this
store/view this content.

------
davnicwil
I wonder if places using systems such as this will have to declare it plainly?

If they do, it seems like there might be some backlash, with people preferring
not to go to businesses that start down the path of using this sort of tech,
and spending more time in those which could use it, but choose not to.

Assuming of course it doesn't end up just completely ubiquitous like web ads
etc, the market _could_ force tech like this into the high-risk margins, i.e.
high-end jewellery shops etc, rather than it eventually popping up in every
pub and high-street shop.

I will say though, having written that, I do think it's likelier it'll just
end up ubiquitous.

~~~
cjrp
Some pubs/clubs already require things like fingerprints to be taken before
going in, but it doesn't seem to stop people going there. If anything,
something which is less effort for the customer (like the Facewatch system)
would probably be preferred.

~~~
davnicwil
It won't stop everyone; some people don't care, or might be socially pressured
into just going in anyway with a group, etc. But for customers who do care, it
could actually work in the opposite way too.

If a club wants to take your fingerprints, it's usually after you've already
entered the place, having possibly queued, at a point where it's very annoying
and socially awkward if in a group to about-turn and say "you know what, no,
let's go elsewhere".

If on the other hand there's a sign outside declaring "we use Facewatch here"
then it's actually much easier for customers for whom that represents a
problem to just continue walking past, and not be drawn in in the first place.

It may be that simply not enough people care about having their biometric data
read and stored in databases etc, or that it'll be so ubiquitous that there's
no alternative but to go along with it. But like I say, it's possible, if
either of these things isn't true, that the market will suppress technologies
like these becoming 'just the way things are'.

Again though, I honestly don't think that is likely looking at what's happened
with web tracking, CCTV and the like. It's possible this, however, is a bit
more 'in your face' (pun intended) and will spur a greater negative reaction
in people.

------
agd
Widespread facial recognition is going to add a whole new slew of data for
GCHQ/NSA to data mine. Not to mention advertisers tracking exactly where you
walk in shopping malls and shops.

From my understanding, UK law requires that you be informed if subject to
facial recognition, but, analogous to online tracking/advertising, does not
require your permission to begin it.

Sigh, _gets out my thick-rimmed glasses_

~~~
Someone1234
> Widespread facial recognition is going to add a whole new slew of data for
> GCHQ/NSA to data mine.

Less than you think.

We're all walking around with beacons in our pockets already. They already
have the ability to "follow" a person to within a square mile, and that
individual is far more knowable (i.e. fewer false positives) than with facial
recognition.

Between IMEI spotting and license plate readers surrounding central London,
they have a real-time SimCity-esque map of the city. The facial recognition
stuff might be the cherry on top, but it is hardly going to be a major gain.

PS - And as far as I know they already have facial recognition cameras at most
major tube stations.

------
a3n
What happens to a face after it's in a database, or multiple databases, and
its owner gets arrested, spends time in jail, and then goes straight? Will it
be as impossible to get off these databases as it is to get off the no-fly
list?

~~~
bsenftner
And that has nothing to do with FR and its technology. The contents of an FR
database, and getting items removed from it, are purely human politics.

~~~
a3n
This technology, deployed, is without a doubt a political issue. Divorcing
what we work on from how it's used is irresponsible. And I'm not saying don't
develop technology, but we have to recognize and consider how it fits into
society. We have to discuss.

The Stasi recruiting human spies from among the general population was both a
logistical question and a political issue. The NSA weakening encryption, or
parking on backbones or datacenters, is a technology and a political issue.
The political urge to inflict both kinds of spying is just that, a political
one. As technology improves, it makes the job of spies and other criminals
that much easier, increasing the spy's reach, and changing the character of
society.

The technology is fascinating, but it's not divorced from society. We're all
swimming in the same soup.

------
DannoHung
I look forward to privacy conscious individuals taking steps:
[https://cvdazzle.com/](https://cvdazzle.com/)

------
fredley
This does not scale. It might work for a few people in a population of
thousands, but beyond that false positives are going to be a real problem.

~~~
bsenftner
I work in facial recognition. When you say it does not scale, how so? I have a
system with 975K people in it, the US registered sex offenders database, and
with a single quad core laptop a lookup takes about 8 seconds. With a server
class 32+ core system the lookup is nearly real time. How does that not scale?
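The kind of lookup described above is typically a single similarity search
over a gallery of fixed-length face embeddings. A minimal sketch (NumPy; the
embedding dimension, gallery size, and random vectors are illustrative
stand-ins for a real enrolment pipeline, not the actual system):

```python
import numpy as np

def build_gallery(n_people=975_000, dim=128, seed=0):
    """Illustrative gallery: one L2-normalised embedding per enrolled face.
    (Random vectors stand in for a real face-embedding model.)"""
    rng = np.random.default_rng(seed)
    g = rng.normal(size=(n_people, dim)).astype(np.float32)
    return g / np.linalg.norm(g, axis=1, keepdims=True)

def lookup(gallery, probe, top_k=10):
    """Brute-force cosine-similarity search: one matrix-vector product,
    then a partial sort to rank the k best matches."""
    scores = gallery @ probe                    # similarity to every face
    idx = np.argpartition(-scores, top_k)[:top_k]
    idx = idx[np.argsort(-scores[idx])]         # order the top-k best-first
    return list(zip(idx.tolist(), scores[idx].tolist()))

gallery = build_gallery(n_people=50_000)  # scaled down; real system ~975K
probe = gallery[42]                       # query with an enrolled face
matches = lookup(gallery, probe)
print(matches[0][0])  # 42: the enrolled identity is the closest match
```

Since the whole lookup is one dense matrix-vector product, it parallelises
trivially across cores, which is consistent with the laptop-vs-server timings
quoted above.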

~~~
ultramancool
I assume he meant accuracy at scale - if you have a large population, how does
the accuracy hold up? Do you wind up with many close samples, or are things
pretty good?

~~~
bsenftner
Pretty much all FR systems generate a list of matches, ranked from closest
match on down. The size of the list is configurable. It is also industry
standard NOT to use the list as an authority. Given a list of high matches
(greater than 90%) a human can quickly filter out obvious false positives, and
then the remaining are retained for further consideration. It is also industry
standard NOT to rely on FR alone; combining FR with other measures reduces the
chance of a false positive.

The issue I find with FR is people expecting it to be some super technology,
gleaning significant information from a dust speck. It's not like that. And
the media is playing it up with unrealistic descriptions. Remember the
original DOOM and its quality of graphics? That is where FR is now. We've got
a ways to go before the journalist hype is close to reality. And the mature
technology will not be FR, but a comprehensive multiple-biometric measuring
system capturing far more than someone's face.
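The ranked-candidate-list workflow described above can be sketched as follows;
the 0.90 cutoff mirrors the "greater than 90%" figure, and the names and
scores are made up:

```python
def candidate_list(scored_matches, threshold=0.90, max_size=5):
    """Return candidates for human review: matches at or above the
    threshold, ranked best-first, truncated to a configurable size.
    The list is a filter for a human reviewer, not an authority."""
    ranked = sorted(scored_matches, key=lambda m: m[1], reverse=True)
    return [(name, s) for name, s in ranked if s >= threshold][:max_size]

scored = [("A", 0.97), ("B", 0.61), ("C", 0.93), ("D", 0.88)]
print(candidate_list(scored))  # [('A', 0.97), ('C', 0.93)]
```

Everything below the threshold never reaches the human reviewer, which is why
the list size and cutoff are the knobs that trade missed matches against
review workload.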

~~~
Retric
So, if you're looking for 1% of a total population, your false positive rate
before human intervention is 99%. That's not useful for automated systems.

Further, identical twins are going to produce matches that would fool human
verification, making this questionable for many tasks even with human
supervision.

PS: Almost 1% of the population has an identical twin. (% of births is lower,
but you get 2 twins.)

~~~
bsenftner
There is only one true match, so the system will always be generating false
positives. This technology is not an authority, but a filter. Yes, identical
twins will both be identified if they are both in the system and no additional
biometric measures are included. Identical twins still have different retinas,
and due to lifestyle differences identical twins beyond age 30 can be told
apart fairly easily.

------
rmc
I wonder what the data protection implications of this are? I presume there
are many. You're storing personal information on people, possibly transferring
it to other businesses, analysing it and using it to make decisions...

------
pdkl95

        <span style="color: #FF2400">A</span>

