

Homeland Security moves forward with 'pre-crime' detection - Suraj-Sun
http://news.cnet.com/8301-31921_3-20117058-281/homeland-security-moves-forward-with-pre-crime-detection/

======
DanI-S
It's nice to see America's proud tradition of racial profiling being brought
forward into the 21st Century. Go, science.

The deployment of this technology should not be acceptable to anybody who
believes in our right to live in a free society of equals.

~~~
dkokelley
How is this racial profiling? I accept that it _is_ profiling. But where and
how does race come into play?

~~~
DanI-S
Direct quote from the article:

But where "Minority Report" author Philip K. Dick enlisted psychics to predict
crimes, DHS is betting on algorithms: it's building a "prototype screening
facility" that it hopes will use factors such as ethnicity, gender, breathing,
and heart rate to "detect cues indicative of mal-intent."

~~~
dkokelley
I missed that part on my first read. Rereading it makes me think that
ethnicity would be used along with age, gender, height, and weight to find a
baseline for the 'indicative' factors like heart rate. I agree that once any
genetic factors are used as indicative factors, we have crossed a dangerous
line. I sincerely hope that this phrase in the article was simply worded
poorly.

~~~
StavrosK
It's much more likely that these inputs will be used as features to train some
ML algorithm than that they will literally be used to say "if black then
detain".
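
In code terms, that's roughly the sketch below. To be clear, everything in it
- the features, the data, and the choice of logistic regression - is my own
invention for illustration; FAST's internals aren't public.

    # Minimal sketch of feature-based screening, assuming logistic
    # regression and invented features; this is NOT how FAST actually
    # works, just an illustration of "inputs as features".
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical per-passenger features:
    # [heart_rate, breathing_rate, blink_rate, pupil_variation]
    X = np.array([
        [72, 14, 15, 0.2],
        [95, 22, 28, 0.6],
        [68, 12, 12, 0.1],
        [110, 25, 31, 0.8],
    ])
    # Invented labels: 1 = secondary search found something, 0 = it didn't.
    y = np.array([0, 1, 0, 1])

    model = LogisticRegression().fit(X, y)

    # The output is a probability per person, not an "if X then detain"
    # rule; each feature just gets a learned weight, and a weight near
    # zero means the model has effectively disregarded that feature.
    print(model.predict_proba([[80, 16, 18, 0.3]])[0, 1])
    print(model.coef_)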

~~~
JoshTriplett
The limitations of machine learning aside, this could actually work quite well
if trained properly, because it wouldn't raise the questions about bias that
human choices do. Given sufficient information, a machine-learning algorithm
can make decisions based on factors that humans would refuse to consider or
would intentionally err on the side of _not_ considering.

~~~
StavrosK
Yes, of course. It could also disregard features completely if it has found
that they make no statistical difference.

~~~
JoshTriplett
Exactly. Most importantly, it can focus exclusively on statistics, rather than
guesses.

~~~
woodson
One problem I see with this is that these statistics involve a lot of human
bias. If you train your classifier on crime statistics, it's not based on
'ground truth' (which is never attainable in a criminal case) but on an
expert's decision, i.e. a judge's or jury's.

~~~
JoshTriplett
That would indeed produce incorrect training for a classifier of "who should
we pull out of line for a search". For that, you'd want to train exclusively
on the observable characteristics of the people pulled out of line for a
search versus whether you found something.
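
Concretely, the difference is just which label column you fit against. A
sketch, with every name and number below hypothetical:

    # Sketch of the labeling distinction above; the dataframe, columns,
    # and values are all hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    screenings = pd.DataFrame({
        "heart_rate":        [72, 95, 68, 110, 88],
        "breathing_rate":    [14, 22, 12, 25, 18],
        # Label choice 1: did the search actually find anything?
        "search_found_item": [0, 1, 0, 1, 0],
        # Label choice 2: was the person later convicted? This encodes
        # judge/jury decisions, i.e. the human bias woodson describes.
        "convicted":         [0, 1, 0, 0, 1],
    })

    features = screenings[["heart_rate", "breathing_rate"]]

    # Training on observable search outcomes keeps courtroom decisions
    # out of the classifier...
    direct = LogisticRegression().fit(features, screenings["search_found_item"])

    # ...whereas training on convictions learns whatever bias exists in
    # the justice system's output.
    biased = LogisticRegression().fit(features, screenings["convicted"])

(Training only on people already pulled out of line has its own selection
bias, of course, but at least the labels themselves are observable.)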

------
dkokelley
I'm a little disappointed with the response here. This isn't "racial
profiling".

 _FAST is designed to track and monitor, among other inputs, body movements,
voice pitch changes, prosody changes (alterations in the rhythm and intonation
of speech), eye movements, body heat changes, and breathing patterns.
Occupation and age are also considered. A government source told CNET that
blink rate and pupil variation are measured too._

This technology should only be used to say "look here" and assist the people
behind it. Ideally, it will help a few screeners identify people in a crowd
who are behaving abnormally. This 'tech' has been used in security for ages;
it's just that earlier we called it "I have a funny feeling about that guy",
and now we can say "this man is suspiciously nervous, and here's the science
to show it".

I don't want to get political, but the politically correct method of pure
random screening in airports is horribly ineffective. Security professionals
in unstable regions know this, which is why they are trained to look for the
exact things this device is looking for.

The danger of this technology isn't that it will start issuing arrest warrants
based on probable intent. The danger is that the people behind the technology
will get it wrong. History shows that we don't need technology to help get it
wrong.

Please don't confuse this technology with political positions. Has the TSA
gotten out of hand? Sure. Are our rights being violated? Probably in many
cases. Does this technology have anything to do with that? That depends
entirely on how it's used.

~~~
jambo
The real problem is that it presupposes thought-crime. The DHS document in the
sidebar says FAST will "identify individuals with malintent ... *Malintent:
the state of mind of individuals intending to cause harm ..."

Whether it's really scientific, I doubt we'll ever find out. But people will
be more likely to believe it's scientific because it comes out of a box that
uses 'algorithms'.

Another problem is one that Schneier writes about: profiling leaves
vulnerabilities that can be probed, discovered, and exploited. In the case of
trying to sense malicious intent, the exploit could be sending someone who
doesn't know they're an attacker, or someone who doesn't exhibit the outward
signs being profiled.

~~~
dkokelley
I agree that a system like this could be probed and exploited. The real issue
comes up after someone is flagged as suspicious. Is this person arrested right
away? Is this person subject to additional screening? It would violate basic
human decency to arrest someone for appearing suspicious, regardless of
whether or not he or she was 'scientifically suspicious'. Taking a closer look
at suspiciously behaving individuals doesn't seem to cross that line for me.
We've been doing that for years. The only difference now is that we have a
machine to help scale that effort.

~~~
IgorPartola
The other problem is that there is a whole grey area between "let the guy go,
he isn't doing anything" and "tackle and arrest him now". The person can be
detained, held up, put in a different queue, screened, probed, strip-searched,
etc.

Given the TSA's record so far, it will result in all sorts of situations that
violate people's rights without anyone ever being arrested.

I may be wrong, but it seems to me that this tech, just like the metal
detector and the porno scanner, is just a way of giving less-than-expert staff
the ability to perform expert-like security screenings. In other words,
someone with a two-week training course in how to fire up this magic box is
now charged with determining whether each person passing through their station
needs a secondary screening (of varying invasiveness). What is this employee
to do when the machine says "we've got to strip-search this person now" and
the person is a 95-year-old woman in a wheelchair? How about if it's a
5-year-old?

Maybe we need to give TSA security staff more training, or hire more
professional security staff who would use years of experience rather than
magic tools to determine who could be dangerous. Or do what is done in Mexican
airports: you push a button, and if it comes up red your bags get searched; if
green, you proceed. Random number generator FTW!
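
For what it's worth, that last scheme is about five lines of code (the 10%
search rate below is my guess, not the actual odds):

    import random

    SEARCH_RATE = 0.10  # invented parameter; the real odds aren't published

    def customs_button() -> str:
        """Uniformly random secondary screening: red = bag search, green = go."""
        return "red" if random.random() < SEARCH_RATE else "green"

    print(customs_button())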

------
bediger
The only thing DHS could do to make their image worse is to buy uniforms with
shiny boots and brown shirts for all their employees.

~~~
sneak
Consider for a moment that even if they did this tomorrow and announced it on
the front page of the Times, people would continue to live in America, pay
taxes, and take commercial flights.

The time to leave was some time ago. The first, second, fourth, and fifth
amendments are all gone, now.

~~~
forensic
People actually _look forward_ to dystopian Big Brother futures like Minority
Report now.

If you tell someone, "The world is looking like Minority Report," they go
"Cool!"

~~~
anamax
Of course they do.

MR-world has cool cars, is clean, the govt seems to have only one flaw, and
Tom Cruise will wander by and save the day.

------
softwaregravy
The least likely organization in the world to pull this off -- other than my
grandmother's knitting group, but they have enough sense not to try things
they will surely fail at.

------
plasticky
Wonder how many false positives this would generate at the metal detectors in
a Wal-Mart.

------
dataminer
These "precautionary" measures and profiling databases are being developed in
most parts of the world. Governments are becoming more and more intrusive in
personal lives in the name of security. I really wish someone would buy some
land or islands and create a modern country based on libertarian principles;
it would be an interesting experiment/startup, although I am not sure you can
start a country just by buying land.

~~~
VladRussian
Governments consist of the same people as you and me. Your imagined "modern
country based on libertarian principles" would consist of the same people, and
it would look the same (if not worse) after a short time. The problem isn't
government. The problem is "we, the people".

(My old country was the leader in implementing a "modern country based on the
best principles known to humanity at the time". The main mistake people make
when they look back is blaming the principles, saying those principles were
bad and these new ones are different and better. Nope. After all, the best
principles of yesterday - communism and fascism - looked very different (each
claiming to be the best), yet both produced the same result, and today we
understand how much they were the same. The best principles of today
(libertarianism according to you, Sharia Islam according to several hundred
million other people, ...) will be superseded by the best principles of
tomorrow. It isn't principles that kill millions of people and make other
people's lives a nightmare. It is other people who do it. It isn't a modern
country that we need; it is modern people. For example, people who simply
wouldn't compile the big database, wouldn't run the analysis, and wouldn't
issue the order to act upon the results. It is just that simple and just that
hard.)

------
alexsb92
What would happen to something like this if it starts generating a lot of
false positives? Considering it measures breathing and heart rate, could we
not easily make it give out false positives? It would probably inconvenience
us as individuals, but after a few months of tons of false positives, would
they not abandon it?
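
Some back-of-the-envelope arithmetic suggests false positives would dominate
even without deliberate spoofing (every number below is invented for
illustration):

    # Hypothetical figures: roughly 2 million US air passengers per day,
    # essentially zero actual attackers, and an optimistic 1%
    # false-positive rate.
    passengers_per_day = 2_000_000
    false_positive_rate = 0.01

    false_alarms = passengers_per_day * false_positive_rate
    print(false_alarms)  # 20,000 innocent people flagged every day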

~~~
forensic
It's more important that people believe it works than for it to actually work.

Security theatre works because people believe it works.

------
pavel_lishin
So this system will identify people who are nervous about being misidentified
by this system? Brilliant!

------
Batsu
Hate to play devil's advocate, but it sounds somewhat similar to what Israel
does at its airports, except leveraging technology to pick up all the details
(more than a human reasonably could) rather than training individuals to do
the same.

Methods, deployment (in terms of location) and data retention are obviously
the distinguishing factors here.

<http://news.ycombinator.com/item?id=1024850> for an article on it, from
around two years ago.

~~~
VladRussian
>Hate to play devil's advocate, but it sounds somewhat similar to what Israel
does

Who said that Israel is a champion of human rights, non-discrimination by race
or ethnicity, etc.?

Don't get me wrong - I'm not anti-Israel; I completely understand the
necessity of the war they are fighting for their survival. It is just that
when you fight a war for your survival, you don't have the luxury of
championing human rights. Israel's situation isn't an example to follow; it is
a problem to be fixed when the war is over.

------
rshm
1984 will be a reality in 2084.

~~~
maratd
Oh please. I just got a ticket in the mail because an automated camera decided
I did something wrong. It's here now.

