
The military is building long-range facial recognition that works in the dark - etxm
https://onezero.medium.com/the-military-is-building-long-range-facial-recognition-that-works-in-the-dark-4f752fa713e6
======
csb6
The responses on HN to this facial recognition technology vs. China’s facial
recognition technology are mind-boggling. Commenters saw the Chinese tech as
dystopian (rightly so), yet see this technology as a way “to make sure
we're getting the right people”, while allowing that we might still want to
think about how its use could eventually go “too far”.

If China’s facial recognition system is currently “too far”, how is this tech
not also already too far? I guess if a technology is only used to recognize
and assassinate foreign nationals, and not to surveil citizens (which it will
eventually be used to do), most Americans are okay with it. Some commenters
are critical of this research, but the level of concern in these comments is
far lower than on posts about similar Chinese systems.

The point isn’t that this tech could increase accuracy and kill somewhat fewer
civilians than U.S. drone and air strikes currently kill around the world. The
entire basis for this activity - shooting missiles into civilian areas
thousands of miles from home in endless wars - is the issue. The fact that the
military sees a use for this kind of technology is the core of the problem,
and no matter how well it works, it will only increase the efficiency of
assassinations performed by the U.S. military, not abolish them.

------
ilamont
_“Fusion of an established identity and information we know about allows us to
decide and act with greater focus, and if needed, lethality,” the DFBA’s
director wrote in presentation notes last year._

It also opens up the possibility of weaponry optimized for an individual
target's physical and mental weaknesses, personalized propaganda, and attacks
on people's social connections, including noncombatants.

------
dumbfoundded
Framing is key for how you think about these military technologies.

You could frame this as the government making it possible to kill people in
the dark automatically or as another data input to a vast array of data
sources used to make life/death decisions for high-value military targets.

The technology has the potential to make sure we're getting the right people,
but its use will almost certainly be pushed too far. It has utility, and we
should rightly be concerned that it not be used outside its limited intended
application.

~~~
titzer
Well, you mentioned framing, but aren't you even a little worried that the US
military continues to kill thousands of people by drone strike with total
impunity across several undeclared war zones? This isn't theoretical. It is
going on now. This technology will be used to kill people with absolutely
_zero_ accountability. Maybe a planet where extrajudicial killings are
hyper-efficient isn't where we should invest our best minds and our money. I
reject the framing where we have to accept this.

~~~
AnimalMuppet
Given that they're killing people, with zero accountability, would you prefer
technology to help them do so more accurately, or not? Would you prefer
_fewer_ civilians as collateral damage, or more?

Yes, I know, you'd prefer that they _stop killing people_. To do that, you
have to either 1) persuade a large number of people to stop trying to kill
Americans, 2) persuade America to get out of the Middle East and just let
things happen over there however they're going to happen, or 3) persuade the
powers that be that, even if we stay in the Middle East, killing people that
want to kill us is not the most effective route to peace.

And those are fine goals. But until you achieve one of those three, it isn't
amiss that, when we're trying to kill someone, we do the best we can to
actually kill one of the people that's trying to kill some of us, and not some
random other person.

~~~
jacobush
It used to be hard to kill someone. You'd need a sniper team, or a strike
force. You'd take out your target and anyone putting up a fight, along with
collateral damage. You'd run the risk that your team would get killed. Today,
you don't run those risks. The stakes are low. More people can be killed with
less fuss, and little political fallout at home.

~~~
AnimalMuppet
Fair point. There's Somebody's Law (I forget whose) that adding lanes to roads
increases travel time, because traffic increases more than enough to
compensate for the new lanes. This may have a similar effect - decreasing the
per-kill collateral damage (or at least media attention) may result in even
more killing.

~~~
ithkuil
Perhaps "Jevons paradox" was the name on the tip of your tongue?

~~~
AnimalMuppet
No, but it's related. The one I was thinking of was specifically related to
roads and highway traffic.

But, for what I was trying to say on this topic, the Jevons paradox works, too.

~~~
ithkuil
Braess' paradox is specific to roads, although IMHO it's caused by a different
mechanism, and IIRC it only applies when a route is completely removed (or
crippled enough to achieve the same effect).
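
(For reference, the textbook Braess setup is easy to check numerically. The
4000-driver example below is the standard illustration, not anything from the
article; the usual statement adds a road, and removing it is the mirror image.)

```python
# Classic Braess' paradox example: 4000 drivers travel from A to B.
# Each of the two original routes has one congestion leg (n/100 minutes
# when n drivers use it) and one fixed 45-minute leg.

DRIVERS = 4000

# Without the shortcut, drivers split evenly between the two routes.
per_route = DRIVERS // 2
time_before = per_route / 100 + 45  # 20 + 45 = 65 minutes each

# Add a zero-cost shortcut joining the two congestion legs. Taking it is
# now every driver's dominant strategy, so all 4000 drivers pile onto
# BOTH congestion legs.
time_after = DRIVERS / 100 + 0 + DRIVERS / 100  # 40 + 0 + 40 = 80 minutes

print(time_before, time_after)  # the new road made everyone 15 minutes slower
```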

------
justforyou
Given the current state of best-of-breed, up-close facial recognition, using
the term "works" in a life-and-death scenario is an irresponsible overreach.

"Fail early, kill innocents often" is a terrible paradigm.

~~~
saulrh
Doesn't have to be perfect, just has to be better than human performance. I
heard a _lot_ of stories out of Afghanistan and Iraq that boiled down to "They
had a big thing on their shoulder and it was pointed at a tank, so we had to
kill them", never mind that one time in ten it was a TV camera.

~~~
Jweb_Guru
> Doesn't have to be perfect, just has to be better than human performance.

Firstly, it's about time we stop pretending that people wait until technology
does something better than humans do before they deploy it. It will be
deployed long before that point. Secondly, there are more reasons to be
concerned about this sort of technology than just whether it is effective at
its intended purpose (identifying people who do not want to be found, at long
range, for the purpose of assassination).

~~~
lucasmullens
I guess I'm a bit confused - why would they deploy something while humans
still do the job better without it?

~~~
d4mi3n
Politics. Money. Expediency. The cheaper and easier you make it to do
something, the easier it is to get signoff to do it. Humans require oversight
and care. A robot staking out an area? Not nearly as much, and there isn't
nearly as much backlash when a robot doesn't come back home.

------
sudoaza
Soon you'll be droned by face recognition, oh the future!

------
jvanderbot
Tangential: That face is not the one I would have pictured when looking at the
IR image. It looks like some weird "white-hot" version of IR plus ambient
lighting, and once transformed, lost the mustache entirely.

I'm sure there's a reluctance to put out the real capabilities directly, but
I'm also sure there's a reluctance to put out the real weaknesses directly.

~~~
mason55
> _and once transformed, lost the mustache entirely._

Isn't this what you want? A system that could be fooled by a little facial
hair seems like it would be pretty useless.

~~~
jvanderbot
Well I suppose so! It definitely kept the top hair so I assumed it would be
matching as-is.

------
superbrane
Curious whether this will work. About a year ago I was reading about US
military research into a portable, personnel-use device that could combine
night vision and thermal vision in a single unit - I don't think they managed
a breakthrough there. This face detection would work well if ported to such a
device.

------
yayajacky
More kill-loop automation? Hope it doesn't kill any QA testers or have
datetime bugs.

Reminds me of Captain America: The Winter Soldier
[https://www.youtube.com/watch?v=3ru5wM7fl7g](https://www.youtube.com/watch?v=3ru5wM7fl7g)

"Deploy the algorithm. Algorithm deployed"

------
Aaronstotle
Does anyone know of ways to combat these systems - something like a special
pattern that makes them hard to read? China is leading the way in facial
recognition, and I'd be surprised if there aren't any countermeasures
available.

~~~
xnyan
It depends on the scenario. If all you’re trying to do is stop it from
working, the article says it’s IR-based, so jamming the sensor (the camera)
with a lot of IR radiation can work quite nicely.

Individually you could do this with bright IR-emitting LEDs. On the scale of
the battlefield, a cool trick already being done today on the yachts of the
rich and private is using a laser to shine a lot of light directly onto the
CMOS (sensor element) of the capturing device when the shutter opens.

These methods work, but they don’t hide what they’re doing (jamming). It would
be instantly obvious what you were doing, which would be OK on the battlefield
if you’re not trying to hide your position - not so much in China.

------
EGreg
Every day we are getting closer to slaughterbots
([https://www.youtube.com/watch?v=9CO6M2HsoIA](https://www.youtube.com/watch?v=9CO6M2HsoIA))

------
akeck
A possible result is that adversaries will deploy robots that all emit the
same radiation patterns in lieu of identifiable persons.

~~~
zimmertr
I'm not sure I'd consider this a possibility as much as a guaranteed evolution
of defense.

------
agumonkey
Evolutionary pressure will now make chameleon genes reappear with
morphogenesis located on our faces

------
coldcode
I seem to recall the Terminator UI did all this before deciding what to do. Is
that our future - killer robots wandering around killing suspected
"terrorists" and other undesirables? I suppose if you combine China's social
score data collection and this tech with Judge Dredd-like robots, our society
will turn out like a kind of Minority Report, where data leads to pre-crime
termination.

Not a future I want to be in.

~~~
188201
The future is now. Drone assassination has become a common tactic for killing
suspects. The tactic is pretty effective and it will not go away, with or
without AI.

~~~
jacobush
The _tactic_ is pretty effective.

The _strategy_ is sowing dragon's teeth. Every collateral drone strike breeds
resentment for generations. The chickens will come home to roost. (But then
the MIC can sell even more weapons, so I guess everything will be fine.)

~~~
red75prime
How do collateral drone strikes differ from soldiers' errors and war crimes in
this respect? Or even from killing combatants who are someone's children?

~~~
jacobush
Even the sloppiest teams on the ground don’t routinely kill 100 wedding
guests, time and time again, the way drone strikes do. And if they do, they
don’t get to go home at 5 pm and have dinner in their suburb after a deed
well done. Drones scale out with little immediate cost, other than festering
resentment half a world away.

Currently the drone operators experience PTSD from loitering over targets for
hours, but with increased automation, maybe someone won’t even have to look at
the images. Just “authenticate” a strike based on weighted parameters. War can
be made much more streamlined yet.

~~~
red75prime
The era when battles were up close and personal isn't known for its
peacefulness. An exception is the Pax Romana, when enemies were crushed and
more or less successfully integrated.

~~~
jacobush
Are you saying we have peaceful times because of drone strikes?

I rather believe other factors are at play and we have peaceful times
_despite_ drone strikes.

------
lm28469
When killing the "bad guys" is more important than feeding your own people. I
love how they use the term "target" instead of "person", they're not even
trying to hide it.

------
colechristensen
Prediction: the 21st century will see computer aided technology for
identification and tracking banned like chemical weapons in the 20th.

Corollary: not until widespread use and abuse highlights the danger.

~~~
UnFleshedOne
Likely it won't be banned, but limited in scope (so no autonomous firing, but
yes to RoboCop-style visual-field highlights, etc.).

------
jellicle
The only real application here is killing people with drone-launched missiles.

