
Military needs AI help from companies like Google - elsewhen
https://www.axios.com/military-artificial-intelligence-google-contract-5c570912-092c-4378-a54b-cf119296fb38.html
======
eesmith
"Work said Google's participation could have saved lives"

That has been the justification for pretty much every military project. That's
the calculus which says that dropping nuclear weapons on Japan saved the lives
of the many more who would have died during a land invasion. That's the
calculus which says that the mass bombing of cities is good because it shortens
the war, which says that it's better to take out a bakery than a trench,
because the will to fight will disappear and peace will result.

But, maybe developing chemical weapons would save lives. Shall we do more of
that? Or biological weapons? How many more tens of billions will we spend on
ineffective (and IMO likely worthless) anti-ballistic missile systems which
"_could_ save lives"?

How many lives have we saved already?

How many have we killed?

Why not just declare the war won and go home?

When do we, in the words of Mitchell and Webb, become the baddies?

"to help the military to stay on the right side of ethical lines"

When have the top leadership of the US military been on the right side of
ethical lines?

I don't think they were on the right side when we were involved in the secret
wars in Asia. I certainly don't think they were on the right side during the
Indian Wars. So, when did they get on the right side, and how did that happen?

Did the military listen to Oppenheimer to help guide their ethics? Or did they
prefer to listen to Teller and Herman Kahn? If they didn't listen to
Oppenheimer, why would they listen to Google AI researchers?

"doesn't mean Silicon Valley will shy away from military contracts — it
hasn't, and it won't"

Of course it won't. Silicon Valley exists because of military contracts.
[https://steveblank.com/secret-history/](https://steveblank.com/secret-history/)

~~~
zzzcpan
Ethical lines don't exist for killing machines; they have only false dilemmas.

