
But don’t let that distract you; it was designed to kill people (2017) - hypertexthero
https://www.calebthompson.io/talks/dont-get-distracted/
======
eksemplar
I’m not sure his solutions are sound. I don’t think you should violate your
own morals, but the real issue isn’t that your code can be used for
wrongdoing, it’s that there are no consequences when it is.

I think weapons manufacturing is a better example than software, because it’s
much clearer. Weapons are necessary to defend free society, and when they are
misused, we hold the misusers to justice.

Or at least we used to. Because today America is actively bombing civilians in
7 different countries, and no one is ever going to be held accountable for
these outright human rights violations. I mean, it’s a war crime to kill civilians. I
don’t think you can blame the weapons manufacturers for this though, and I
don’t think we should have to. Because it’s our common ethics that should
prevent it.

I guess, though, that in a world without accountability, your individual
morals are all that’s left, but it’s our ethics that need to change if we
actually want change. Because if you personally refuse to write the software,
someone else will. Like I said, I think you should absolutely refuse to write
the software because you won’t be able to live with yourself, just don’t
expect the violators to stop unless we stop them.

~~~
3pt14159
Mistakes are going to happen and the world has changed. The battlefield is
everywhere now. It's not good, but from a high level the US leading the world
led to fewer combat deaths.

[https://www.vox.com/2015/6/23/8832311/war-casualties-600-years](https://www.vox.com/2015/6/23/8832311/war-casualties-600-years)

Note that that graph is logarithmic.

~~~
unstuckdev
One country waging most of the wars and maiming instead of killing is not an
improvement.

~~~
golergka
How is it not?

------
chroma
I don't understand this guy at all. Two things confuse me:

First, the purpose of a military is to be able to break things and harm people
as effectively as possible. Did he think his job would involve making the
military less effective?

Second, why is it bad to work on software that's used to kill people? Killing
isn't necessarily bad. If this software helped kill Abu Bakr al-Baghdadi or
Ayman al-Zawahiri, it would have a hugely positive impact upon the world.

To me, the post comes across as a bunch of meandering moral grandstanding. The
core thesis is either utterly pedestrian ("Think about the consequences of
what you’re building.") or totally fringe (that helping the US military is
immoral).

~~~
jstanley
Lots of people disagree with you, and believe that killing _is_ necessarily
bad.

No matter how bad the person you're killing is, it's still not OK to kill
(except in extreme circumstances of self-defence).

Capital punishment was abolished for a reason.

~~~
flyinglizard
What about killing to protect others, as part of your duty?

I'd argue that _not_ killing certain bad people is an immoral choice in
itself, designed to absolve oneself of responsibility at a great price to
others.

"We sleep soundly in our beds because rough men stand ready in the night to
visit violence on those who would do us harm"

~~~
jobigoud
You just need to incapacitate them not kill them.

~~~
nradov
That's not a viable option. There is no such thing as a truly effective
non-lethal weapon.

------
nharada
I think a lot of people are missing the point of this post, especially because
it offers an opportunity to nitpick specifics. But the point here isn't
"don't write code for the military" or "don't write code for killing" or
anything else, it's that we need to _think_ about whether we're okay with the
code we're writing. We need to think about the implications of the things we
build.

We're never going to find a hard line we all can agree on, aka "don't work on
missiles, but airplanes are okay" or something similar. But we can all stop to
think about what that line is for us personally, and if what we're working on
crosses it. What would we do if asked to violate that boundary we've created
for ourselves?

When I was in grad school, we had to take an ethics in engineering course. We
split into groups and discussed the ethics of building certain things, in my
group's case building predator drones. While I didn't agree with the person
who said they were okay building them because we need weapons, I was far less
concerned with that individual than the one who looked at me, straight faced,
and said "I don't get what airplanes have to do with morals".

~~~
CamperBob2
_I was far less concerned with that individual than the one who looked at me,
straight faced, and said "I don't get what airplanes have to do with morals"._

They're arguing that morality attaches to the user, not the tool.

See any gun-control argument in history, basically.

~~~
sullyj3
Let's accept for the moment that tools are morally neutral. Notwithstanding
that, when you're thinking in a consequentialist fashion, you don't just
ignore the knowledge you have of how the tools you build are going to be used.
You can say "well it's not up to me, it's not my responsibility to think about
how they'll be used", but you have to be upfront with yourself about the fact
that you're not taking moral responsibility for your actions.

~~~
CamperBob2
_but you have to be upfront with yourself about the fact that you're not
taking moral responsibility for your actions._

My concern isn't so much the nature or purpose of the tool I'm building, as it
is with whose hands the tool will ultimately end up in.

I'd be OK working on military drones, for instance, if I knew they wouldn't
end up in the hands of avaricious neoconservatives, LBJ-school paleoliberals,
religious nuts, or basically anyone who isn't someone whose judgment I
personally trust.

So in practice I wouldn't want to work on a military weapons project with no
valid civilian uses. But it's not because I believe drones or land mines or
atomic bombs are inherently evil. It's because I have no idea who will
eventually end up in control of those products, and because history suggests
that they will be used in ways I don't approve of.

I have the same attitude towards patents, basically, but that's a different
debate.

------
lifeformed
I think a lot of people here are getting hung up on some imagined implication
that any code that could ever be used for evil is wrong to make. Yes, this is
a grey area and there's no clear line of what the correct ethical choice is in
many situations. But the point is that it's barely discussed professionally at
all. Other disciplines have quite a bit of exploration into their professional
ethics, but software engineering seems to gloss over it. Yes, there is some
precedent, but it seems disproportionately small considering how ubiquitous
the role is becoming in our society.

It seems like it's not even in our vocabulary as engineers to comprehend the
ethics of our work. There's no framework for analysis or disclosure. Sure,
there aren't any easy answers to most situations, but have we even tried?

------
userbinator
Killing people is at the extreme end of ethical concerns, but a lot more
people are working on code that's designed to tighten corporations' control
over users or otherwise take away their freedom and privacy and increase
authoritarianism. Things like DRM, "security" ("because who _doesn't_ want to
be safe and secure?"), removing functionality that allows
extensibility/interoperability, etc. I've heard it phrased thus: "Do we want
to help them build better nooses to put around our necks?"

~~~
Something1234
I would rather write code to kill people, than to control what they watch or
consume, and how they consume content. Killing a few people is small in the
grand scheme of things. Restricting knowledge or access to knowledge can
destroy entire civilizations. Restricting content produces a world I would not
want to live in.

------
nkcmr
A _lot_ of folks here in the comments are missing the point and "getting
distracted" by the word "kill."

Sure, it was probably a bit short-sighted of the speaker to not connect the
dots and see that this software would be actively used to seek out and kill
people. But to me the core message of this talk applies to the very large grey
area that lies in between fully ethical software and stuff the DoD makes to
kill people (yes killing _can_ be necessary I know, I know, don't get
distracted!). And that grey area is mostly social media and ad-tech.

These are two domains which require software to be written that is actively
harmful to people's privacy and mental health. We see Twitter being used to
target and harass people to the point of suicide. Instagram has been precisely
designed to addict its users to a fake world that makes them feel like they
are nothing and that everyone is happier than they are, and it depresses them.

The examples I just cited are common criticisms and can start to echo in our
chamber here (HN). But how about a real, recent example: Facebook's use of
two-factor authentication phone numbers for ad targeting. Some engineer at
Facebook was given the requirements for this super shady and deceitful
functionality and chose to implement it anyway, without pushing back. It takes
advantage of folks who were simply trying to improve the security of their
accounts, and now those numbers are being used to target them with
advertisements.

Most of us will have long careers that don't involve writing software that
will kill people, but a stunning majority of us will be somewhere in this grey
area at one point or another, and you must think about what you make at that
point still.

------
dwheeler
Yes, it's important to understand what you're doing.

However: a country without a military (or an allied country with a military)
is very quickly not a country. If the US and Europe stopped having a (working)
military, they would be immediately taken over by totalitarian regimes who
would be delighted to trample over all the rights and privileges their
citizens currently enjoy. South Korea, Taiwan, and many other
countries/regions would instantly be destroyed by powerful and dangerous
neighbors. Free countries are not free because everyone around the world is
nice; they are free because people are willing to die to protect them.

A military _must_ have weapons that _can_ kill people. The real goal of such
weapons in a western democracy is not to kill people - it's to be _able_ to
kill people so that no one will take over or threaten the country and its
allies (at least not without consequences). It is entirely ethical to enable
self-defense, and self-defense is the purpose of the Department of Defense
(remember, its very name is "Defense"). The ACM code of ethics doesn't forbid
this, because it focuses on unintentional harm, not intentional harm from a
lawful order to protect a country. The author seems to think it's unethical to
enable self-defense, and that's just nonsense. Weapons (and anything else) can
be misused, but we need to hold the misusers responsible - not pretend that
they aren't needed. It's a good thing that military personnel are willing to
risk their lives to protect others, even those who don't appreciate their
sacrifices to do so.

~~~
mpiedrav
A counterexample: Costa Rica (where I'm from). We haven't had an army for the
last 70 years and still remain a country. Superpowers (e.g., US, China,
Russia) might need armies as means of mutual deterrence. Smaller countries not
so much.

~~~
tlow
This appears to be false.

> So, Who Protects Costa Rica? Costa Rica maintains its military-free status
> and does not command any military units or house any war weapons. However,
> the country does maintain alliances with other countries, such as the United
> States, that can be expected to assist in the event of war within Costa Rica.
> [https://qcostarica.com/costa-ricas-military-abolition-history-who-protects-costa-rica/](https://qcostarica.com/costa-ricas-military-abolition-history-who-protects-costa-rica/)

~~~
dguaraglia
Hm... not exactly sure how your quote stating that Costa Rica maintains a
military-free status falsifies the grandparent's comment stating exactly the
same.

~~~
albntomat0
It doesn't directly falsify the fact that Costa Rica lacks a military, but
shows that their solution does not scale in a useful manner.

Edit: It directly contradicts the usage of the claim that small countries do
not need a military. Essentially, borrowing deterrence from another country
doesn't result in the world being military-free.

------
twtw
> a tool to use phones to find WiFi signals.

> Does it find phones ... This was never about finding better WiFi. We were
> always finding phones. Phones carried by people. Remember I said I was
> working for a Department of Defense contractor? The DoD is the military. I
> was building a tool for the military to find people based on where their
> phones were, and shoot them.

I got distracted by this utter failure to define the objectives and
requirements of the project. If you want to find phones, don't start by using
phones to find wifi routers.

But yeah, if you work for the DoD, you should probably be ok with the stuff
you are working on being used to kill people. It's a big part of what they do.

~~~
foobarian
And it's not clear that it was a requirement to implement this on an
off-the-shelf phone. There are plenty of developer-friendly chipsets out there
that are easy to sniff with, and this could've been run off a handheld device.

------
United857
Where do you draw the line? Almost any significant technology can be applied
for military purposes. Likewise, the Internet itself came from a DOD research
project; military technologies can be repurposed for peaceful uses.

~~~
sa46
Yes, this is what I don't understand. I'd wager the most dangerous weapon used
by the US military is Microsoft Word or PowerPoint, used to produce operations
orders.

It seems that if a product's utility isn't directly tied to a military
outcome, then the outrage never materializes.

Project Maven caused outrage because of the direct link to military drones.
However, advancements in AI and machine learning haven't caused the same
outrage but can be used for the same purpose.

------
skybrian
I feel like this underestimates the complexity of the problem.

Both markets and open-source software run on abstraction. (The official open
source definition doesn't even allow restrictions based on field of use.)

If people want to implement "know your customer" like the banks do, and make
decisions based on their own political values, then customers need to share a
lot more information and there isn't going to be very much privacy. Buying
services is going to require a lot of hoop-jumping.

And then consider the effect on product design. Unless countermeasures are
built in, a copier can be used to counterfeit money. And that's an easy case.

Now we have a simple service for distributing text messages making front page
headlines for its effect on society.

It was naive, but the assumption that customers are responsible for their own
actions was a useful fiction. A society of mutual distrust makes everything
difficult.

------
booleandilemma
The author is naive beyond belief.

What kind of software did he think the DOD would have him writing? Fart apps?

~~~
tokai
Maybe software that wouldn't be used to bomb civilians, without visual
confirmation, in countries that the US is not at war with?

I mean, I would be okay with producing firearms to protect my country. I
wouldn't be okay with the CIA sending those guns to death squads in South
America.

A lot of posters here are treating all warring and killing as equal. But there
is a huge span from proportional and moral use of deadly force, to doing what
would be treated as a war crime if it wasn't carried out by the world's
foremost superpower.

------
r00fus
Great article. I believe it's important to understand the purpose of
everything you do.

This story crystallizes that into a very compelling advisory.

Often we do things without that understanding in mind and that can lead to
many problems, including miscommunication, errors and possibly what this
article alludes to.

------
IshKebab
I only skimmed it but this doesn't really make sense.

a) Obviously if you work for the military, killing people is going to be
involved somewhere. That's what the military does!

b) Why would they put so much effort into locating wifi routers when they
wanted to find phones? They have no need to hide that objective - it's a
perfectly obvious thing for the military to want to do.

c) I didn't get that far but is he assuming that "find a phone" = "kill the
owner"?

------
forapurpose
The military is an absolutely essential institution, but for the sake of the
civilians _and the soldiers_ , we shouldn't forget its function: Kill people
and destroy their creations. I saw a recruiting commercial for the U.S.
military showing an aircraft carrier and calling it a 'global force for good'.
That misleads recruits: It's a global force of death and destruction, and that
will be your job if you sign up. We don't like to think that, and we can't let
that result in rejecting all use of the military - which is just as
irresponsible - but we must face the reality of a very serious subject so we
can think seriously about it.

When we imagine that the military is something else, especially something
glorious, not only do we risk the worst evil of humanity, war, but we also
harm the soldiers (and sailors): War is very damaging to them, and not just
the dead and the physically wounded, but causing and experiencing death and
destruction results in great psychological harm. Humans are not cut out to do
it: IIRC the details, on D-Day in WWII, half the soldiers didn't fire their
weapons when they should have. After every war, you can read that people 'were
not the same' when they returned; many are damaged. Suicides are (or recently
were) very high among U.S. soldiers, and the current wars are relatively very
low risk for them. I've read interviews with elite special forces soldiers who
talk about how hard it is, psychologically, to kill.

Another consequence is that we put soldiers in positions to fail: We send them
to wars that they cannot win, usually because we ignore the essential
requirement of a stable political outcome - Afghanistan and Iraq are only the
two most recent examples. Many in the U.S. like to imagine an invincible
force, a panacea for international problems, but just a brief glance at
history shows otherwise: since WWII, there has been one clear victory (Gulf
War), two endless stalemates (Korea, Afghanistan), one ongoing quagmire of
mostly negative results (Iraq), and one loss (Vietnam). We also ask soldiers
to do jobs they are not trained for, such as policing: Police are there to
bring and maintain peace and public order; soldiers are trained to do the
opposite, kill and destroy.

When we have a clear idea of a military's function, we can align outcomes with
our values, minimize the use of soldiers' precious lives and health, and put
them in a position to succeed. To ignore the reality of the military's
function, of killing people and destroying their civilization, is immoral IMO.

~~~
anticensor
> of killing people and destroying their civilization

You should swap these two parts, making it "destroying their civilization and
killing their people". As it stands it is ambiguous: it suggests the dangerous
act of directing a military against its own people, which has happened in
history and brought bad results every time.

------
foobarbecue
LOL @ "North Virginia" . Also, when he says R^2, is he talking about Pearson's
correlation coefficient? That paragraph is confusing, and in the next one he
admits being mystified by the idea of gaussian distributions... I guess I
should stop trying to make sense of the technical elements of this article.

------
lifeisstillgood
I don't see this as a "don't help the military" post. This is a post about
needing a _profession_ of software engineering.

There are, clearly, ethical lines. Leaving aside where the line and the
military cross, it is important to think how we will build such a profession -
and enforce membership (which is the whole point)

Also worth noting is a common medical ethics quiz: you are an ER doctor, and a
college football player comes in after a road traffic accident: spine
shattered, internal bleeding, conscious and not in pain, but he needs an
operation to stem the bleeding.

He clearly and openly states that as he will never walk again, he does not
wish to live.

Do you operate?

Most doctors, it seems, operate, and oddly that is a violation of most ethics
board recommendations.

~~~
leetcrew
if i understand correctly, it would probably be illegal (at least in the US)
for a doctor not to operate in that situation. the only circumstance i know of
where a doctor can willfully allow a patient to die is when a notarized do not
resuscitate order has been filed. even in this situation, they would still
probably be obligated to try any other means at their disposal to save the
patient.

~~~
zbentley
I don’t think the point is that there is an obvious right answer, but rather
that the existence of formalized (ish) ethics and codes of professional
conduct frequently distinguishes the medical profession from software
engineering.

~~~
leetcrew
sure, but it seemed worth pointing out that the ethical behavior prescribed by
the board is probably illegal. i don't want a formalized code of ethics in my
field that might require me to break the law.

------
wheresvic1
This is a wonderful article and interestingly one of the more poignant bits is
a quote from someone in the film industry (to paraphrase):

"We meet with a lot of startups and the only question is 'Can this be done?'
Nobody is stopping to ask questions regarding the ethics."

That being said however, it is really tricky to come to any easy conclusions.
We live in a complex world and it's not clear even after exhaustive
questioning what damage could be done with the work that one is doing.

Fire was one of man's greatest discoveries - but if our ancestors had stopped
to consider all the dangerous uses it could be put to, it is quite possible we
would not be sitting here today on HN...

------
dorfsmay
The article is interesting, but not a new problem. Any technology can be used
for good or bad (within a given moral framework). Yes, we should all keep it
in mind, and ensure governments representing us regulate technologies, or do
not use them for unethical purposes. The article doesn't even address this.

Unfortunately, the author spends 6 or 7 long paragraphs building tension,
repeating the same sentence for dramatic effect, to finally arrive at the
pretend-shocking discovery that the DoD and its contractors produce lethal
technologies. That ruined it for me.

------
jwilk
Bill Sourour's blog post discussed on HN:

[https://news.ycombinator.com/item?id=12965589](https://news.ycombinator.com/item?id=12965589)

------
cgag
I'd have worked on greyball if it was pitched as a tool for disrupting
justice, but would feel bad if it was for people who threw up in the car too
many times.

I see these kinds of articles constantly. Are there really that many people
out there who aren't aware of their own responsibility for their actions?

------
Buge
He seems to be saying don't build something if it's possible that some people
could do bad stuff with it. That would mean we shouldn't build end to end
encryption, because terrorists could use it to hide. And that we shouldn't
build Tor, because child pornographers could use it.

------
tomcam
The job said “Department of Defense” right on the tin. He was not misled in
the slightest. He said it paid half again as much as other internships, but I
see no mention that he returned any of his blood-soaked gains.

------
mmirate
... I think the ethical responsibility here is with the trigger-pullers and
their chain of command; not the phone-finders.

~~~
justtopost
It's both. Absolving yourself of hunting Jews for Nazis because you didn't
personally pull the trigger is morally and ethically bankrupt. Your line of
thinking is exactly the problem.

~~~
leetcrew
how many levels of indirection does it take before you are no longer "morally
and ethically bankrupt"?

is the janitor at CIA partially responsible for extrajudicial killings? is it
morally bankrupt to pay federal income tax, a portion of which will finance
the killing machines?

~~~
eeZah7Ux
> is the janitor at CIA partially responsible for extrajudicial killings?

Yes, obviously to a lower extent.

> is it morally bankrupt to pay federal income tax, a portion of which will
> finance the killing machines?

False dichotomy. Moral responsibility is not boolean.

We make hundreds of small decisions every day that have social implications.

Obviously choosing to live, work, and pay taxes in countries with governments
with a history of violence has moral implications.

~~~
mmirate
If you look back far enough, nearly every on-land government still extant
today has a history of violence; be it that they conquered the land they
inhabit, that they invaded their neighbors so that they might not be
themselves invaded, that they were established due to their predecessors being
violently overthrown, and so on.

Governments really suck.

------
svilen_dobrev
Leave aside the article; it's done its job: just read the comments in this
thread. My oh my.

------
PeCaN
tl;dr man working for the military is shocked to find out that his code is
helpful to the military. edgy writing about ethics ensues.

------
gammateam
This was my experience too in the DC area: anyone with a Computer Science
degree is getting scooped up by the intelligence community, and you will get
interviewed by spooks because your friends are interviewing with the NSA.

I did some contracting for a Department of Defense subcontractor too, and then
got the hell out of that town.

Those people are twisted. Their ideology is twisted. And your parents are just
excited that you got an interview at any job.

------
malmsteen
I never really got why the military has such bad press among the programmer
crowd.

I mean, sure, killing people is bad in a society, but it's precisely because
the world has so far been a succession of ruthless wars between groups of
people that having a military to protect you is a good thing.

I know the "military kills unethically etc." and "we are a peaceful world
etc." arguments, but maybe people are getting a bit too, and wrongly,
"certain" that it will last. If anything, history has shown that it's a
succession of cycles until the next war. Better to be the strongest.

I've been in my country's military (Europe) and most of the people are not
psychopaths (a few, sure). They are normal people who think their job is an
important one: to protect peace at home.

So yeah, "just don't go to a job that makes you uncomfortable" is the only
takeaway I get from that article, but I don't share the military shaming. It
would have been more convincing with an oil industry or "on-demand market"
example.

~~~
blotter_paper
>I've been in my country's military (Europe) and most of the people are not
psychopaths (a few, sure). They are normal people who think their job is an
important one: to protect peace at home.

If you live in Switzerland, that's a reasonable expectation for those people
to have. If you live in the UK, it's naivety to the point of intentional
ignorance. If you join a military that has a habit of starting foreign wars,
you can't hide behind the make-believe motive of protecting your country from
invading forces. You know what you're getting into.

------
coolspot
“I came here with a simple dream...a dream of killing all humans.”

~~~
anticensor
No. A military is designed to kill some people for the sake of some others'
lives. Enemy detection is vital to a military's function; otherwise it will
obliterate itself.

------
mesozoic
This is pretty dumb. Maybe I have a good imagination for malice, but if I
thought through every possible bad use of some code, I could never write any
code or produce nearly any item in the world. I suppose this parallels the
debate over whether guns themselves are bad or not.

