
As engineers, we must consider the ethical implications of our work - jvoorhis
http://www.theguardian.com/commentisfree/2013/dec/05/engineering-moral-effects-technology-impact
======
grellas
Evil behavior, however precisely defined, has been and always will be with us.
Technology enhances what we do as human beings and, hence, always has the
potential to be applied to ill uses. If someone, then, takes what you develop
and applies it for a purpose you never intended in creating it, that is an
item beyond your control. Alan Turing - who applied his genius to confer what
can only be called immeasurable benefits on society and who used his skills to
crack Nazi codes to help end a terrible war - is _not_ ethically responsible
for the many consequences inevitably brought into the world simply because
computing power can be used to magnify the effects of human evil. Was he (or
is any other engineer whose technology is misused) a causal agent in the
various bad outcomes we can identify in, for example, the enhanced lethality
of weaponry or in the massive spying by governments on their citizenry? In a
narrow sense, perhaps yes. When one traces things back up a causal chain, one
can theoretically identify every individual actor who made technical
innovations that culminated ultimately in a particular bad use of whatever
type that afflicts us today. But, though a cause-in-fact, Mr. Turing (and the
many engineers who followed him respecting any given facet of computing
technology) is not what the lawyers call the "proximate cause" - that is, the
immediately enabling agent - of the outcome. Meaning, if you deem it unethical
to build bombs, then don't do work for a defense contractor helping to build
bombs because your every innovation will be immediately applied to a use you
deem unethical. The same for working for NSA in developing sophisticated
spying technology. Or for whatever other ill use you can identify in society.
But, beyond avoiding direct conduct by which you are proximately helping to
cause an outcome you deem wrong, you as a technologist basically have no
control over how your work may be applied by others and, as the collective
results of such work eventually permeate society, your moral responsibility
for the indirect results of your work effectively stands at zero. If the
operative standard were otherwise, then all innovation would stand frozen
altogether because it is always possible to conceive of an ill use for any
technology that makes things faster, more powerful, more efficient, etc. Put
any such thing into the hands of human actors and _some_ bad results are
guaranteed to follow given enough time and opportunity. Thus, unless one is to
freeze all productive activity or is to go insane second-guessing how others
might pervert that which is being done for good, engineers must perforce
ignore tangential ethical implications over which they have no effective
control.

I think it is fair to say that each of us in our given professions (mine being
law) ought to avoid being a proximate cause of something deemed wrong even
though technically legal (for example, I would not be a "mob lawyer" even
though there are some technically very good lawyers who do that work). But
even here that is an individual choice for each actor to make. For engineers,
some may see it as a great opportunity to do advanced work in some of the
social media companies while others may regard such companies as being engaged
in unethical conduct as they at least sometimes use dubious techniques to try
to corral us as consumers into their tight little worlds. For any given
engineer,
working for such a company doing such things is a matter of conscience. Some
may say yes, others no. The same is true in working for a defense contractor
or for the government. Or for any other work that is legal but ethically
suspect in the eyes of some but not others. It is your choice and it is your
conscience.

The author of this piece reflects what I call the bane of associational
thinking. He uses the royal "we" to define a group (here, engineers) and then
prescribes very broad goals for what "we" "ought" to do. Since all the things
described now stand as a matter of private choice over which the "we" group as
a whole has no say, then the only way to translate this sort of thing into
practical action is to form formal associations, assign things to committees,
and then issue a series of prescriptions on what the group members ought to
do. That may be fine in terms of the association giving rules that amount to
exhortations to do good (who would disagree with that?). But, beyond that, do
you really want an organized association dictating what today stand as your
private choices for your career? Or, worse, do you want such an association to
lobby governments to adopt their strictures and give them the force of law? I
would think not. How, then, can "we" do better? Of course, the question is
always described as difficult and is left for further discussion precisely
because it has no real answer apart from acting on individual conscience or
apart from the potentially coercive ones of letting some association or
government dictate your career choices and opportunities. Perhaps this sort of
reasoning is justified as encouraging people to have a heightened conscience
about what they do, and in that respect it is fine. But that is really as far
as it goes before veering into unacceptable alternatives.

Our capacity to do wrong is innate to our nature, as is our capacity to do
good. We should not stop trying to do good through our creativity just because
others can take what we do and commit wrongs with it. Nor should we feel
guilty about what we do as long as we in good conscience can say to ourselves
that we are doing something productive and worthwhile and not directly causing
harm to others. The "we" issue is in reality much more of an "I" issue and,
for that, you should examine what you do carefully and strive for the good
regardless of what others may do with it. If you want to exhort others to do
better by your standards, then all the better. Just don't dictate to them on
matters over which people in good conscience may disagree.

~~~
rayiner
Your point about causation is a good one. It's worth thinking through the
causal chain of how your work is used as an engineer, even if you conclude
that you are not the proximate cause and therefore do not bear responsibility.

I live in the Delaware Valley, an area of the country devastated by engineers.
Engineers built communications and automation technology, which has allowed
executives to outsource and eliminate jobs far more quickly than people here
can get trained for new ones. The human toll of these changes has been higher
than that of drone and surveillance technology combined.

Are engineers working in communications and automation the proximate cause of
these changes? Maybe or maybe not. But how long or short is the chain of
causation? I think it's not very deep, not when an automation company might
market its technology by mentioning the labor cost savings. At the very least,
before anyone gets sanctimonious, they should think about what sorts of
impacts their own work has on other people.

~~~
kreek
In my previous job this hit home quite hard: one day, many of the people who
helped us build the system that automated their jobs were laid off.

Then once the system was fully streamlined we developers were laid off as
well.

In addition some of the end product was used in aerospace so there's a good
chance it could be used for 'bad' (depending on your perspective) things.

Now I work in video games, not curing cancer, but in my search I was looking
for a company that at least did no harm.

~~~
munificent
> Now I work in video games, not curing cancer, but in my search I was looking
> for a company that at least did no harm.

Ironically, one of the reasons I left EA was that I saw my CTO and half of the
programmers around me assigned to the task of figuring out how to outsource
more development. That didn't seem like a winning proposition to me.

~~~
Crito
I would say that outsourcing development jobs is _at least_ a wash. Relatively
wealthy people in developed countries may be _(temporarily, let's be honest)_
out of work, but far less wealthy people living in less developed countries
will have the opportunity to make what is, for them, a good wage.

Automation is harder to justify in this way. Outsourcing moves jobs around;
automation is intended to eliminate them _(yeah yeah, we need people to make
and fix the robots, but let's be real: there is a net loss of jobs, and we can
only hope that cheaper products will trigger the creation of new, largely
unrelated, jobs)._

~~~
morrad
> automation is intended to eliminate [jobs]

Automation has the potential to eliminate jobs, but also has the potential to
allow much more work to be done by a single person, or allowing that person to
do the same work with less effort.

Perhaps I'm overly simplifying your words, but I don't think automation is
inherently "evil". Like all tools or techniques, it may be used toward good
or bad ends.

~~~
Crito
Well, I wouldn't say that automation is evil, and automation certainly can be
used primarily to scale processes, but I think that if a process is already
running at capacity (say, you are already producing more wheat than the world
needs), then automation will tend to reduce prices (or at least costs) and
reduce jobs. The end-game is total automation (hopefully with everybody
enjoying the fruits of that past labor, Star Trek style).

I think that automation in general is a worthwhile endeavor, but we need to be
mindful of the downsides and modify our society as we implement more
automation to ensure that we are not causing undue harm. I believe that
various forms of social safety-nets will become essential as we march towards
automation's logical conclusion.

~~~
stephenbez
So if we used to build a road by having a group of 50 guys with shovels,
should we just ignore the invention of the bulldozer so these men don't lose
their jobs?

95% of Americans used to work in agriculture. Should we still all be farmers
today because if we adopt technology then some of the farmers would lose their
jobs?

------
worldvoyageur
Since 1925, in a voluntary ritual near the end of a Canadian engineering
degree, almost-graduates swear an oath and, upon doing so, are given a small
card and a ring.

[http://en.wikipedia.org/wiki/Ritual_of_the_Calling_of_an_Eng...](http://en.wikipedia.org/wiki/Ritual_of_the_Calling_of_an_Engineer)

I swore my oath more than 25 years ago and I still have the card in my wallet
and the iron ring on the smallest finger of my working hand. A couple times a
year, I pull out the card and read it.

The oath reads:

" I [worldvoyageur], in the presence of these my betters and my equals in my
Calling, bind myself upon my Honour and Cold Iron, that, to the best of my
knowledge and power, I will not henceforth suffer or pass, or be privy to the
passing of, Bad Workmanship or Faulty Material in aught that concerns my works
before mankind as an Engineer, or in my dealings with my own Soul before my
Maker.

MY TIME I will not refuse; my Thought I will not grudge; my Care I will not
deny towards the honour, use, stability and perfection of any works to which I
may be called to set my hand.

MY FAIR WAGES for that work I will openly take. My Reputation in my Calling I
will honourably guard; but I will in no way go about to compass or wrest
judgement or gratification from any one with whom I may deal. And Further, I
will early and warily strive my uttermost against professional jealousy or the
belittling of my working colleagues in any field of their labour.

FOR MY ASSURED FAILURES and derelictions, I ask pardon beforehand of my
betters and my equals in my Calling here assembled; praying that in the hour
of my temptations, weakness and weariness, the memory of this my Obligation
and of the company before whom it was entered into, may return to aid, comfort
and restrain. "

~~~
AsymetricCom
tl;dr: how to be a good robot.

~~~
kyzyl
No. How to be a responsible engineer who doesn't get people killed. There are
situations in life that require one to grow the fuck up, and being a
professional engineer is one of them.

~~~
munificent
> How to be a responsible engineer who doesn't get people killed.

Actually, the above oath will maximize the number of people killed if the
sworn engineer happens to be working on a weapon. There is absolutely nothing
in that oath about not doing harm.

~~~
jff
There's a difference between a collapsing bridge (the original reason for the
oath) and a weapon. You knew from the start that the weapon was going to kill
people. I'm ok with designing weapons; my ethical beliefs allow that. It would
be unethical to accept the project but deliver a weapon that didn't work
properly.

~~~
psykotic
> I'm ok with designing weapons; my ethical beliefs allow that. It would be
> unethical to accept the project but deliver a weapon that didn't work
> properly.

Sabotage for a good cause is unethical? You have strange ethical beliefs.

------
kabdib
I've had Richard Stallman personally lambaste me for unethical behavior,
namely by not quitting my job at Apple because he disagreed with Apple's
behavior.

While I agree that ethics are important, there are people who will use ethics-
like arguments to manipulate you.

I make shooter video games. Is that a bad thing? According to some people I'm
evil. I've written software that people used to write software that people
used to kill people. Is that okay?

Saying, "We gotta follow ethics" is great stuff, but I'm unwilling to be used.

~~~
icambron
I'm sympathetic to the broad point you're making, but your example isn't very
good:

> I make shooter video games. Is that a bad thing? According to some people
> I'm evil.

If you thought that shooter video games were bad and you built them anyway,
that would be poor ethics. That other people think shooter video games are
evil isn't what matters; it's what you think that matters.

~~~
mseebach
The essence is that the people who built the unethical NSA tools most likely
did not consider the things they did unethical, so calling on people to be
ethical as a response to the NSA leaks only works if "ethical" is understood as
"you can't build surveillance tools for the NSA even if you personally believe
it's ethical".

~~~
Crito
There are roughly two sorts of people to consider here. The people who agree
that the project they are working on is unethical (or would agree if they
stopped to think about it), and the people who honestly believe that their
work is ethical.

 _Obviously_ when we tell engineers to consider the ethical implications of
the project that they are working on, we are talking to the first group
(particularly the parenthesized subgroup). The people who believe their work
is ethical won't stop working on it just because we disagree, but that
should not discourage us from encouraging people to grow a spine and not work
on projects that they have ethical objections to.

For people in the second group, we can work with alternative techniques. One
possibility is socially ostracizing and blacklisting people who continue to
work on unethical projects. (As discussed at some length on HN three or so
weeks ago:
[https://news.ycombinator.com/item?id=6714585](https://news.ycombinator.com/item?id=6714585))

~~~
mseebach
I think these debates tend to dramatically overemphasize the size of group
one, to the point of being misanthropic ("grow a spine"?). A fundamental fact
of life is that people have vastly diverging belief systems and that those in
the ivory tower don't always know what's right or best.

There's a tangential group to this, which are the people who work on something
they don't consider unethical, but may change their minds when they learn the
true scale of what they are part of (it's unlikely many people in the NSA
outside the very top have full visibility on all the programs detailed in the
Snowden leaks, and it's very possible that each, viewed in isolation and with
the right context, can be quite defensible).

The second suggestion overemphasizes just how attractive hanging out with
judgmental purists actually is. Today it's NSA, tomorrow it's people who work
on social gaming, the day after it's finance and the next it's ads. These
sectors are already shunned by (some? large? At least they're vocal.) parts of
the tech community that consider themselves and their endeavours morally
superior, yet they thrive just fine.

~~~
sentenza
Hm.

The NSA is different from all your other examples, though. Finance, social
gaming, ads and pretty much everything else is or can be regulated by the rule
of law.

Intelligence agencies cannot.

There is no precedent of a surveillance society that managed to keep from
turning authoritarian. Why should anybody assume that this time is different?

~~~
mseebach
Of course intelligence agencies can be, should be, and are regulated by law.
Sometimes they manipulate the law in their favour and sometimes they break it,
but so do the other fields, most notably finance.

But more importantly, law and ethics are not the same thing. Merely not
breaking the law does not make you an ethical person (and that's not the point
of the law). Conversely, breaking some laws under some circumstances does not
make you an unethical person.

------
TheCapn
It's written into our Code of Ethics in Canada that all Engineers are to put
the public good above that of our own personal gain or that of our employer.
Our tasks are to be selfless and to the betterment of all mankind but like any
difficult subject there's a fog of gray areas and open interpretation to be
had. Heck we wear the Iron Ring as a reminder of this sworn duty.

However, the role we claim to take on, true "selflessness", is near impossible
to reach in practice. Before we are engineers we are people and have basic
needs. Without whipping out my Googlefu I would wager there are certainly more
cases of people blowing the whistle and seeing their way of life crumble
around them than there are who get protected by law. There is massive
resistance to change from those who call the shots and if you're to go against
the grain you're looking to sacrifice your career, life, health, etc. The
engineer says "you need to say something or people will die" but the person
says "your family needs to eat".

In the end I think engineers weigh decisions of ethics in the moment. A
protocol, security appliance, or other technology isn't bad in itself; it's
how it's governed, who decides when it's harmful, and how it gets implemented
into the bigger picture that starts to cross ethical boundaries. Any engineer
sees the good in his/her work and is perhaps aware of its dark side but elects
to overlook it for the good it will provide.

Asking engineers to hold themselves to this standard is good. A self-governed
society with regulation helps everyone for the better but I feel anyone tasked
with decisions that affect the public at large should be held to this
accountability. I think I'm preaching to the choir at this point though.

~~~
mseebach
> Its written into our Code of Ethics in Canada that all Engineers are to put
> the public good above that of our own personal gain or that of our employer.

That's great. Which part of "protecting the motherland against terrorists"
would not satisfy that code? I'd bet that the vast majority of the
(necessarily quite competent) engineers that built the NSA infrastructure
believed (and possibly still do) that that is what they're doing. It would
surprise me severely if a majority of them was making a trade-off between
"family eats" and "the right thing to do".

We're not talking about people that plausibly do not have alternative
employment available. These are highly competent engineers and they could go
work for any number of companies with decidedly less evil businesses if they
felt their ethics were being compromised.

~~~
gaius
Sorry accidental down vote on phone.

------
rm999
> Engineers have, in many ways, built the modern world and helped improve the
> lives of many. Of this, we are rightfully proud. What's more, only a very
> small minority of engineers is in the business of making weapons or privacy-
> invading algorithms.

Morality is relative - by defining what is and isn't moral the article loses
the opportunity to make a bigger, more inclusive point. While some people may
think spying is wrong, others, like the Economist
([http://www.economist.com/news/leaders/21588861-america-will-...](http://www.economist.com/news/leaders/21588861-america-will-not-and-should-not-stop-spying-clearer-focus-and-better-oversight-are-needed)) have
argued that it's not a bad thing. Similarly, building weapons isn't bad (per
se); as the ancient adage goes: "if you want peace, prepare for war".

My point isn't to start an off-topic debate, it's to point out that morality
is complex and subjective, and it's up to the individual to make their own
decisions. Personally I would never work for the NSA or a defense contractor,
but I understand why some people do. I think building a strong moral code is
very important, and it's a bad idea to let other people do it for you.

~~~
pingswept
I don't know that I would agree with you that morality is relative, but I
strongly agree with your larger point that you shouldn't let other people
build your moral code for you. I also would never[1] work for the NSA or a
defense contractor.

[1] By "never," I mean that I cannot imagine a realistic scenario in which I
would accept such a job. I _can_ imagine an unrealistic scenario in which the
head of the NSA has kidnapped my child and is holding New York hostage with a
ticking time bomb in which I would accept such a job gladly.

~~~
blister
As someone who has done both of those things, the pay is great and the work is
interesting. The only thing that sucks is the work environment. SCIFs are a
terrible place to be a programmer.

~~~
paul_milligram
Are SCIFs terrible work environments for software developers specifically or
does that assessment apply to other roles in the same environment? What made
it so terrible?

------
plinkplonk
This might not be the norm here, but give me _enough_ money and challenging
work, great working conditions, and I'll work for any _legal_ company or
organization, though some organizations would have to put out a great deal
more money than others!

Given that, like most decent programmers here, I can work more or less
anywhere, given _equal_ amounts of money etc., I'd probably work for Google
rather than the CIA.

But if the job choice is between fiddling with JavaScript on some ancient
offshored CRUD codebase (been there, done that) and building robots for the
American Army or the Indian Army (I live in India), no contest at all.

In other words I won't work for the mafia or child pornographers or anything
illegal, but the CIA/NSA/whoever? In a heartbeat.

Drone targeting software? sure thing. A drone is just the weapon of the day
and not any more illegal than, say, a smoothbore musket in its day. Should all
the engineers/metallurgists have stuck to making trinkets for the nobility vs
cannons for their army? You are working for the same people in any case.

Every algorithm known to man has both good and bad uses. Not developing
algorithms because they might be harmful is insane.

On a less hyperbolic plane, I like Richard Stallman but would easily work for
Apple or Goldman Sachs (given, e.g., enough money, good work, good working
conditions, and coworkers).

The software you'd write for SpaceX isn't that far different from the software
you'd write to control an ICBM. Salarymen who think they can control what
their employers will use their software for are just deluding themselves. Do
you think the people who built Watson have any control over what
IBM will do with the tech? And this is a company that used computing to help
the Nazis. Should the engineers at IBM not have worked on computers?

Work is just work, a means to exchange your talents for money. Keep it legal,
do good work. Go home and play with the kids.

The way to stop the NSA from doing distasteful things is, in my opinion, to
work to elect better people, hold your legislators' feet to the fire, and get
good legislation passed, not to refuse to research/deploy
cryptographic/cryptanalysis algorithms.

I just wanted a little balance to this discussion. Not everyone buys into the
political correctnesses of the day. Sorry for the rant, but the article is
nonsensical, trying to guilt trip algorithm developers.

yes so "People should think about it. But I'm just an engineer, basically."

~~~
dylandrop
"Every algorithm known to man has both good and bad uses. Not developing
algorithms because they might be harmful is insane."

Yep, that missile guiding system someone built for Boeing totally has
primarily good uses in mind.

"Work is just work, a means to exchange your talents for money. Keep it legal,
do good work. Go home and play with the kids."

Dude, seriously? Everyone should have some sense of morals and question
authority. Come on, if we don't question our current way of life, how does
anything get better? Why should we be okay with people dying just so we can
have a nice, cozy life with our kids?

~~~
yummyfajitas
The primary purpose of that missile guiding system is to miss civilians.

Unguided bombs are very cheap and highly effective at taking out the target.

[http://en.wikipedia.org/wiki/Bombing_of_Dresden_in_World_War...](http://en.wikipedia.org/wiki/Bombing_of_Dresden_in_World_War_II)

~~~
makomk
These days, the primary purpose is quite likely to be taking out hardened
targets of one kind or another - unguided bombs are not terribly suitable for
that, unless you use a lot of overkill (as in, nukes against low-tech
adversaries hiding in caves overkill, most likely).

------
icegreentea
Your morality and ethics are your own. How you choose to align your beliefs
with those around you is up to you. Just be aware that non-alignment will cause
friction and conflict, and that that choice was yours.

But more seriously, regardless of your ethical views, we should be more
mindful of the effects of our work, because it's often pretty clear that those
effects aren't actually thought through. Understand what negative impacts it
may have from your frame of reference, and figure out whether they are
acceptable to you beforehand.

Case in point, say you can develop a system that will 100% secure the
communication and organization of resistance groups in countries with
oppressive governments. There is no guarantee that the end result of that is
something you want. Maybe it'll lead to genocide of the once ruling tribe once
the government is overthrown. Maybe it gets replaced with something worse. And
certainly, "bad people" at home will get their hands on it. Can you deal with
that? Gotta make up your mind. No action takes place in a vacuum.

~~~
enraged_camel
>>Your morality and ethics are your own.

No. Only morality is defined by the individual. Ethics are defined by groups
and societies.

[http://www.diffen.com/difference/Ethics_vs_Morals](http://www.diffen.com/difference/Ethics_vs_Morals)

~~~
dragonwriter
The distinction you (and the site you link) propose between "morals" and
"ethics" is _far_ from universally used, even in the domain of philosophy
which concerns morals/ethics.

Notably, it's not even _consistent_ with the definitions given in the
"references" provided on the linked site, which seems to adopt the principle
that if you assert something boldly and provide references to "support" it,
it doesn't matter whether the references _actually_ support it; people will
just assume your description is authoritative because you gave references.

------
Theodores
Quite a few of my university friends went off to join the military-industrial-
circus and I saw their rationale for doing so change over time. Initially
something like having learned the Ada programming language coupled with a poor
degree result meant that 'defence' was the only option open to them unless
they wanted to stack shelves at a supermarket. Then the family came along,
with the mortgage and the responsibilities of being a mature adult. The
rationale shifted, working for the military machine was now just a means to an
end, for bread on the table, where there are bills to pay. This is what
happens.

~~~
gjm11
I don't understand what change you're describing.

Initially they went to work in the defence industry because it was their only
option unless they wanted to stack shelves.

Later they continued working in the defence industry because it was their only
option to put bread on the table and pay the mortgage.

Those both look to me like "I'm doing it because it's the only option I have
that uses my skills to do something that pays decently".

What I'd find more interesting would be if people tended to shift (1) from "I
do this because I think it is a noble way to serve my country" to "I do this
even though it's horrible because the alternative is poverty" or (2) from "I
do this because the alternative is poverty" to "actually it turns out that
this is a noble way to serve my country".

------
anonee
Posting this anonymously for obvious reasons.

I've worked in the defence sector with embedded systems.

Some engineers do work ethically within unethical companies, particularly
defence companies. There are lots of people working out how best to fuck up
the delivery and implementation of the latest and greatest killing machines
and surveillance technologies. Skilled engineers can also skillfully make
total lemons. Some high profile projects that failed were pretty much
sabotaged by the staff on ethical grounds.

This is not discussed outside of the organisations, nor is it discussed inside
the organisations in detail, but if something is designed to wage war, it might
not get there in the end.

------
sigil
"Do you regret building the internet because of surveillance?" is the new "Do
you regret discovering fission because of the bomb?"

Now, as a technologist I'd never directly contribute to a project whose intent
was to kill or surveil people. (I turned down that cushy DoD contract job
straight out of college.)

But this whole "as engineers we need to consider the ethical implications"
argument is deeply flawed. First, it assumes we can predict how the technology
we build will be used. Second, even if we could predict all the uses, should
we refrain from building something that has some good uses, just because it
has some bad ones? Third...what about the people _actually using the thing for
evil_?

We might as well drop the "as engineers" part and just have a discussion about
not doing evil things in general.

~~~
dllthomas
Obviously we are limited by our predictive capacity, but I don't think "there
might be bad _and_ good uses" is really a strong argument. Like anything else,
you weigh the good against the bad in making your decision.

~~~
sigil
That's the thing though. How _does_ one weigh the good against the bad in
these cases?

If you're building stuff at the application layer, maybe the use is obvious,
but if you're writing a library or a service, how can you know how it will be
used? Should you expend time enumerating and assigning probabilistic weights
to all the good and evil that could come from it?

Far simpler proscription: as a human using a tool, don't do evil with the
tool.
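For what it's worth, here is what the probabilistic weighing mocked above would even look like in its simplest form, as a toy calculation. All the probabilities and moral weights below are invented; the point is only the mechanics, not any claim about a real tool:

```python
# Toy expected-value weighing of a hypothetical tool's uses.
# Every number here is made up for illustration.

uses = [
    # (probability the tool is put to this use, moral weight of that use)
    (0.70, +1.0),   # routine productive use
    (0.25, +0.5),   # minor convenience
    (0.05, -3.0),   # serious misuse
]

# Sum of probability-weighted moral values across all anticipated uses.
expected_value = sum(p * w for p, w in uses)
print(f"expected moral value: {expected_value:+.3f}")
```

Of course, as the comment suggests, the hard part is not the arithmetic but producing defensible probabilities and weights in the first place.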

~~~
dllthomas
You do your best, along whatever axes are situationally appropriate.

For what it's worth, a sufficiently generic tool I think tends to balance
toward morally positive, because there is more intent to do good than intent
to do harm out there. But of course, helping to grow that disparity is still
important, which is why you should be looking to see if there are ways in which
your tool radically, disproportionately facilitates harm.

------
bredman
> When doctors or nurses use their knowledge of anatomy in order to torture or
> conduct medical experiments on helpless subjects, we are rightly outraged.
> Why doesn't society seem to apply the same standards to engineers?

I think the difference we draw between the two is that an engineer builds
things that can be used for many purposes (defense or offense for example) and
we as a society generally agree with one of those two uses. When a doctor
abuses their knowledge to cause pain they are the ones making a choice we
disagree with.

Anyway, yes, engineers should think about how their work can (and will) be
used, but I think it's disingenuous to compare them to patient-torturing
doctors.

------
john_b
I'm going to play devil's advocate here.

Assuming that an engineer is aware of the eventual applications for his or her
work and is ethically mature enough to recognize that some of those
applications are evil, destructive, or violate the rights of others, it
doesn't necessarily follow that this engineer should refuse to do the work.
For one, just recognizing that a technology has harmful applications--even
_only_ harmful applications--and refusing to build it doesn't mean that the
technology will not come about anyway. If you find the applications harmful
and refuse to work on it, others may--and likely will--step in and build it
instead. So there is a game-theory issue that the article ignores. If many
people can profit (financially, professionally, or by learning) from building
technology that eventually becomes harmful, then a sufficient number of those
people need to refuse to work on the technology in order for it not to be
developed. But every refusal shrinks the supply of willing labor, increasing
its scarcity and therefore its price; the rewards for developing the harmful
technology rise, which creates an ever more powerful incentive for engineers
to "defect" (in the prisoner's dilemma sense) and work on the harmful tech.
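The incentive trap described above can be sketched as a toy model. To be clear, the payoff formula and numbers here are illustrative assumptions, not anything from the article: I'm just assuming the wage for the remaining willing engineers rises in inverse proportion to how many refuse.

```python
# Toy model of the refusal dilemma: the more engineers boycott a harmful
# project, the scarcer willing labor becomes, and the more the remaining
# "defectors" get paid. The formula and base wage are made-up assumptions.

def defector_payoff(refusal_rate: float, base_wage: float = 100.0) -> float:
    """Wage offered to an engineer willing to build the harmful tech,
    with a scarcity premium that grows as the refusal rate rises."""
    if not 0.0 <= refusal_rate < 1.0:
        raise ValueError("refusal_rate must be in [0, 1)")
    return base_wage / (1.0 - refusal_rate)

if __name__ == "__main__":
    for rate in (0.0, 0.5, 0.9):
        print(f"refusal rate {rate:.0%}: "
              f"defector wage {defector_payoff(rate):.0f}")
```

Under this (assumed) payoff structure the boycott is self-undermining: each additional refusal raises the reward for breaking ranks, which is exactly the prisoner's-dilemma pressure described above.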

There are also objectively good benefits to doing work that is eventually
harmful. If you have a family, you can feed and clothe them. You can learn
skills that will help with non-harmful work in the future, and you will
meet people who may help you do non-harmful work later. Being intimately
familiar with this technology, you will be able to warn others of the dangers
of it. Whether these objective goods outweigh the potential bads of a
technology is difficult to determine during its development. And where the
balance is difficult to determine, engineers, being practical and conservative
by nature, will tend to side with the tangible benefits they can get today
over the intangible costs that others may incur in the future.

I agree with the article's point that engineers need to develop a broader
societal focus[1] and be mindful of the potential uses, especially the
unintended ones, of their work. The problem is that awareness alone isn't
going to accomplish much because a world full of aware engineers doesn't
change the existing incentives.

[1] A good book on this topic is "The Civilized Engineer" by Samuel Florman:
[http://www.amazon.com/Civilized-Engineer-Samuel-C-
Florman/dp...](http://www.amazon.com/Civilized-Engineer-Samuel-C-
Florman/dp/0312025599/)

~~~
JoeAltmaier
Moral behavior is not a 'game theory' concept. It is what you do even when you
don't 'win'. At its heart, moral behavior is NOT doing whatever it takes to
benefit yourself, which is the sum of game theory.

So yes, if your project has only harmful applications then you shouldn't work
on it. Even if someone else most assuredly will (I'm guessing that could be
you). Because it's wrong, and hypocritical.

~~~
john_b
If you look at these things at an individual level, individual morality is a
useful lens. But this topic is really about a class of professionals within a
whole society. Influencing behavior through moralizing unfortunately worked
much better in the ancient tribal environment in which it evolved, where shame
and ostracism could be used to punish wayward members of the tribe.

The reality of large societies is that a certain percentage of people will do
immoral things even when they know those things are immoral, because they can
get away with it or even profit from it. And they will do this _especially_
when the rewards for immoral behavior increase asymptotically due to supply
constraints.

Using morals to provide solutions only gets you as far as "X should Y". But
prescribing "shoulds" doesn't solve anything. Changing incentives does. How to
do that is the real question.

~~~
JoeAltmaier
Unfortunately, each of us can only really change our own behavior, so
individual morals are all there is. Given that, you can 'go with the flow' and
become a person who does immoral things, or you can hold the line and perhaps
be an example. And keep your self-respect.

------
crazygringo
> _Engineers are behind government spying tools and military weapons. We
> should be conscious of how our designs are used_

Except that it's not always clear whether something is ethically good or bad.
If you're working on a system for tracking cell phone calls, it can be used to
protect the public against terrorism (good), to uncover mob or child-porn
networks (good), for targeted blackmail (bad), or for a surveillance state
(bad) -- and the engineer working on it not only has no guarantees about what
it is being _planned_ for, but also has no crystal ball to know how its use
might change in the future.

And different peoples' ethics are different. So while saying we should
consider the ethical implications of our work _sounds_ nice... it sounds
awfully simplistic.

------
ivanca
Bullshit; it would be like pretending that tires shouldn't have been created
because someday those could be used on war jeeps.

It's not possible to predict the uses of one's work, and therefore it is
impossible to comprehend its ethical implications. And if you add capitalism
to the equation, someone else is going to create the technology even if you
don't, as long as there is a direct (or even indirect) profit in it,
regardless of ethics.

So the "problem" is not a burden on individuals of certain professions (e.g.
engineers) but on basic social structures.

~~~
shabble
> _It 's not possible to predict the uses of one's work_

Probably not in all cases, but you can often make an educated guess.

You could at least consider classes of activity, ranging from:

Primary: the sole purpose is to cause direct harm. You'd have to be dumb or
willfully ignorant not to realise. "This BabyFaceMelter technology will be so
amazingly terrible nobody will ever want to fight us again!"

High likelihood of indirect harm - nominally for other purposes, but trivial
to see simple ways in which it can be weaponised/made harmful. "Our Public
Order Droid platforms are all equipped only with non-lethal electro-tickle
cannons, and it'd be _really hard_ to put actual bullets in there"

Moderate likelihood of indirect harm. Fairly compelling "good" uses, but still
possible to think of ways it could go bad. Less toxic or higher power
explosives intended for mining, perhaps.

Too generic to really know - finally, the weakest level that you present,
something like better off-road vehicle tyres. Quite a lot of fundamental
science (ie: not the "we hypothesise that X will make the Anthrax _really
angry_ " sort) has such wide application that you can't really guess. Ditto
basic algorithms in maths & CS, or software libraries (is OpenSSL evil?)

I don't totally disagree with you, but you might want to try painting with a
slightly smaller brush to make your argument more compelling.

------
dreamdu5t
Americans think working for the NSA is ethical. Everyone I've met who has
worked for the NSA or their contractors believes in the surveillance state.

Even the Guardian is not clear on this. Does the Guardian support the
existence of the NSA, or not? You can't have your cake and eat it too.

~~~
pingswept
_Some_ Americans think working for the NSA is ethical. I'm an American
engineer, and I would not work for them, given their recent history.

I would be surprised if you found an American working for the NSA who didn't
do so willingly -- people in that position can easily find other jobs, so
those who stay are there by choice.

------
ctdonath
For my B.S.C.E., we were required to take an ethics course. By the end,
everyone was convinced being ethical was a fast track to job loss &
blacklisting.

~~~
RodericDay
Same. I still don't know whether the university was deliberately trying to
teach the kids to be obedient by shoving Boisjoly down their throats, or
whether it was merely a side effect.

------
jpk
This reminds me of Mike Monteiro's talk, "How Designers Destroyed the World".
[http://vimeo.com/68470326](http://vimeo.com/68470326)

He was talking about web/industrial/etc designers, and this is talking about
engineers. It doesn't matter what field you're in, the moral of the story is:
Be cognizant of how your work affects the world, and be prepared to say "no"
to requirements that will likely affect others negatively.

------
tehwalrus
I don't think anyone actively works to do evil things in engineering. However,
it is possible/easy to end up making something that _is_ used for evil by
someone else later - and justifying making weapons through notions of
"defense" has been commonplace for years (although I don't agree with the
justifications, generally).

I've never quite been happy with open source licenses for this reason - and
I've considered writing a pacifists' OSS license with a clause along the lines
of:

"No license is granted for any purpose of deliberate harm or death to humans
[and other intelligent species]; whether in a deliberate weapon or implement
of torture, or whether in a targeting or guidance system for any such device.
Missile 'shields', designed and usable only for blocking incoming weapons, are
excluded from this category."

You could hypothetically write additional clauses for other objectionable
things, pornography and fossil fuel drilling/mining come to mind.

It wouldn't stop people reimplementing your code from a specification/summary,
as many companies do with OSS that they need to use but can't re-release, but
it does make weapons and the like more expensive to build (which will
decrease supply, assuming your code was ever in contention for use in such a
system).

------
pavanky
The problem with statements like this is: whose ethics should one be held
to?

The ethics of the majority population are usually more misses than hits. The
ethics of one's peers will perhaps be better, but will still suffer from
different kinds of prejudices (for example, discovery above all else among
scientists).

This leaves the ethics of self which most people should and will follow. The
only thing anyone else can do is make sure people are held accountable for the
consequences of their actions.

------
pointernil
.) German speaking engineers can validate/reevaluate their "position" by
checking out:

[http://de.wikipedia.org/wiki/Gesinnungsethik](http://de.wikipedia.org/wiki/Gesinnungsethik) (the "ethics of conviction")

[http://de.wikipedia.org/wiki/Verantwortungsethik](http://de.wikipedia.org/wiki/Verantwortungsethik) (the "ethics of responsibility")

It's strange that those Wikipedia topics don't have English translations
(yet).

.) It's certainly interesting to see how engineers who choose to follow
"Gesinnungsethik" justify their "choice"... interesting in the sense that they
do so no differently at all from any other profession. I think this is the
result of the last decade(s) or so of overdone and misunderstood
"individualism" culture.

.) IF software eats the world - and it certainly looks like it right now - we
had better make the human beings who wield and create these very powerful
tools more sensitive to the responsibilities and consequences of their work.

.) I'm sure true treasures can be found on this topic in similar discussions
that took place in the non-software engineering community decades ago, and I
suspect the statements and positions taken there are applicable in the
software world.

------
kevinchen
I take issue with the photo caption, which is repeated in the article:

> Engineering ethics are mostly technical: how to design properly, how to not
> cut corners, and how to serve our clients well.

The first thing in the ACM code of ethics talks about protecting "fundamental
human rights." [http://www.acm.org/about/code-of-
ethics](http://www.acm.org/about/code-of-ethics)

Same deal with the IEEE: the first item talks about the "welfare of the
public."
[http://www.ieee.org/about/corporate/governance/p7-8.html](http://www.ieee.org/about/corporate/governance/p7-8.html)

The issue is not that engineering ethics is too technical. It stems from the
fact that everybody defines what is ethical for themselves -- there are few
universal opinions on ethics, just as with political opinions.

------
whiddershins
I could write music for commercials, it pays very well. I choose not to on
ethical grounds. If it were my only choice to feed my family, I would change
my mind. But it is hard to truly terrorize people with music, although I've
been accused of it. ;)

Should personal ethics factor in to what we do as work? Absolutely. More than
most ethical decisions, what we do as work can profoundly affect the world
around us. Are these distinctions easy and clear? No. Is that a reason to stop
thinking about it and just do whatever is most fun and pays best? In my
opinion, no. This is one of the greatest questions we all face, how our daily
actions affect others, and I no longer expect quick and definitive answers.

But it means a lot just to keep looking.

------
theg2
Your profession doesn't define your ethics, you should define your own. Anyone
telling you that you have to be ethical because of your profession is trying
to coerce you to their way of thinking.

------
ahomescu1
I find it deeply disturbing to blame engineers for something they had no part
in deciding. If a politician/general/manager decides to use some technology
for "evil" purposes (for whatever definition of evil), the responsibility
should fall primarily on whoever makes that decision.

Engineers who are unhappy with how their work is used are free to quit their
jobs and work on something else (in most cases). However, since they get
little say in how their work is used, they should also get little
responsibility over it.

------
toblender
In Ontario, and much of Canada, to call yourself an Engineer you must pass an
exam and meet the requirements. Part of this exam is about ethics. In Canada,
Engineers have an iron ring to remind them of their duty to society.

The issue is that the terms "duty" and "public" can get tricky.

How far does duty go? Does it include compromising morals?

Does the "public" include the enemy that we would deploy the technology
against?

There is no answer that is without some shortcoming.

------
rjknight
I think this is a good point, but it runs into particular problems when
dealing with software, especially open source software, where there is no
direct relationship between the person _creating_ the software and the person
_using_ it - in fact, it's not automatically true that the person creating the
software knows anything at all about who is using it.

~~~
lowmagnet
We can only make certain that our individual contributions are ethical,
correct, and secure. It helps a little to assuage those particular fears, I
think.

------
0xdeadbeefbabe
People will use good software, and they'll use it for good and evil. Think of
how nginx is used. Software isn't like an article for theguardian.com, which
can take an ethical position.

Paralyzing engineers by demanding they consider social complexities seems like
a bad strategy for getting things done.

------
laveur
Am I the only one that is disappointed that there was no image from the Movie
"Real Genius"?

------
logfromblammo
Do not call upon engineers to reject unethical projects. There is always the
danger that they will redesign the society's ethics instead of the work!

------
crazy1van
Shouldn't everyone consider the ethical implications of their work? I don't
see anything unique to engineers in this regard.

------
lettergram
"I don't intend to build in order to have clients; I intend to have clients in
order to build."

------
leeoniya
It would be a strange feeling indeed, working on surveillance code that could
(and almost inevitably will) be deployed against me.

~~~
angersock
Lucky for you just today Grouper has decided to give you that opportunity:

[https://news.ycombinator.com/item?id=6855114](https://news.ycombinator.com/item?id=6855114)

