They demonstrated the willingness to push the company's technology into heinous, heinous territory. The kind of thing where a drone would be able to follow a single person in a crowd, and target them for execution - unguided, of course.
I quit the next day. Those of us who make technology need to be very sure it is not used destructively against the human species. The responsibility is very, very high, and the danger is extreme. These people were revelling in the fact that they could develop targeted assassination drones and sell them to any country in the world.
After about two months of work, once the production line had been parallelized and its speed drastically increased, I went to see it in operation.
What I saw shocked me, and I immediately quit. It was a production line for handling young and grown chicks, female and male. Debeaking, throat slitting. I was absolutely shocked that none of my superiors had told me exactly what product was being handled.
After seeing the horrific product of my work I quit.
Since then, I'm not surprised, given what horrors we do to living animals, that we are ready to do them to each other.
I doubted the meaning of my work at university. What had I done? Spent four years at college to create killing machines? I never thought I'd do that.
I sometimes think this is why we don’t see any intelligent civilizations out there. Intelligence gives rise to deceitfulness and eventually, one selfish actor can bring down an entire civilization intentionally or unintentionally since the weapons get so powerful.
It always amazes me that those big entities exist, because they require such a huge pool of highly educated and skilled people. What are all those geniuses at the NSA thinking? I can't imagine somebody smart enough to work there is not smart enough to understand the consequences of working there. So why are they not quitting?
Social pressure and money are part of the equation, certainly. I remember when I turned down a Google interview, my close circle thought it was weird that I did that, even more so for ethical reasons.
Having the best toys, budgets and projects certainly helps as well.
But still, I wonder.
Maybe they don't have a problem with working on such projects, because they agree with their end goal?
I learned very young that there are differences. I literally (NOT figuratively at all) couldn't see the motivation behind destroying school equipment, drawing on walls, etc., while for some, it's fun.
But even after all those years, it's just so hard to wrap my head around that.
I think the answer might be that they do have the same values, but they have a different model of the world overall, which calls for different ways of achieving those values.
To take this to an extreme, imagine a person fighting for their life against actual criminals threatening them, versus a person fighting for their life against a random person they think is a shape-shifting alien impostor. Both are internally justified by the same values; only one is externally justified by a more correct perception of reality.
I said exactly that: I can't understand the line of thinking that leads to the decision. That's pretty much all.
And it's a very conflicting view of the world too. Do you see us as enemies as well?
In which case, you win this argument. I don't want my tribe to fight yours.
As for myself being tribal - it's very hard to observe tribalism in oneself, so while I am not seeing it, I could very well be tribal.
You'd be baffled, wouldn't you?
It's not a matter of morality, of good or bad, or right or wrong.
I understand the difference exists. It just feels so alien to me that I can't manage to form the mental model that would help me understand what thinking led to this behavior.
Also, read the other comments, we answered your points already.
Maybe I'm not reading this correctly, but are you saying that if anyone holds an opinion different from yours, your reaction is the same as if they thought "eating babies" was OK?
Things aren't that black and white. People can hold opinions different from yours, and you can withhold judgement.
See, again you add morality into the equation, talking about black and white. But morality is arbitrary. You can draw the line anywhere, one inch to the left or the right, and so on, indefinitely.
So that's not what I said.
What I said was "I can't understand".
Not "it should not be" or "it's impossible" or "they are bad people".
Apparently, you have difficulty understanding that somebody can't understand.
“There are hardly any excesses of the most crazed psychopath that cannot easily be duplicated by a normal kindly family man who just comes in to work every day and has a job to do.”
— Terry Pratchett, Small Gods
Frankly, the broad majority of people in the US benefit materially from their existence in ways they might not even be fully aware of.
I am not arguing this is a good thing, but I think it's a reality. Military power abroad means wealth accumulation at home.
Now if they use it to kill good guys, then it's a problem that needs to be solved.
I would have thought most people are somewhere in between.
Is an enemy combatant with a weapon a bad guy? How about an unarmed enemy combatant? How about an enemy combatant with just a knife, or a rudimentary club? How about an armed civilian who might be an enemy combatant?
Is a bank robber a bad guy? If so, do they deserve to die? How about a white-collar fraudster? How about someone running from the police? How about a mentally ill person running from the police?
This stuff is never clear cut. It's why we have courts and military tribunals. If you can't think beyond 'good guys' and 'bad guys', I don't respect your opinion.
Pay attention to the facts behind the curtain.
My government did enough stupid things for me to not want them to have sophisticated military toys.
Debatable, but regardless - do you think the reason the US government has not killed Snowden is a lack of "sophisticated military toys"?
Also, my government did enough stupid things (like killing innocent people in war zones by mistake) for me to want them to have more intelligent weapons.
It's a hard problem, and most glib solutions are, well, just that: glib. Centralized agreements become increasingly difficult as the number of entities increases, for instance, even before we account for scenarios like this one, where the reward for defection increases proportionally to the number of other participants in the disarmament.
This is also ignoring those cases where there isn't even disagreement; in the real world, for instance, while you can quibble about the exact lines, it seems to be the case that the Chinese accept and approve of levels of "invasive government" (to use a Western spin on the idea; I don't know what they would call it exactly) that Magna Carta-descended countries would consider abhorrent, making coordination even harder. (Meanwhile, they consider our lack of coordination, or what have you, if not "abhorrent", at the very least "sub-optimal", and possibly dangerously socially negligent. As I'm using English here and, like I said, I don't know what they'd call it exactly, I can't help a bit of a Western spin here, but I acknowledge the flip side.)
No argument here, having a strong bully in the room is good for you when he is on your side.
Because not everybody thinks the way you do. That's why we have elections.
I'm sure disclosure would be illegal per your employment contract, but are there any other steps concerned employees can take?
This is never true. No government will allow its companies (and in this context that's how it's seen) to sell weapons to countries they don't approve of. That said, apparently the defense industry is often irresponsible at best.
What the poster meant was "just about any country" in the world. Which we've seen happen time and time again - the restrictions may be nominally in place (for a while), but eventually they get selectively lifted, and/or the companies find a way to circumvent them.
At times (and not at all surprisingly) with the assistance of certain government agencies chartered with the purpose of not only enacting precisely this kind of subterfuge - but perfecting it as an art.
The biggest problem is so-called "dual use" chemicals and compounds where there are legitimate military and non-military uses.
You can build extremely dangerous explosives from chemicals found on the shelf at a drug store, but that doesn't mean the store is in the business of selling explosives.
I'm much more inclined to believe it's the latter.
Sure, you can argue that it's different because those are things the common man can get, whereas only the state can afford surveillance dragnets and drones, but it wasn't that long ago that only the state could afford computers.
Edit: Apparently I struck a nerve.
Technology transfers between military and civilian applications all the time. Propeller technology that helped now-obsolete submarines stay quiet is fine-tuned in a different manner to yield more environmentally friendly watercraft. A drone that can disperse insecticides on only the crops that need it can deliver chemical weapons with slightly different fine tuning.
A 1984-style (or 2018 UK, if you like) surveillance and law enforcement system could be used to track down corruption in government, suppress dissidents, identify insider trading, identify human trafficking, etc. It all depends on who's using it. (I personally don't trust any government to properly wield that kind of power.)
The technology doesn't care. It's all in how you use it.
Why not just send them in with their fists?
This line of reasoning has no rational basis.
If we'd gone into Baghdad WWII style, carpet bombing and all that, we'd probably still be fighting a significant war in Iraq. Or the country would be depopulated. Presumably the appearance of ISIS was largely a reaction to western actions in the region, and its initial strength was proportional to the outrage that could be drummed up. Sure, there are probably asinine military commanders that don't account for this, but it isn't the universal rule.
And war hammers have a military context going back centuries, and DIY versions were used in trench warfare.
Like it or not the world is full of extremists who would like nothing more than to hurt innocent people. There is no “oh just send the cops and arrest them!” route to take.
Shit, just look at the time Osama bin Laden could have been bombed with a Tomahawk missile during Clinton's presidency. Clinton didn't do it because of the potential to kill a Saudi prince bin Laden was meeting with at the time.
Would those angry Googlers be against surgically killing Osama? I think not.
Better drone software might help track a potential target and present the optimal window in which the target could be shot with reduced civilian casualties. It could also provide better intel to enable a surgical ground strike, which would put more American soldiers at risk but would allow for better intel and, again, fewer civilian deaths.
Lastly, it could offer new knowledge and experience in tracking humans with drones during humanitarian disasters. It could also help in tracking victims of kidnapping. Are the Googlers opposed to rescuing the hundreds and thousands of people kidnapped by Boko Haram and company?
Who is going to go into the African heart of darkness to rescue those people? Is it the armchair Googlers who pretend to know better?
It would reduce collateral casualties per target attacked, which would make the drones easier to use with looser target selection criteria, which might both increase the number of targets attacked and increase the number and ratio of incorrect-target-selection casualties.
The law of unintended consequences is most likely to sneak up and bite you when you only bother to consider first-order effects.
This is just throwing in an unfounded qualitative thought, not an actual empirical argument against precision weapons.
If you can definitively tell me how many civilian casualties there were in the Iraq war, we could discuss whether this is true.
"Credible estimates of Iraq War casualties range from 150,000 to 460,000. Other disputed estimates, such as the 2006 Lancet study, and the 2007 Opinion Research Business survey, put the numbers as high as 650,000 and 1.2 million respectively, while body counts, which likely underestimate mortality put the numbers as low as 110,000." 
Having a range from 150,000 to 1.2 million makes it kind of hard to discuss, don't you think?
But I think it's also true that carrying out military operations (even precise ones) in unstable parts of the world helps violent extremists gain support among the broader population and does nothing to help alleviate the instability that gives rise to these extremists in the first place.
When I consider how few deaths there actually are from extremist groups operating in Western countries it makes me wonder if the scale of our response is really appropriate to the severity of the issue, and whether our actions aren't helping to perpetuate the very issues they're intended to address.
I think we should have gone down there after 9/11, punished (killed) everyone who was remotely affiliated with the terrorists, and left as soon as we had done that. But that is a political action, not a military one. Our military budget would be far smaller if all the random bases across the globe were closed down and we brought the troops back home. If anyone wants to mess with the sleeping giant, they can quickly pull up records on what happened to Japan and the Axis back in the 1940s.
The article here is talking about developing military applications for better drones. The potential gains to be had from this go beyond better strike capabilities. Those who don't see the potential humanitarian and other activities that might benefit are being short-sighted.
I think it’s all fuckery: warped self-interest from within the government as well. One can grow fat and happy with the billions that flow through defense.
But again, I believe that better tooling will have unforeseen positive and negative consequences. But I think it’s going to weigh more in the positives.
Can you guarantee our government won't initiate illegal aggression?
Won't subvert democracies?
Won't have another Gulf of Tonkin?
Won't target the families of terrorists, as our current President has suggested?
No, you can't guarantee those things.
Guns don't kill people, people do. And people are sometimes evil, and sometimes break the law, and sometimes make mistakes. And sometimes the gun is stolen. So maybe some engineers don't want their company to make any guns.
You don't get to ask, "Would those angry Googlers be against the technology always being used the way they intended?"
Instead you have to ask, "Is it possible for this technology to be used in ways those Googlers would object to?" And of course the answer is yes.
Many on the Manhattan project thought that bombing Nagasaki was completely unnecessary. Some probably thought Hiroshima was unnecessary, that a demonstration of the power would be sufficient.
Barring catastrophe, it’s close to 100% inevitable that militaries will become largely autonomous in the coming decades.
Since I have serious reservations about the track record of the United States, I have to wonder if the people I consider the worst are more likely to establish dominance faster than some other country that I like better.
Maybe I should be working to establish the dominance of the country I think is some combination of the best, and the most likely to win that race (if I and my peers were to help them.)
Or, I could acknowledge that the United States is the most likely to establish dominance, and I should be working my ass off to ensure the US will be the best version of itself that it could be.
But I have never seen any good guys in my history books or in the news.
Hence I always assume, when giving power to somebody, that the person doesn't have my best interest in mind.
Let's all remember that it's possible for any of our countries to one day become a dictatorship. Just because we have enjoyed a lot of freedom over the last decades doesn't exempt us from acting like we could lose it at any moment. Because we definitely can.
More pragmatically, with powerful AI, giant communications nets, huge databases of everything and everybody, cameras with facial detection, and wiretapping everywhere, do you really want to add drones to the collection of tools the powers that be can use?
Look at MRI imaging. It is a downstream invention that came from the development of nuclear weapons (nuclear magnetic resonance). How many lives do you think it has saved and improved in the past 70 years?
Now, the problem is never the tech, as usual. It's that the society we are living in is not constructed in a way that can prevent the tech from being abused.
We are talking about a country that attacked Iraq while lying about the WMDs, against the vote of the majority of the world, killing countless people for no proven result and leaving a country still in ruins decades later.
I don't really trust the governments we have.
The resistance team sank a ship with 18 civilian casualties; also, the previous unsuccessful air raids killed more than that.
You wouldn't characterize the Allies in WW2 as good guys? No better or worse than Germany or Japan?
They fear that this current administration and future administrations may resort to, as Secretary Clinton put it, “droning” people we simply disagree with rather than actual military adversaries.
The main question is effectiveness of the system, given some baselines.
Other people don't ask if the ends justify the means.
Other people recommend first-strike nuclear attacks.
"Full of"? The world is more peaceful than it's ever been. Extremists do hurt innocent people, and we should not ignore them. But with each choice that we need to make, we should carefully consider pros and cons. Is there really a net benefit here?
I agree with you that the world is much safer and that there have been fewer deaths from military conflict post-WWII. I think that is largely due to the massive military power of a largely benevolent country like the U.S. I think that if you magically removed the U.S. entirely from the picture, other nations would be thrown into conflict to be "top dog".
> Pinker presents a large amount of data (and statistical analysis thereof) that, he argues, demonstrate that violence has been in decline over millennia and that the present is probably the most peaceful time in the history of the human species. The decline in violence, he argues, is enormous in magnitude, visible on both long and short time scales, and found in many domains, including military conflict, homicide, genocide, torture, criminal justice, and treatment of children, homosexuals, animals and racial and ethnic minorities. He stresses that "The decline, to be sure, has not been smooth; it has not brought violence down to zero; and it is not guaranteed to continue."
They most likely don't want to work for a company involved in any killing.
I can't find the quote, but I was reading something about the troubles with the IRA, and the response, that really stuck with me. The author said something like, "When you use lethal force against terrorists it lets them feel that it's fair to use it against you."
We won't defeat violence by violence.
I think we need to challenge ourselves to become less bloodthirsty as we become more technologically capable. (If only to set a good example for our progeny... https://en.wikipedia.org/wiki/I_Have_No_Mouth,_and_I_Must_Sc...)
Once you make the mental flip it becomes really easy to think up defense systems that work without causing any casualties or deaths at all, not our guys, nor civilians, nor the enemy. And if we just can't live with that, we can always kill them later: https://en.wikipedia.org/wiki/Saddam_Hussein#Execution
I can think of a few examples that prove that statement false. WW2?
Here's an article on that very thing. It didn't mention anything about a Saudi Prince though.
But your points are valid. Like anything, it could be used for non-objectionable purposes. How morally objectionable do you think the use of this technology would be as a whole, given the country using it, etc.? Do you think it would help the US and its allies become more or less authoritarian? How long do you think it would be before it is given or sold to local police departments like other military equipment?
As for your other points, I think they are all valid. In this country we still have the press to bring up rampant abuses, and other branches of the government to keep the executive in check. I suppose as long as the other branches are willing to keep each other in check, it's better for everyone, including US citizens.
You also have to remember that the people in the military aren't robots (not yet at least). They are regular citizens joining a volunteer army for a multitude of reasons, many of which are to serve honorably, and of course the steady paycheck and college benefits. Most of the military does a 4-year stint and then they go back to being civilians like you and me.
>You also have to remember that the people in the military aren't robots (not yet at least).
I'm not sure what the point of this comment is, so excuse me if I missed it, but I would think the fact that we have real citizens in the military would be a barrier to atrocities. Putting a drone hive/squadron in the hands of just a few people seems to make atrocity more likely.
Guess how it's been used? It's been used to "target" everyone. Why? Because it's gotten cheap and easy enough to use on many more people at once - just how automated drone strikes will be soon.
I think you're naive if you think this will "improve" war conditions. Here's one story that may bring you back to reality, and about how these automated drone strikes are more likely to be used in the future:
Do you think this guy was a victim of "inaccurate" drone attacks and "collateral damage"?
And some of them are parts of governments.
What you seem to be saying, and what we hear all the time, is that spending money and doing work that increases the power and capabilities of the US military will make us all better off.
You also seem to be saying that Osama bin Laden is an extremist.
But in the 1970s, as Afghanistan was working towards becoming a more secular society, the US military and intelligence agencies were arming Osama bin Laden and his fellow jihadis, the proto-Taliban and proto-Al Qaeda, who wanted, among other things, for the secularization of Afghanistan to stop, and for an Islamic-dominated government to come in. An effort in which the US succeeded, along with their partner Osama bin Laden.
This being the case, I am not exactly sure when bin Laden and people like him became extremists. I suppose it was after the US began its military occupation of Saudi Arabia. Osama bin Laden opposed the US military occupation of his country.
This may sound equivocal about bin Laden, but the US is more equivocal about bin Laden. I think he never should have been armed by the US. People of a like mind said as much then. Others disagreed.
In other words, if the US is making political errors (or is not making errors and is pursuing negative goals), more power and capability to carry out those erroneous policies will not help matters.
For example, Trump just escalated the conflict in the Middle East this week, which only satisfies religious fundamentalists. Handing him more power to do so will not help things, it will just mean more 9/11s in response to the blow he just landed against Arabs/Muslims.
I am not sure why some people instantly assume that the whole purpose of making drones more autonomous is to make them more precise: historically, the DoD/US military have made virtually zero effort to even try to reduce civilian "casualties".
Maybe I am too cynical, but I genuinely think that the military is only willing to invest in technology that would help it expand current and future operations, disregarding the impact these will have on the civilian population of foreign territories... probably because it's orders of magnitude cheaper to just pay someone to write a public statement denying every statistic published by neutral NGOs around the world.
They are also told by a lawyer during those briefs essentially "if you break any of the Geneva conventions or outright any of the things we just told you we will swiftly punish you".
That's not even close to true. On any level. You are confusing the inevitable willingness to tolerate civilian casualties with an outright disregard. Civilian casualties have consequences, and they are avoided. Not strictly, but it is a cynical fantasy to imagine that there is a complete disregard for civilians.
I totally get the objection to developing combative AI - that’s a separate ethical question - but you can contribute to the military and still maintain your humane values.
I say fuck that.
It's hard to argue how effective this tactic is, being that:
a. most everything relating to this is classified
b. it's very difficult to assess how many terror operations were prevented by those actions, even if you have the classified data above.
An amazing book on this subject of state-sponsored assassination, which I advise anyone to read, is Rise And Kill First by Ronen Bergman, detailing Israel's assassination policy from operational, political, and societal perspectives - truly fascinating.
Pakistan is a country with which no formal military conflict with the US exists, but it is not one in which no formal military conflict involving the US exists.
The 2001 AUMF is a (ludicrously open ended) conditional exercise of Congress' Constitutional power to declare war, and the parties targeted in Pakistan are parts of groups to which the executive branch has determined that the conditions in that act apply.
It's a complex issue that can't be summed up by saying "fuck that".
Would you really like to justify killing 4,000 people of unknown status with zero legal proceedings because of... opportunity cost? Because Pakistan might look different if they hadn't been killed? I am not sure I could keep a straight face through that one. No. Just no.
And yes, missiles from the sky killing people: FUCK THAT. Rule of law. Round up those "terrorists" and interrogate them, charge them with crimes. I have no forgiveness for psychos on any side of this that want to push us back to barbarism.
I'm trying to find this principle on their website, and it doesn't seem to be in their "values" section.
Did they take it out?
If you are at G and wondering whether you should resign, remember this: the market for AI talent is super hot. You will immediately find lots of great and challenging AI work pushing humanity forward.
They'll find jobs quickly, that's for sure.
But "work pushing humanity forward"? There's precious little of that in any skill sector - not at FAANG salary levels, anyways. The vast bulk of the work that the vast majority of us do is simply about pushing the investor's balance sheets forward - not "humanity".
There is a school of thought that recommends “moral” people doing “immoral” work because if those people left then other “immoral” people will take those jobs and more readily implement “immoral” features. So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing.
Military drones are here to stay, and whether or not the US builds them, other military powers certainly will.
Ultimately, I don’t think this changes anything.
I don't know if my experience is typical or not but I do know that several times in my career I have encountered or been put into a position where it was clear that 'success' was tied to doing something which I felt was also wrong. I have always chosen not to set aside my principles for that success.
And still I know people who have made the immoral choice and reaped the rewards, and then they have used that success to step into places of higher influence or control. They would no doubt argue that they were in a much better place to do good now, because they chose to do something wrong once before.
It is not surprising that this conflict is the underpinning of many dramatic stories.
And even if - wave a very large magic wand - every AI/ML engineer in the United States pledged not to work on military applications, the U.S. would just contract the same work out to the U.K., Canada, etc.
This is true only if the engineer is in such a position of power within the company that he can effectively influence things toward the ethical goal. Most engineers at typical for-profit corporations who care about that ethical goal have no such position. If they strongly disagree with what they are contributing to, the best decision for them and for the ethical goal is to quit.
Only a small percentage of engineers at Google are AI experts. Sure, more people use it, but they probably just make a service call and get some magic results back.
As far as I can tell, this is the original: https://www.youtube.com/watch?v=9CO6M2HsoIA
Thinking that the military of your own country is "evil" seems a bit puerile to me. Watching too many movies and TV shows can have that effect.
"Every independent investigation of the strikes has found far more civilian casualties than administration officials admit. Gradually, it has become clear that when operators in Nevada fire missiles into remote tribal territories on the other side of the world, they often do not know who they are killing, but are making an imperfect best guess." 
"Leaked military documents reveal that the vast majority of people killed have not been the intended targets, with approximately 13% of deaths being the intended targets, 81% being other "militants", and 6% being civilians." 
"strikes have killed 3,852 people, 476 of them civilians. But those counts, based on news accounts and some on-the-ground interviews, are considered very rough estimates" 
Not only that, we bomb inside of countries that are (sort of?) our allies, without informing them and without their consent:
"Pakistan's Prime Minister, Nawaz Sharif, has repeatedly demanded an end to the strikes, stating: "The use of drones is not only a continual violation of our territorial integrity but also detrimental to our resolve and efforts at eliminating terrorism from our country" 
 https://www.nytimes.com/2015/04/24/world/asia/drone-strikes-...  https://en.wikipedia.org/wiki/Drone_strikes_in_Pakistan
Yeah, I have no problem calling our military evil.
Doing one evil thing doesn't necessarily make someone evil. Doing a few evil things doesn't necessarily make someone evil either.
If a doctor who has saved thousands kills one person, are they evil? What if that person was a convicted child rapist? What if it was in self defense? What if killing them would save a thousand more? What if killing them would save 10,000 more but the doctor doesn't care about that and would have killed them anyways?
Deciding whether a single person is good or evil is a very complex process. Deciding whether a country or a military is good or evil is enormously more complex and needs to take a lot more into account than you just did.
Perhaps we should judge organizations based on what they actually do, as opposed to what they intend to do, or what they tell themselves they intend to do.
Arguably, it is precisely the inability to consider international conflict from a vantage point outside a nationalist worldview that is the root cause of all external wars.
The US military (As have all other militaries that have done anything of note) has done some ridiculously evil things. The drone strike program isn't the worst of them by any means, but it's not its brightest moment, either.
Since they were first developed, NATO partners: Belgium, Denmark, Netherlands, Norway.
Other European countries as early as the mid-80's: Croatia, Greece, Italy, Poland, Portugal, Romania.
Middle East since the mid/late 80's: Bahrain, Egypt, Israel, Iraq, Pakistan, Turkey, Jordan, UAE.
Africa: Morocco.
Asia: Indonesia, Singapore, South Korea, Taiwan, Thailand.
South America: Chile, Venezuela.
How true this is in a specific case depends on what that country’s military is doing at the time; categorically dismissing the idea that a nation's military can reasonably be seen as evil by a citizen seems more than a bit naive to me.
At the very least, most people agree that killing in self defense is not evil.
I doubt it, considering the cost of manned jets and their limited time-on-target, political fallout, risk to pilots, etc. Drones are stealthy, silent, cheap to operate, have long time on target, and have both surveillance and attack capabilities. Drones are the key enabler for this completely unprecedented type of pervasive, constant warfare.
People who want to make drones "more efficient" at this job are totally missing the point. Making them even more efficient just means they'll be even more widely deployed, which makes the whole problem even worse. The wider deployment completely dwarfs any savings from "less collateral damage". We need to stop drone warfare, not make it better. Return to the rule of law.
The GPL has already shown us that a license has the power to change culture and behavior (in however small a way). We should be able to extend this approach to other values we hold dear.
... This is a Pandora's box that has already been opened, I'm afraid.
But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes I'm on the ethical high ground) declines these types of jobs.
Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?
As another poster wrote, it's "good" that the targeting gets more precise, meaning less collateral damage.
But to each his own. We need to be able to sleep at night as well. And that, to me, also seems like a really good reason to decline.
I'm kind of on the fence about wanting to work in that industry.
Whenever I find myself in this line of thought, I always remember: I am not that special. The impact that anyone has on a workplace is small and mostly inconsequential; more important are group momentum and culture. The likelihood that you'd do bad things in a bad workplace is much higher than the likelihood that you'd be able to stand fast and do good. The world just does not work that way.
On that note, I jumped back off the fence. Not for me.
I would be curious to see the list of employees who quit for this purpose. They are making a statement. They might as well publicly disclose their identities to inspire more people.
Also, I'm wondering: are most of them financially independent enough to have made this decision? When money is not a worry, people have the freedom to truly align themselves externally with their internal core values. If you are constantly worried about paying rent or securing your kids' future, you make compromises. That is not ideal for building a great society.
Do you think USA should have no military?
Would you vote to dismantle it?
Of course, there would be no conscientious objectors if it were a liberal event