I once worked for a large industrial group in Europe. The kind that has a piece of every little pie there is: military, transportation, etc. I was pretty happy working there... until I got a demo from the 'defence' group.
They demonstrated the willingness to push the company's technology into heinous, heinous territory. The kind of thing where a drone would be able to follow a single person in a crowd, and target them for execution - unguided, of course.
I quit the next day. Those of us who make technology need to be very sure it is not used destructively against the human species. The responsibility is very, very high, and the danger is extreme. These people were revelling in the fact that they could develop targeted-assassination drones and sell them to any country in the world.
I'm not surprised. After finishing my degree, I went to optimize a couple of production lines. I didn't get much detail and thought it was for plastic/glass bottles (filling liquids and moving them around: many tiny grabbers, simple movements, fairly narrow tracks).
After about two months of work, once the production line had been parallelized and its speed drastically increased, I went to see it in operation.
What I saw shocked me. It was a production line for handling young chicks, male and female: debeaking, throat slitting. I was astonished that none of my superiors had told me exactly which 'product' was being handled.
After seeing the horrific product of my work, I immediately quit.
Since then, I'm not surprised, given what horrors we do to living animals, that we are ready to do them to each other.
It made me doubt the meaning of my work at university. What had I done? Spent four years at college to create killing machines? I never thought I'd end up doing that.
> Since then, I'm not surprised, given what horrors we do to living animals, that we are ready to do them to each other.
I sometimes think this is why we don't see any intelligent civilizations out there. Intelligence gives rise to deceitfulness, and eventually one selfish actor can bring down an entire civilization, intentionally or unintentionally, once the weapons get powerful enough.
That's the thing, isn't it? They can't do anything alone.
It always amazes me that those big entities exist, because they require such a huge pool of highly educated and skilled people. What are all those geniuses at the NSA thinking? I can't imagine that somebody smart enough to work there is not smart enough to understand the consequences of working there. So why aren't they quitting?
Social pressure and money are part of the equation, certainly. I remember when I turned down a Google interview, my close circle thought it was weird that I did that, even more so for ethical reasons.
Having the best toys, budgets and projects certainly helps as well.
Have you considered that there are people out there with values that differ from yours? It's not wildly improbable, regardless of what the screaming minority would have you believe.
Maybe they don't have a problem with working on such projects, because they agree with their end goal?
I learned very young that there are differences: I literally (not figuratively at all) couldn't see the motivation behind destroying school equipment, drawing on walls, etc., while for some, it's fun.
But even after all those years, it's just so hard to wrap my head around that.
They probably understand others have different values. The confusion is that some values seem so self-evident that it's bewildering that anyone with a certain level of intelligence wouldn't arrive at the same values.
I think the answer might be that they do have the same values, but they have a different model of the world overall, which calls for different ways of achieving those values.
To take this to an extreme, imagine a person fighting for their life against actual criminals threatening them, versus a person fighting for their life against a random person they think is a shape-shifting alien impostor. Both are internally justified by the same values; only one is externally justified by a more correct perception of reality.
I'm glad someone made this comment. It blows me away when people think their set of values is the only "right" way to think and anyone else who has a different set of values is obviously damaged or doing it unwillingly.
There is nothing extraordinary or even unusual in this. It's basic tribalism: my tribe is made of virtuous people who are also smart and beautiful; the enemy tribe, on the other hand, consists of deformed degenerates with room-temperature IQs.
Didn't you imply that smart people cannot possibly be in the other tribe, so there has to be some reason why your tribe members are not quitting the enemy tribe's cause?
I never used anything close to "cannot possibly". I said I could not understand. I used the words "why" and "I wonder". It's very clear. It's not ambiguous. But you seem very attached to your tribe theory, so you want my past comment to fit your present view.
And it's a very combative view of the world, too. Do you see us as enemies as well?
In which case, you win this argument. I don't want my tribe to fight yours.
Sorry, then; if you didn't mean to say that, I don't understand why you posted your message at all. It then literally amounts to "I cannot understand other people's motivations", and all the references to intellect, understanding consequences, and shiny toys are irrelevant.
As for myself being tribal: it's very hard to observe tribalism in oneself, so while I don't see it, I may very well be.
Well, let's say you suddenly learn that a community of elite thinkers eats babies and thinks it's a good way to control the population.
You'd be baffled, wouldn't you?
It's not a matter of morality, of good or bad, or right or wrong.
I understand the difference exists. It just feels so alien to me that I can't manage to form the mental model that would help me understand what thinking leads to this behavior.
Also, read the other comments; we've answered your points already.
> you learn a community of elite thinkers eat babies
Maybe I'm not reading this correctly, but you're saying that if anyone holds an opinion different than yours your reaction is the same as if they thought "eating babies" was OK?
Things aren't that black and white. People can hold opinions different than yours, and you can withhold judgement.
They just have a different opinion on eating babies. People can hold opinions different than yours, and you can withhold judgement.
See, again you add morality to the equation, talking about black and white. But morality is arbitrary. You can draw the line anywhere, one inch to the left or the right, and so on, indefinitely.
So that's not what I said.
What I said was "I can't understand".
Not "it should not be" or "it's impossible" or "they are bad people".
Apparently, you have difficulty understanding that somebody can't understand.
It's literally the same thing that allowed concentration camps to happen. Or, in the words of the late Sir Terence:
“There are hardly any excesses of the most crazed psychopath that cannot easily be duplicated by a normal kindly family man who just comes in to work every day and has a job to do.”
Arguably the majority supports the ideological foundations, aims, and methods of the institutions you are speaking of. In particular the military and intelligence institutions. If they didn't, they would have been removed by vote or force some time ago.
Frankly the broad majority of people in the US benefit materially in some way from their existence in ways that they might not even be fully aware of.
I am not arguing this is a good thing, but I think it's a reality. Military power abroad means wealth accumulation at home.
Interesting. I have absolutely zero ethical concerns with Google building a drone which can more accurately target someone. I'm also perfectly fine with US government using those drones to kill bad guys.
Now if they use it to kill good guys, then it's a problem that needs to be solved.
My point is that it's not at all easy to determine who's 'good' and who's 'bad'. In fact, I think those are extremely simplistic labels to tag people with, especially when discussing what could amount to summary execution.
Is an enemy combatant with a weapon a bad guy? How about an unarmed enemy combatant? How about an enemy combatant with just a knife, or a rudimentary club? How about an armed civilian who might be an enemy combatant?
Is a bank robber a bad guy? If so, do they deserve to die? How about a white-collar fraudster? How about someone running from the police? How about a mentally ill person running from the police?
This stuff is never clear cut. It's why we have courts and military tribunals. If you can't think beyond 'good guys' and 'bad guys', I don't respect your opinion.
I'm still not sure what your point is. Are you saying that people in US government/military/etc who decide who to kill make bad decisions? If so, give some examples, and suggest how we can improve the situation.
Debatable, but regardless: do you think the reason the US government has not killed Snowden is a lack of "sophisticated military toys"?
Also, my government did enough stupid things (like killing innocent people in war zones by mistake) for me to want them to have more intelligent weapons.
It isn't hard to imagine a scenario in which the majority, or even the strong majority, of each nation disapproves of its own military or intelligence techniques, but nevertheless nothing can change. Any nation that unilaterally drops some questionable technique (and don't just think "torture" here, but "excessive surveillance of the home population" and such) pays penalties vs. the other nations and experiences no benefit, making it very difficult for even one nation to climb the resulting gradient, let alone the entire world.
It's a hard problem, and most glib solutions are, well, just that: glib. Centralized agreements become increasingly difficult as the number of entities increases, for instance, even before we account for scenarios like this one, where the reward for defection increases in proportion to the number of other participants in the disarmament.
This is also ignoring those cases where there isn't even disagreement; in the real world, for instance, while you can quibble about the exact lines, it seems to be the case that the Chinese accept and approve of levels of "invasive government" (to use a Western spin on the idea; I don't know what they would call it exactly) that Magna Carta-descended countries would consider abhorrent, making coordination even harder. (Meanwhile, they consider our lack of coordination, or whathaveyou, if not "abhorrent", at the very least "sub-optimal", and possibly dangerously socially negligent. As I'm using English here and, like I said, I don't know what they'd call it exactly, I can't help a bit of a Western spin here, but I acknowledge the flip side.)
> Frankly the broad majority of people in the US benefit materially in some way from their existence in ways that they might not even be fully aware of.
No argument here, having a strong bully in the room is good for you when he is on your side.
What about all the open source? I'll hazard that quite a few instruments of war and mayhem run on FOSS. Is there a moral dilemma there? Should FOSS licenses make provisions for the kinds of uses packages can be put to (I believe they should)?
I applaud your morality. In terms of moving the needle towards a more moral world, do you think employee resignations have a positive effect or does it create a less moral survivor bias where the people who are left have fewer limits or concerns?
I'm sure disclosure would be illegal per your employment contract, but are there any other steps concerned employees can take?
Yeah, pretty sure it's a good thing for those companies/projects. Someone who could've been sabotaging their work just quit right away, so they can get someone who will actually like working there. It's not like the employee takes away funding or resources (well, unless they're a genius/influencer, but then again, a good enough motivated replacement is fine).
This is never true. No government will allow its companies (and in this context, that's how they're seen) to sell weapons to countries it doesn't approve of. That said, the defense industry is apparently often irresponsible at best.
> No government will allow its companies (and in this context, that's how they're seen) to sell weapons to countries it doesn't approve of.
What the poster meant was "just about any country" in the world. Which we've seen happen time and time again - the restrictions may be nominally in place (for a while), but eventually they get selectively lifted, and/or the companies find a way to circumvent them.
At times (and not at all surprisingly) with the assistance of certain government agencies chartered with the purpose of not only enacting precisely this kind of subterfuge - but perfecting it as an art.
Even if a firm in country A can sell only to group of countries X, another firm in country B might be able to sell to group of countries Y. All you really need is a few companies with overlapping markets to eventually bring such tech to the entire world.
Same thing. Chlorine gas is commonly used in many industrial processes, but it's also a highly toxic chemical that has been used (and banned!) in warfare.
The biggest problem is so-called "dual use" chemicals and compounds where there are legitimate military and non-military uses.
You can build extremely dangerous explosives from chemicals found on the shelf at a drug store, but that doesn't mean that company is in the business of selling explosives.
Google's leaders are either very naive and actually think the Pentagon won't repurpose their tech for killing people, or they know exactly how it will be used but agreed to sell it anyway, because money.
I'm much more inclined to believe it's the latter.
I can beat you to death with a hammer or stab you to death with a knife. Should we not build hammers and knives? Should hammers and knives be subject to KYC laws?
Sure, you can argue that it's different because those are things the common man can get, whereas only the state can afford surveillance dragnets and drones, but it wasn't that long ago that only the state could afford computers.
Edit: Apparently I struck a nerve.
Technology transfers between military and civilian application all the time. Propeller technology that helped submarines that are now obsolete stay quiet is fine tuned in a different manner to yield more environmentally friendly watercraft. A drone that can disperse insecticides on only the crops that need it can deliver chemical weapons with some slightly different fine tuning.
A 1984-style (or 2018-UK, if you like) surveillance and law enforcement system could be used for tracking down corruption in government, suppressing dissidents, identifying insider trading, identifying human trafficking, etc. It all depends on who's using it. (I personally don't trust any government to properly wield that kind of power.)
The technology doesn't care. It's all how you use it.
If you were to try and kill me with a knife, your life is on the line as well. If you try to kill me with a drone, your risk is only the money and time spent on the drone. There is quite a difference.
I'm unclear as to what the moral difference is. It's more moral to kill someone if they can fight back? Then why are we giving our soldiers tanks? Guns? Fighter jets? Knives?
The difference is the cost to yourself, or people you care about, when you want someone dead. Popular sentiment turned against the Vietnam war in the US because Americans were coming home dead, not because of the Vietnamese that we were killing. When we eliminate the cost of war, we make it more likely to enter one, and stay in one.
Yeah, the American populace might not care, but the victims do care, and how upset they are made directly affects all other efforts in that area. Peacemaking and nationbuilding only gets more difficult the more people that have lost family members to the invaders.
If we'd gone into Baghdad WWII style, carpet bombing and all that, we'd probably still be fighting a significant war in Iraq. Or the country would be depopulated. Presumably the appearance of ISIS was largely a reaction to western actions in the region, and its initial strength was proportional to the outrage that could be drummed up. Sure, there are probably asinine military commanders that don't account for this, but it isn't the universal rule.
I’d like to offer the point of view that if the drones become better and more surgical in their precision, it would reduce civilian casualties.
Like it or not the world is full of extremists who would like nothing more than to hurt innocent people. There is no “oh just send the cops and arrest them!” route to take.
Shit, just look at the time Osama bin Laden could have been bombed with a Tomahawk missile during Clinton's presidency. Clinton didn't do it because of the potential to kill a Saudi prince bin Laden was meeting with at the time.
Would those angry Googlers be against surgically killing Osama? I think not.
Better drone software might help track a potential target and present the optimal window in which the target could be shot with reduced civilian casualties. It could also provide better intel to enable a surgical ground strike, which would put more American soldiers at risk but would allow for better intel and, again, fewer civilian deaths.
Lastly, it could offer new knowledge and experience in tracking humans with drones during humanitarian disasters. It could also help in tracking victims of kidnapping. Are the Googlers opposed to rescuing the thousands kidnapped by Boko Haram and company?
Who is going to go into the African heart of darkness to rescue those people? Is it the armchair Googlers who pretend to know better?
> I’d like to offer the point of view that if the drones become better and more surgical in their precision, it would reduce civilian casualties.
It would reduce collateral casualties per target attacked, which would make the drones easier to use with looser target selection criteria, which might both increase number of targets attacked and increase the number and ratio of incorrect-target-selection casualties.
The law of unintended consequences is most likely to sneak up and bite you when you only bother to consider first-order effects.
Ok... but even accounting for civilian casualties due to increased use of drones, civilian victims (and overall casualties) of US military operations have done nothing but drop as technology improves.
This is just throwing in an unfounded qualitative thought, not an actual empirical argument against precision weapons.
> civilian victims (and overall casualties) of US military operations has done nothing but drop as technology improves
If you can definitively tell me how many civilian casualties there were in the Iraq war, we can discuss whether this is true.
"Credible estimates of Iraq War casualties range from 150,000 to 460,000. Other disputed estimates, such as the 2006 Lancet study, and the 2007 Opinion Research Business survey, put the numbers as high as 650,000 and 1.2 million respectively, while body counts, which likely underestimate mortality put the numbers as low as 110,000." [1]
Having a range from 150,000 to 1.2 million makes it kind of hard to discuss, don't you think?
I agree with you partly, and I think often people who are critical of all military actions are not considering the importance of a strong military in deterring large scale conflict.
But I think it's also true that carrying out military operations (even precise ones) in unstable parts of the world helps violent extremists gain support among the broader population and does nothing to help alleviate the instability that gives rise to these extremists in the first place.
When I consider how few deaths there actually are from extremist groups operating in Western countries it makes me wonder if the scale of our response is really appropriate to the severity of the issue, and whether our actions aren't helping to perpetuate the very issues they're intended to address.
I agree with you as well. I truly believe the military is for the "defense" of the U.S., though sometimes a good defense can be a good offense. But by and large, I think the 17 years spent in the Middle East have been wasteful and largely unproductive. Having said that, there are a number of girls, for example, who are able to study past elementary school who would never have had the chance. So maybe not all is wasted?
I think we should have gone down there after 9/11, punished (killed) everyone who was remotely affiliated with the terrorists, and left as soon as we were done. But that is a political action, not a military one. Our military budget would be far smaller if all the random bases across the globe were closed down and we brought the troops back home. If anyone wants to mess with the sleeping giant, they can quickly pull up the records on what happened to Japan and the Axis back in the 1940s.
The article here is talking about developing military applications for better drones. The potential gains to be had from this are more than just better strike capabilities. Those who don't see the potential humanitarian and other activities that might benefit are being short sighted.
I'm not sure why you're being downvoted. I think what you're saying is pretty reasonable, and I definitely agree about some of the benefits of drones. I'm not sure how effective killing terrorists actually is given that our ostensible ally in the region, Saudi Arabia, appears to tacitly support a lot of the Islamic fundamentalism that gives rise to these extremists. I think a big issue that no one knows how to address is how to combat a religious ideology that does not admit the possibility of making any compromises.
Heh, is it hard to believe why I’ve gotten downvotes on this response? Lol. Anyhow, I’m not thinking of any particular religion or group of people in this case.
I think it's all fuckery and warped self-interest from within the government as well. One can grow fat and happy on the billions that flow through defense.
But again, I believe that better tooling will have unforeseen positive and negative consequences. But I think it’s going to weigh more in the positives.
Can you guarantee the technology won't fall into enemy hands?
Can you guarantee our government won't initiate illegal aggression?
Won't subvert democracies?
Won't have another Gulf of Tonkin?
Won't target the families of terrorists, as our current President has suggested?
No, you can't guarantee those things.
Guns don't kill people, people do. And people are sometimes evil, and sometimes break the law, and sometimes make mistakes. And sometimes the gun is stolen. So maybe some engineers don't want their company to make any guns.
You don't get to ask, "Would those angry Googlers be against the technology always being used the way they intended?"
Instead you have to ask, "Is it possible for this technology to be used in ways those Googlers would object to?" And of course the answer is yes.
Many on the Manhattan project thought that bombing Nagasaki was completely unnecessary. Some probably thought Hiroshima was unnecessary, that a demonstration of the power would be sufficient.
If I were given the choice, I would support a country that has a track record that I am proud of.
Since I have serious reservations about the track record of the United States, I have to wonder if the people I consider the worst are more likely to establish dominance faster than some other country that I like better.
Maybe I should be working to establish the dominance of the country I think is some combination of the best, and the most likely to win that race (if I and my peers were to help them.)
Or, I could acknowledge that the United States is the most likely to establish dominance, and I should be working my ass off to ensure the US will be the best version of itself that it could be.
But again, this assumes the people giving orders to the drone are the good guys.
But I've never seen any good guys in my history books or in the news.
Hence, whenever power is given to somebody, I always assume that person doesn't have my best interests in mind.
Let's all remember it's possible for any of our countries to become a dictatorship one day. Just because we've enjoyed a lot of freedom over the last decades doesn't exempt us from acting like we could lose it at any moment. Because we definitely can.
More pragmatically: with powerful AI, giant communications nets, huge databases of everything and everybody, cameras with facial detection, and wiretapping everywhere, do you really want to add drones to the collection of what the powers that be can do?
If you could foresee what potential benefits each technological application has, then maybe I'd agree with you. There are plenty of examples where military applications and research have led to a bonanza of side applications which improve the human condition.
Look at MRI imaging. It is a downstream invention that came from the development of nuclear weapons (nuclear magnetic resonance). How many lives do you think it has saved and improved over the past 70 years?
We could have gotten nuclear tech without the will to kill people. There are smart scientists outside the military, and a need for power plants, which everybody agrees would have been built with safer tech without the push for the bomb.
Now, the problem is never the tech, as usual. It's that the society we live in is not constructed in a way that can prevent the tech from being abused.
We are talking about a country that attacked Iraq while lying about WMDs and against the vote of the majority of the world, killing countless people for no proven result and leaving a country still in ruins decades later.
I don't really trust the governments we have.
The problem is that people believe in a fair world, in which our foes operate with the same values, so that if we accommodate them we'll be okay, and therefore we should not develop means to better wage asymmetrical warfare because that's an unfair advantage. Moreover, what if they are in the right and we are in the wrong, in terms of history?
They fear that this current administration and future administrations may resort to, as Secretary Clinton put it, "droning" people we simply disagree with rather than actual military adversaries.
The main question is effectiveness of the system, given some baselines.
> Like it or not the world is full of extremists who would like nothing more than to hurt innocent people.
"Full of"? The world is more peaceful than it's ever been. Extremists do hurt innocent people, and we should not ignore them. But with each choice that we need to make, we should carefully consider pros and cons. Is there really a net benefit here?
Do you read about the sectarian violence that happens in Iraq because Saddam fell? How about the increased violence in Mexico after cartel leaders are sent to prison or killed?
I agree with you that the world is much safer and there are fewer deaths from military conflict post-WWII. I think that is largely due to the massive military power of a largely benevolent country like the U.S. I think if you magically removed the U.S. from the picture entirely, other nations would be thrown into conflict to be "top dog".
> Pinker presents a large amount of data (and statistical analysis thereof) that, he argues, demonstrate that violence has been in decline over millennia and that the present is probably the most peaceful time in the history of the human species. The decline in violence, he argues, is enormous in magnitude, visible on both long and short time scales, and found in many domains, including military conflict, homicide, genocide, torture, criminal justice, and treatment of children, homosexuals, animals and racial and ethnic minorities. He stresses that "The decline, to be sure, has not been smooth; it has not brought violence down to zero; and it is not guaranteed to continue."
Every company is involved in killing. Every industry supports the military in some way. Name an industry that the military does not utilize the technology of.
There's a big difference between tangentially supporting the military because your widget (e.g. toothbrushes) is used by them and many others, and making a full on weapon system for them, and only them.
> Would those angry Googlers be against surgically killing Osama?
Probably.
I can't find the quote, but I was reading something about the troubles with the IRA, and the response, that really stuck with me. The author said something like, "When you use lethal force against terrorists it lets them feel that it's fair to use it against you."
Once you make the mental flip it becomes really easy to think up defense systems that work without causing any casualties or deaths at all, not our guys, nor civilians, nor the enemy. And if we just can't live with that, we can always kill them later: https://en.wikipedia.org/wiki/Saddam_Hussein#Execution
That's terrible reasoning, most likely formed by the skewed view that targeted assassinations are needed and cool. Why does the US need to be at war all the fucking time? There are so many problems with drones doing targeted killings, and so many, many more if they're powered by AI that gets cucumbers right 98% of the time. This is such a horrible, horrible, horrible idea.
>Shit, just look at the time Osama bin Laden could have been bombed with a tomahawk missile during Clinton’s presidency. He didn’t do it because of the potential to kill a Saudi prince he was meeting at that time.
Here's an article on that very thing. It didn't mention anything about a Saudi prince, though.
But your points are valid. Like anything, it could be used for unobjectionable purposes. How morally objectionable do you think the use of this technology would be on the whole, given the country using it, etc.? Do you think it would help the US and its allies become more or less authoritarian? How long until you think it would be given or sold to local police departments like other military equipment?
There were many missed opportunities. But I'm mainly pointing that one out as an example of what could be done with better tech[0].
As far as your other points, I think they are all valid. In this country we still have the press to bring up rampant abuses, and other branches of government to keep the executive in check. I suppose as long as the branches are willing to keep each other in check, it's better for everyone, including US citizens.
You also have to remember that the people in the military aren't robots (not yet at least). They are regular citizens joining a volunteer army for a multitude of reasons, many of which are to serve honorably and of course the steady paycheck and college benefits. Most of the military does a 4 year stint and they go back to being civilians like you and I.
It was also a staggeringly different time (pre-9/11). I don't remember us being as militaristic as we are today. I mean, people talked hawkish, but it seemed much less likely to actually happen.
>You also have to remember that the people in the military aren't robots (not yet at least).
I'm not sure what the point of this comment is, so excuse me if I missed it, but I would think the fact that we have real citizens in the military would be a barrier to atrocities. Putting a drone hive/squadron in the hands of just a few people seems like it would make an atrocity more likely to happen.
And the internet has allowed for much more "targeted surveillance" too, from across the world.
Guess how it's been used? It's been used to "target" everyone. Why? Because it's gotten cheap and easy enough to use on many more people at once - just how automated drone strikes will be soon.
I think you're naive if you think this will "improve" war conditions. Here's one story that may bring you back to reality, about how these automated drone strikes are more likely to be used in the future:
The problem with dangerous technologies isn't when they're used under the best possible circumstances. It's all of the others. Nuclear bombs can propel spaceships or dig canals, but nobody is protesting that.
> the world is full of extremists
> Osama bin Laden could have been bombed
What you seem to be saying, which is what we hear all the time, is that spending money and doing work that increases the power and capabilities of the US military will make us all better off.
You also seem to be saying that Osama bin Laden is an extremist.
But in the 1970s, as Afghanistan was working towards becoming a more secular society, the US military and intelligence agencies were arming Osama bin Laden and his fellow jihadis, the proto-Taliban and proto-Al Qaeda, who wanted, among other things, for the secularization of Afghanistan to stop and for an Islamic-dominated government to come in. An effort in which the US succeeded, along with their partner Osama bin Laden.
This being the case, I am not exactly sure when bin Laden and people like him became extremists. I suppose it was after the US began its military occupation of Saudi Arabia. Osama bin Laden opposed the US military occupation of his country.
This may sound equivocal about bin Laden, but the US is more equivocal about bin Laden. I think he never should have been armed by the US. People of a like mind said as much then. Others disagreed.
In other words, if the US is making political errors (or is not making errors and is pursuing negative goals), more power and capability to carry out those erroneous policies will not help matters.
For example, Trump just escalated the conflict in the Middle East this week, which only satisfies religious fundamentalists. Handing him more power to do so will not help things, it will just mean more 9/11s in response to the blow he just landed against Arabs/Muslims.
> I’d like to offer the point of view that if the drones become better and more surgical in their precision, it would reduce civilian casualties.
I am not sure why some people instantly assume that the whole purpose of making drones more autonomous is to make them more precise: historically, the DoD/US military have made virtually zero effort to even try to reduce civilian "casualties".
Maybe I am too cynical, but I genuinely think that the military is only willing to invest in technology that would help them expand current and future operations, disregarding the impact that these will have on the civilian population of foreign territories... probably because it's orders of magnitude cheaper to just pay someone to write a public statement denying every statistic published by neutral NGOs around the world.
This is outright false. I'd love to see what history you are getting your facts from. I can personally tell you I've seen the briefs and classes that military personnel are put through, and they are clearly told to minimize civilian casualties, even to the harm of U.S. personnel.
They are also told by a lawyer during those briefs essentially "if you break any of the Geneva conventions or outright any of the things we just told you we will swiftly punish you".
>I am not sure why some people instantly assume that the whole purpose of making drones more autonomous is to make them more precise: historically, the DoD/US military have made virtually zero effort to even try to reduce civilian "casualties".
????
That's not even close to true. On any level. You are confusing the inevitable willingness to tolerate civilian casualties with an outright disregard. Civilian casualties have consequences, and they are avoided. Not strictly, but it is a cynical fantasy to imagine that there is a complete disregard for civilians.
I wouldn’t say that developing military hardware necessarily negates the “don’t be evil” principle (especially if the developed articles are dual-use, for both military and civilian applications). The Western world, our principles and values, have prospered for more than half a century in the Pax Americana afforded, in large part, by the prosperous US Military Industrial Complex.
I totally get the objection to developing combative AI - that’s a separate ethical question - but you can contribute to the military and still maintain your humane values.
Given that the US military has killed nearly 4,000 people with drone strikes in Pakistan over the past decade, a country in which no formal military conflict exists, nor any formal enemy, just vague accusations of terrorist networks (and likely a bunch of political dissidents fed to them by the Pakistani government), I am really wondering where the "Pax" is coming from. Because if you do the math the odds of any given person knowing someone who was killed by a drone strike, or someone who knows someone who was, are pretty damn high. The US has brought hell to Pakistan.
It's very complicated and has no good answer that would be acceptable to everyone. Pakistan is a hotbed for terrorism and radicalism; the strategy of drone strikes is intended not only to kill "ticking bombs" but also to destroy organizations and keep them busy while constantly on the lookout. It is much more difficult to carry out another 9/11 when you need to sleep in a different place every day, or when your most senior operative is 23 because all of the seniors before them have perished.
It's hard to argue how effective this tactic is, being that:
a. most everything relating to this is classified
b. it's very difficult to assess how many terror operations were prevented by those actions, even if you have the classified data above.
An amazing book on the subject of state-sponsored assassination that I advise anyone to read is Rise And Kill First, by Ronen Bergman, detailing Israel's assassination policy from operational, political, and societal perspectives - truly fascinating.
> Given that the US military has killed nearly 4,000 people with drone strikes in Pakistan over the past decade, a country in which no formal military conflict exists
Pakistan is a country with which no formal military conflict with the US exists, but it is not one in which no formal military conflict involving the US exists.
The 2001 AUMF is a (ludicrously open ended) conditional exercise of Congress' Constitutional power to declare war, and the parties targeted in Pakistan are parts of groups to which the executive branch has determined that the conditions in that act apply.
If you're going to say that the US has brought hell to Pakistan, you should at least back it up with some reasoning. What would Pakistan look like right now without the US killing those 4000 people? What would surrounding countries look like? Why did the US kill those people? Would any other country in the US's position have reasonably killed them?
It's a complex issue that can't be summed up by saying "fuck that".
Just imagine your friend's mother was killed as collateral damage in a drone strike. Or your cousin. Or your friend in school. From a missile from an invisible unmanned machine in the sky. You don't know why. You don't know if they were a terrorist or mistaken for a terrorist. They were killed by a foreign government that is colluding with your own corrupt government, and you have no recourse within the law, can't even fight back. Yeah, it's like that.
Would you really like to justify killing 4,000 people of unknown status with zero legal proceedings because of... opportunity cost? Because Pakistan might look different if they weren't killed? I am not sure I could keep a straight face through that one. No. Just no.
And yes, missiles from the sky killing people: FUCK THAT. Rule of law. Round up those "terrorists" and interrogate them, charge them with crimes. I have no forgiveness for psychos on any side of this that want to push us back to barbarism.
As police shootings (with a high number of collateral ones) run at about 1k per year, over the last decade American police have killed 10,000 people, and America doesn't have large tribal areas and the related problems that Pakistan does.
Kudos to those who are ready to stand for their principles.
If you are at G and thinking about whether you should resign or not, remember this: the market for AI talent is super hot. You will immediately find lots of great and challenging AI work pushing humanity forward.
> You will immediately find lots of great and challenging AI work pushing humanity forward.
They'll find jobs quickly, that's for sure.
But "work pushing humanity forward"? There's precious little of that in any skill sector - not at FAANG salary levels, anyways. The vast bulk of the work that the vast majority of us do is simply about pushing the investor's balance sheets forward - not "humanity".
Corollary: plenty of skilled engineers with fewer moralistic constraints will jump at the chance to do interesting work for high pay. For a company as large and wealthy as Google, they can continue to raise offer salaries until they are adequately staffed.
There is a school of thought that recommends “moral” people doing “immoral” work because if those people left then other “immoral” people will take those jobs and more readily implement “immoral” features. So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing.
Military drones are here to stay, and whether or not the US builds them, other military powers certainly will.
These two comments, the parent and grandparent, capture the essential challenge of integrity in engineering. You can have integrity and choose not to work on things you feel are morally wrong, or you can choose to set aside your reservations and continue the work. It is not an easy place to be, and it causes many people stress.
I don't know if my experience is typical or not but I do know that several times in my career I have encountered or been put into a position where it was clear that 'success' was tied to doing something which I felt was also wrong. I have always chosen not to set aside my principles for that success.
And still I know people who have made the immoral choice and reaped the rewards, and then they have used that success to step into places of higher influence or control. They would no doubt argue that they were in a much better place to do good now, because they chose to do something wrong once before.
It is not surprising that this conflict is the underpinning of many dramatic stories.
The military industrial complex is a huge employer, both in the state of California and the rest of the country. The likelihood of enough people saying no to a high paying job like this is effectively zero.
And even if - wave a very large magic wand - every AI/ML engineer in the United States pledged not to work on military applications, the U.S. would just contract that same work out to the U.K., Canada, etc...
There is a limited amount of total work that can be done by those people. Every low-level employee who refuses to work on this decreases the overall capacity. Every manager who has to deal with recruiting new people and getting them up to speed reduces capacity. Every company with reduced ability to compete for bids reduces capacity. Pushing the work to foreign companies and getting more red tape involved reduces capacity. It's not zero.
> "So the “moral” engineers have an incentive to stay and act as a front line against “immoral” actions, or at least have an insider’s position for whistleblowing."
This is true if the engineer is in such a position of power within the company that he can effectively influence things towards the ethical goal. Most of typical for-profit corporation engineers who care about that ethical goal have no such position. If they strongly disagree with what they contribute to, the best decision for them and the ethical goal is to quit.
Do you think there is a moral difference between the person who carries a gun in the military and the person who designs it? It seems like whatever feelings, positive or negative, we have about military personnel should carry at least in part to weapons designers.
But now the "immoral" people are no longer doing whatever immoral thing they were doing before going to Google. Fewer immoral things are being done. It is a win.
Hard to tell if those people were planning on leaving anyway, and just wanted to make a splash. How many people quit each week in a company with 90,000 employees?
Only a small percent of engineers at google are AI experts. Sure, more people use it, but they probably just make a service call and get some magic results back.
Definitely scary, but what happens if enemy states create this technology before us? That's why we need a defence sector that does, in fact, build these tools. If both sides have the technology then a stalemate is reached. Look at nuclear weapons; we can all bomb each other out of existence, and so, no one does. Now look at the countries that lagged behind.
This video is fictional, and was made as a warning about where current tech could take us. I have no doubt that generals who have seen it are quite upset that they can't have it yet.
Thanks. I thought I had found the original upload after sifting through a lot of copies on youtube. Wish youtube were a little more proactive with detecting dupes, rather than reactive. Or at least had some sort of automatic note/link to the original, even if it is a legal derivative.
For years we have known that US drone strikes can very accurately kill anonymous people - often civilians (such as women and children):
"Every independent investigation of the strikes has found far more civilian casualties than administration officials admit. Gradually, it has become clear that when operators in Nevada fire missiles into remote tribal territories on the other side of the world, they often do not know who they are killing, but are making an imperfect best guess." [1]
"Leaked military documents reveal that the vast majority of people killed have not been the intended targets, with approximately 13% of deaths being the intended targets, 81% being other "militants", and 6% being civilians." [2]
"strikes have killed 3,852 people, 476 of them civilians. But those counts, based on news accounts and some on-the-ground interviews, are considered very rough estimates" [1]
Not only that, we bomb inside of countries that are (sort of?) our allies, without informing them and without their consent:
"Pakistan's Prime Minister, Nawaz Sharif, has repeatedly demanded an end to the strikes, stating: "The use of drones is not only a continual violation of our territorial integrity but also detrimental to our resolve and efforts at eliminating terrorism from our country" [2]
Well, you should have a problem with calling them evil based on a few actions that you consider evil without taking all of the other things the US has done into account.
Doing one evil thing doesn't necessarily make someone evil. Doing a few evil things doesn't necessarily make someone evil either.
If a doctor who has saved thousands kills one person, are they evil? What if that person was a convicted child rapist? What if it was in self defense? What if killing them would save a thousand more? What if killing them would save 10,000 more but the doctor doesn't care about that and would have killed them anyways?
Deciding whether a single person is good or evil is a very complex process. Deciding whether a country or a military is good or evil is enormously more complex and needs to take a lot more into account than you just did.
They are intentionally bombing targets when they don't know who the target is. They don't know if it's a civilian, but they kill 'em anyway. 87% of the time.
That's not what I said, that's what you are saying, without evidence to back it up I might add. I doubt you are privy to military briefings where they make these decisions. Your argument is quite childlike.
There's a long list of atrocities in human history that have been committed with the best - or at least, good, or at worst, neutral intentions.
Perhaps we should judge organizations based on what they actually do, as opposed to what they intend to do, or what they tell themselves they intend to do.
Do you know why there is a difference between Manslaughter and Murder in the legal code? For the same reason that your statement is absurd. If you don't understand that then we have nothing else to discuss.
You seem to be suggesting that the fact of one's residency in a country should influence one's moral evaluation of that country, that there should be a different set of standards applied to one's own country than a foreign country. I believe that standards of good and evil should be applied uniformly to all nation states. If anything, one should hold one's own country to a higher standard since you have the most hope of influencing your own nation's behavior.
Arguably, it is precisely the inability to consider international conflict from a vantage point outside a nationalist worldview that is the root cause of all external wars.
It would be puerile for a Swedish person, far from it for an American. The fact that it’s the military of my own country has little to do with how evil it is or isn’t.
Watching too many movies and too much TV has the converse effect - for the most part, visual media is full of nationalistic ra-ra nonsense.
The US military (As have all other militaries that have done anything of note) has done some ridiculously evil things. The drone strike program isn't the worst of them by any means, but it's not its brightest moment, either.
F-16s are used in many countries. They were developed in the US, but many, many countries use them (some of our NATO partners since the beginning).
Since they were first developed, NATO partners: Belgium, Denmark, Netherlands, Norway
Other European countries as soon as the mid-80's: Croatia, Greece, Italy, Poland, Portugal, Romania
Middle East since the mid/late 80's:
Bahrain, Egypt, Israel, Iraq, Pakistan, Turkey, Jordan, UAE
Africa: Morocco
Asia: Indonesia, Singapore, South Korea, Taiwan, Thailand
South America: Chile, Venezuela.
> Thinking that the military of your own country is "evil" seems a bit puerile to me.
How true this is in a specific case depends on what that country’s military is doing at the time; categorically dismissing the idea that a nation's military can reasonably be seen as evil by a citizen seems more than a bit naive to me.
In 2015 Alphabet changed their motto to "do the right thing". If you interpret "right" as "correct", then "doing the right thing" can mean 'correctly' bombing civilians.
You'd think that people would know that concepts like "good" and "evil" are complex and somewhat subjective. There are plenty of times when you can kill someone and not be considered "evil".
At the very least, most people agree that killing in self defense is not evil.
It's not directly killing people. That's like saying helping improve GPS satellites kills people because weapons systems use GPS and would benefit from increased accuracy.
Those people would almost certainly have been targeted by manned jets. The US still sends out tons of manned combat missions (as many as 20 sorties a day from a single carrier, according to this article: https://news.usni.org/2018/03/19/u-s-evolving-middle-east-op...)
> Those people would almost certainly have been targeted by manned jets.
I doubt it, considering the cost of manned jets and their limited time-on-target, political fallout, risk to pilots, etc. Drones are stealthy, silent, cheap to operate, have long time on target, and have both surveillance and attack capabilities. Drones are the key enabler for this completely unprecedented type of pervasive, constant warfare.
People who want to make drones "more efficient" at this job are totally missing the point. Making them even more efficient just means they'll be even more widely deployed, which makes the whole problem even worse. The wider deployment completely dwarfs any savings from "less collateral damage". We need to stop drone warfare, not make it better. Return to the rule of law.
Which is a very plausible assumption, considering dropping traditional bombs is much more inconvenient for the killers. With drone attacks, the cost and political fallout are minimal. And the appetite grows with eating - the killers will use the new capabilities to do more killing.
I wish there were a legally enforceable version of Douglas Crockford's "Good, not Evil" license. I haven't released any source code that could have military applications yet, but if I ever do, I want to make it 100% clear that it's not to be used for any task related to killing or injuring other people. We're in a unique position as programmers where even the tiniest bit of our code can affect thousands or millions of people across the globe, and this terrifies me.
The GPL has already shown us that a license has the power to change culture and behavior (in however small a way). We should be able to extend this approach to other values we hold dear.
That's fine; they can use someone else's code. I just want the power to license my software in line with my own values. I'd find it very hard to live with myself if some code I wrote ended up being used to fire missiles at people, even if they "deserved it".
I just want to congratulate those Googlers. It isn't usual to quit a good job due to ethical concerns. The world would be a lot better if there were more people like you.
Somebody else will fill the gap. It is simply a consequence of military science: If a human in the loop makes combat systems less effective, then other countries will seek the advantage over others by removing the human from the loop. It's a classic arms race at this point.
... This is a pandoras box that has already been opened I am afraid.
I've been asked to work for military industry companies before. And I have always declined for ethical reasons.
But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes I'm on the ethical high ground) declines these types of jobs.
Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?
As another poster wrote, it's "good" that the targeting gets more precise, meaning less collateral damage.
But to each his own. We need to be able to sleep at night as well. And that, to me, also seems like a really good reason to decline.
I'm kind of on the fence about wanting to work in that industry.
> I've been asked to work for military industry companies before. And I have always declined for ethical reasons.
>
> But as I sit here and think about it, I wonder if it's a good thing that a person like myself (who believes I'm on the ethical high ground) declines these types of jobs.
>
> Someone is going to take the job. Perhaps someone less skilled than myself, perhaps someone less ethical than myself? What is the result of that?
Whenever I find myself in this line of thought, I always remember: I am not that special. The impact that any one person has on a workplace is small and mostly inconsequential--more important is group momentum and culture. The likelihood that you'd do bad in a bad workplace is much higher than the likelihood that you'd be able to stand fast and do good--the world just does not work that way.
It depends on how much of an impact you think you can make. I'm always of the mind that these defence contractors' primary motivation is what their customers (i.e. the military) desire, rather than making considerations for anyone on staff.
I guess when you work in an industry where what you create or do is done for killing, you must, as a person, have crossed some line of what is okay for you to do. Similar to the people who remote-control 'drones'. And at that point, I would think your will to try to change what the company is doing is very small.
On that note, I jumped back off the fence. Not for me.
I would be curious to see the list of employees who quit for this purpose. They are making a statement. They might as well publicly disclose their identities to inspire more people.
Also, wondering: are most of them financially independent, to have made this decision? When money is not a worry, people have the freedom to truly align themselves externally with their internal core values. If you are constantly worried about paying rent or securing your kids' future, you make compromises. That is not ideal for building a great society.
People who disagree with you often disagree with each other as well. This doesn't come through in a media (of all political leanings) that tends toward a narrow set of well-funded viewpoints.