Google Workers Urge C.E.O. To Pull Out of Pentagon A.I. Project (nytimes.com)
1368 points by s3r3nity 82 days ago | 1061 comments



From the movie "Flash of Genius" about Robert Kearns, the inventor of the intermittent windshield wiper:

"I can't think of a job or a career where the understanding of ethics is more important than engineering," Dr. Kearns continues. "Who designed the artificial aortic heart valve? An engineer did that. Who designed the gas chambers at Auschwitz? An engineer did that, too. One man was responsible for helping save tens of thousands of lives. Another man helped kill millions."

"Now, I don't know what any of you are going to end up doing in your lives," Dr. Kearns says, "but I can guarantee you that there will come a day when you have a decision to make. And it won't be as easy as deciding between a heart valve and a gas chamber."

To me this is incredibly valid for Silicon Valley engineers these days.


As a roboticist at the beginning of my career working on drones, I decided then and there that I would never make "bombs", a metaphor I used to mean anything that could be weaponized. I realized a lot of the work I was doing was funded by DARPA, and I was very cognizant about my research being used in this way. And like Dr. Kearns suggests, it's not entirely black and white. Would my path planning algorithm be used to more efficiently deliver scientific payloads to the atmosphere, or would it be used to route missiles to maximize casualties? Hard to really say, but I've avoided overtly military applications (even things like BigDog, designed to carry equipment for troops).

Sometimes the distinction is even more insidious. I did work on perpetual flight for drones, and Facebook had a perpetual flight project that had the goal of bringing internet access to remote locations in Africa. Sounds humanitarian, but I also didn't want to be responsible for subjecting poor Africans to what I consider Facebook's panopticon. Or maybe it would have been a net boon for the region? It's really hard to tell a priori, so the best I have managed for myself is just to try and stay in theory, where developments are further removed from direct consequence.


As an engineer, founder, and investor I'll note that there's an ethical mandate for investors as well; when I started drone.vc I committed as a Quaker to not investing in "offensive" technologies.

Separately, RE: bringing internet to new places, there's reasonable evidence to show that introducing broadband lifts GDP[1], so whatever you may think of Facebook, giving people high-speed connectivity to the whole Internet is probably a net positive. Disclaimer: I helped start Facebook's Connectivity Lab but no longer work there.

[1] http://pubdocs.worldbank.org/en/391452529895999/WDR16-BP-Exp...


Please note that there's a sharp distinction between bringing the Internet to new places and bringing "the Internet" aka Facebook's walled garden to new places.

From a wikipedia section on Free Basics https://en.wikipedia.org/wiki/Free_Basics#User_experience_re...

> In 2015, researchers evaluating how Facebook Zero shapes information and communication technologies (ICT) usage in the developing world found that 11% of Indonesians who said they used Facebook also said they did not use the Internet. 65% of Nigerians, and 61% of Indonesians agree with the statement that "Facebook is the Internet" compared with only 5% in the US


Most people on this site have not trusted FB's intentions with the drone project, for good reason. Ever since we collectively realized three years ago that FB's internet.org was a wolf in sheep's clothing, most people on here have been distrustful of FB.


Had the exact same predicament. Struggled with it morally, until I was exclusively working on offensive technologies ("cleanly isolating and neutralizing targets via unmanned air vehicles, out of line of sight with no gps"). That's when I quit and got some of my morality back by joining a quant firm that makes money off of other people purely for the purpose of making money and nothing else.

No wonder I have such an alcohol problem...


Almost any technology can be weaponized. It's not about the particular technology you work on, but who it's being built for and the intended application.


please help me understand how a substance that turns sand into soil, developed to green the deserts, can be used as a weapon.

To me this argument that "anything can be used in evil ways" is a poor excuse used to distract from the issue.


Just because you can't see the destructive nature of a technology does not mean it doesn't have one. Sometimes it's a matter of perspective and context.

When gunpowder was invented, the purpose was medicinal; three centuries later we had cannons.

Many things in our world seem positive at the outset, but give it some time, start looking from different angles, and you're bound to find some potentially detrimental use.

Granted some inventions like nuclear energy and nitroglycerin, both intended for constructive purposes, have a more obvious destructive potential. An example that better connects with your question is a company such as Monsanto.



I don't think he is


Not OP, but a fertilizer is a reasonable embodiment of something that turns desert soil into something in which plants can grow


Turning sand into grass - this sounds like a great thing, doesn't it?

It's for golf courses - there are amazing golf courses in Dubai that required genetically modified Bermuda grass in a sandy, synthetic, fertilized substrate. They take huge leaf blowers and cover football fields' worth of sand with it, water the HELL out of it, and pretty soon, beautiful grass in the desert.

Grass in the desert is great for everyone that wants to golf or make money off golfers.

Getting grass where it isn't sustainable is not benevolent.


Not exactly what I was talking about. And yet I fail to see how a Golf Course is a weapon. Waste of resources, yes. Unsustainable, yes. But weapon?

Actually I was talking about this: http://www.naturalbuildingblog.com/sand-soil-7-hrs-biggest-s...

Regarding the original statement I was criticizing: Of course one can construct an argument of technology itself not having an ethical bias and the notion that everything can be used as a weapon. Sure, beat someone to death with flowers, go on!

The point I'm trying to make is that I consider this tactic a distraction. Nothing more than "Look! A three-headed monkey behind you!"


Depends on whether you see this 'greening the desert' as destructive to the environment or not I suppose.

And then perhaps there's a question of who owns this now greened and more usable desert?...


Reminds me of Frank Herbert's "Dune", where the greening of the desert was viewed as ecological destruction (and more), as it destroyed the habitat of the worms.


It somehow makes sand vulnerable to microorganisms? Great, but what would that do to cement bridges and dams?


I can't find a reference but there was a story about a drummer or bugler or something in an army, and when his side was defeated the opposing prince was about to kill him, and he said "No, I have done you no harm! I never bore weapons against you, I only played my instrument so my friends would be encouraged."

And the prince said "by aiding them you killed my men" and slew the man.


And then the prince also killed the blacksmiths, because they made swords, and the farmers, because their crops had fed the army, and the cobblers, because they made the boots that people in the army wore, and the lumberjacks, because their trees were used to construct siege engines, and the cattle herders, because their oxen had been used to pull the army's supply wagons, and the printers, because their printing presses had been used to encourage the army, and the weavers, because their cloth had clothed the army, and the tailors, because they had sewn clothes for the army, and the dye makers, because their dye had colored the flags and uniforms of the army, and the tanners, because their leather had been used to make the saddles for the cavalry, and the merchants, because their money and taxes had paid for the state to have the army.

And then the people in the next town all got together and said, "Let us all work together and see that this prince is never able to defeat our army."


Here's a thought: what if you could get the same substance to work on the sand in concrete? It assumes more research and work, but the argument has a kernel of truth. I'm not sure I agree with using that as a reason not to think about the ethics of what you do, but nevertheless.


To badly misquote Richelieu,

"Give me six ideas from the most ethical of engineers, and I'll find something in them to weaponize."

Any primary research can be used for any range of good or evil... the task is to curtail the creators of evil uses.


If that technology is controlled by a government, it could be denied to whomever they decide is "undesirable"...


You can't imagine any way terraforming sand dunes on a mass scale might do damage to nearby countries? It would be akin to diverting a river upstream, which was historically an act of war.


Ah man, as a former straight-leg grunt (meaning no plane, usually no helo or even a truck; we walk _everywhere_), anything that can take weight off is an absolute godsend. We aren't bloodthirsty, but you would be saving a ton of knees, backs, frustration, and pain all around.


Yeah, no doubt that tech will help you. But the line from a robot carrying your guns to a robot wielding your guns is a little too direct for me to work on personally. It's not that I think you are bloodthirsty. I think the people you work for are.


If all the engineers who have a sense of ethics refuse to work on military projects, it implies that military projects will be performed by people who are less ethical than average. That is frightening. IMHO, avoiding military projects on ethical grounds is counterproductive.


Richard Gatling thought his most famous invention would cause wars to be shorter and less brutal. More deaths faster would mean faster surrenders, he thought. He is a weapons inventor who thought he was doing something ethical.

Instead, he made war and death cheaper. He made it more likely. In his lifetime, machine guns were deployed asymmetrically, making it cheaper for colonial empires to kill large numbers of the colonized.


Alfred Nobel thought so too with dynamite.

"Perhaps my factories will put an end to war sooner than your congresses: on the day that two army corps can mutually annihilate each other in a second, all civilised nations will surely recoil with horror and disband their troops."


Isn't that what is kinda happening with nuclear weapons though? I would argue we have less armed conflict nowadays exactly for the reasoning in that quote. Dynamite did not really allow for two army corps to mutually annihilate each other in a second, nuclear weapons actually do.


Nuclear weapons didn't stop wars, and they didn't stop major powers from fighting each other. It just moved to proxy wars. Proxy wars have been common for a long time. Maybe they've stopped world wars, but maybe the two World Wars were exceptions and not a guaranteed feature. I don't know how to quantify whether there are greater or fewer armed conflicts between states since the invention of nuclear weapons.


It appears this is a repeating trope: these inventors of highly effective killing machines did not consider that the increased rate of killing would be matched by an increase in mobilizing cannon fodder.

Some genius in the near future will probably make the same inaccurate prediction about autonomous killer bots, or, more realistically, cyber-warfare (I hate that term): "Shutting down hospitals, power plants and other critical infrastructure will make wars shorter."


This is essentially the argument for targeting the Iranian nuclear program with Stuxnet.


"Hey, somebody's gotta help kill all these people, might as well be me--wouldn't want it to be a bad guy!"


You can talk yourself into working on anything at all with this rationalization.


Yes, even if you are completely against something. For example, if you are against GMO and work on this subject, you may give valuable insights (outside of NDA) to other GMO opponents to help their fight to stop GMO. Of course, you risk your job.


You are not going to change the military from the inside any more than Snowden could change the NSA from the inside.

Thinking you can is naive at best.


Snowden was inside, and his actions had an impact on the NSA. I hope these kinds of organisations will understand that if they want to avoid similar whistleblowers, they need better management that listens to their "halflings".


Snowden didn't change anything from the inside. He tried and failed before realising a public-interest leak was the only way.


Damned if we do and damned if we don't?


Perhaps we need a license that excludes use of a technology for certain purposes (?)


Like the patent licensing terms for using AES-OCB?


>"One man was responsible for helping save tens of thousands of lives. Another man helped kill millions."

In the case of Fritz Haber, it's the same man who saved billions from hunger and killed millions by prolonging World War I. Excellent article on the same - https://medium.com/the-mission/the-tragedy-of-fritz-haber-th...

Sometimes the decision isn't as clear as black and white, especially when you mix nationalism and patriotism into it.


That's totally different. He chose to kill the people and did it intentionally out of hate / pride for his country. This thread and the posted link are discussing whether something is immoral or not. He knew it would kill a ton of people and designed it to on purpose. That situation is totally black and white.

Mass intentional killing is wrong, pretty much everyone agrees with that.


> Mass intentional killing is wrong, pretty much everyone agrees with that

Nope.

Bomber Command in WWII and the use of nuclear weapons in WWII are both examples where you will find nuanced, intelligent debate on both sides of the argument over whether "mass, intentional killing" is wrong or not. You describe "hate / pride for his country" ... but isn't that the reason most soldiers kill? That you even bother to call this out shows that you know the reasons for killing people need to be taken into account.

I'm not making a statement on whether or not mass killing is wrong here, simply pointing out what seems obvious: it's far from black and white.


"intelligent debate on both sides of the argument of whether "mass, intentional killing" being wrong or not"

I'm sorry, but this sentence already contains the problem.


> Mass intentional killing is wrong, pretty much everyone agrees with that.

In a shooting war, both sides have already accepted that killing people is morally justified.

And now you're in the unenviable position of arguing about how much death is too much, and what are the proper ways to kill people in a war setting.

And then you get asked moral dilemmas like whether killing 100,000 people is okay if it saves 1,000,000 people from dying later. This is the Hiroshima/Nagasaki question in a nutshell.

I don't know that answer, but then I didn't fight a brutal war for four years, watching my friends, family, and countrymen die at the hands of a brutal regime.


That was a thoroughly depressing read, but a most excellent one as such. Thanks.


Why would the engineers be more burdened with the ethics of their creation than the investors, founders, managers, lawyers, execs, CEOs, and government policy makers who all contributed to it, promoted it, and made its fabrication by engineers happen?


For the scope of this audience, I don't think it's necessarily a question of who's more responsible than the other. You're right, all the parties you've mentioned are just as culpable for their role in wrong-doing, even if it's abstracted or removed from implementation. As engineers though, we're in the unique position of being the gate-keeper between ideas, "visions" if you will (in the parlance of our times), and fruition. We're a gatekeeper between reality and the delusional musings afforded to people who aren't the engineers. We can say, "Fuck no, that should literally not exist or be built in any way", and do it loudly to stop the other parties from realizing their abusive vision. Or under dire circumstances, we can sabotage systems that need to be broken if it's too late to stop them from being created.


> investors, founders, managements, lawyers, execs, CEOs and government policy makers

Power, corruption and lies.

Those people often stand to personally (financially) benefit from unethical business decisions. Engineers have to think through all the dirty consequences, and still draw pretty much the same salary.


They aren't more burdened, but they are the last line of defence.


Ah.. the causality.. the argument can be made that without engineers the creation would never happen, (even if others are still trying to make money without ethics..).


To me, the salient point of that analogy is that the engineer designed execution chambers for Nazis. The engineers who designed the Sherman tank, the Garand rifle, or the Spitfire fighter plane also built weapons, weapons that killed people, but the judgment of history is that those people needed to be killed and thank God for the engineers who made that possible.

So, there are two aspects of that: first, do you trust the people you're designing a weapon for to use it for justifiable purposes, and second, how does the weapon itself influence whether it can be used for just or unjust reasons? Gas chambers, for example, can only be used to kill someone whom you are already able to coerce into entering it; its only use is to murder a helpless prisoner. A Spitfire or Hurricane can shoot down bombers but not necessarily do that much damage on the ground. Orwell had an interesting perspective on this question: "Thus, for example, tanks, battleships and bombing planes are inherently tyrannical weapons, while rifles, muskets, long-bows, and hand-grenades are inherently democratic weapons. A complex weapon makes the strong stronger, while a simple weapon — so long as there is no answer to it — gives claws to the weak."

I really don't have answers. The stance of, "weapons equals bad, therefore it's wrong to ever help design weapons" is a pacifist notion, and maybe that appeals to you, but if you're not a pacifist, it becomes a complicated question with certain political ramifications.


The judgement of history is always that those people needed to be killed, because winners write the history.


Not always, but I figured WWII would be a relatively uncontroversial example.


>incredibly valid for Silicon Valley engineers these days

If one could engineer a solution to save lives, but instead uses his/her talents to build the nth iteration of, say, a food delivery app, does he/she bear any ethical culpability?

Because it seems the bigger challenge for engineers is not avoiding overt evil, but in the opportunity cost of spending their time on work of significantly lesser consequence than their talents might allow.


I think you're taking the idea too far. Look at what the civil engineers have:

https://www.asce.org/code-of-ethics/

Yes, yes, we have a responsibility to not defraud people, and to not work on obviously evil projects, and even to speak out if your employer is, say, building a bridge that is not up to the clearly defined, established standards. (which is super different from when you think a bridge will collapse, but can't explain why in a way that other people can understand. This parenthetical part is particularly important, I think.)

But we are free humans, and if we want to waste our lives on trivial things, so long as those trivial things don't hurt others, that's our choice. We are humans, and humans need entertainment, we need play.


>look at what the civil engineers have

Well, that's a narrow view of ethical guidelines in conducting one's work. We all generally agree that people shouldn't be evil.

My question sits above that. Is it enough for a moral society to simply decline to do evil? Or do we have a responsibility beyond that? And if it's the latter, then whose responsibility is it?


Declining to do evil, with real standards for what that means is, at the very least, a good first step, and a possible one.


Engineers, like most human beings, are free to waste their talent. It wouldn't be right to demand some kind of moral responsibility from engineers just because they can do more. How they live is up to them, and a free-market solution would offer far better terms under which they could be convinced to do more.


Yes, we're all free to choose, which might include doing things that are overtly immoral (as in the parent discussion).

My question was whether it's just as immoral a choice to not use one's potential to help people as it is to use one's talent to overtly harm people.


>My question was whether it's just as immoral a choice to not use one's potential to help people as it is to use one's talent to overtly harm people.

Is the answer to that not obvious? It's the difference between negligence and pre-meditation/mens rea.


The suggestion that it might be obvious is central to the question and what makes it interesting (to me).

Because, yes, it would seem obvious.

>It's the difference between negligence and pre-meditation/mens rea.

a) you're using legal definitions when the question is about morality and b) it's not negligence to actively choose one path over another. It's simply making a decision, and I was asking if that decision can be considered a moral one.

Started off as a philosophical question, but seems to have gone a bit literal. Oh well, it happens.


>a) you're using legal definitions when the question is about morality

You don't think the legal differences between those two things were based on perceived immorality?


So you're saying we need another messaging app?


Just a reminder: you are writing this over the internet - that one was funded by the Pentagon too (DARPAnet).

http://ccr.sigcomm.org/archive/1995/jan95/ccr-9501-clark.pdf Here they say that the design of the internet was very much determined by the military priorities of the project (telcos had different priorities, so they came up with OSI, where the link layer is supposed to be reliable, as opposed to DARPANET/IP, where the link layer/IP routing is dumb and the endpoint is smart; this allowed the link layer to be pluggable and IP routers to be cheap).
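As a rough illustration only (not from the linked paper; the function names and loss rate are made up): the "dumb network, smart endpoint" idea is roughly that reliability lives at the ends, e.g. an endpoint retransmits over a lossy datagram layer instead of assuming the link itself is lossless. A minimal Python sketch:

    import random

    def lossy_datagram_send(packet):
        """Stand-in for a dumb, unreliable link/IP layer: drops ~30% of packets."""
        return random.random() > 0.3   # True means the peer acknowledged receipt

    def endpoint_reliable_send(packet, max_retries=10):
        """Endpoint-side reliability (the DARPANET/IP approach): retry until acked."""
        for attempt in range(1, max_retries + 1):
            if lossy_datagram_send(packet):
                return attempt          # number of transmissions it took
        raise TimeoutError("peer unreachable after %d tries" % max_retries)

In the OSI-style alternative described above, that retry loop would instead live inside the link layer itself, which is part of why those links were more expensive.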


There is a huge difference between creating a network protocol and designing autonomous killer drones.


I think an even more apt example is that the US military was the prime funding for the first integrated circuits.


Self-driving vehicles were originally funded by DARPA with the idea of using them in combat. Now self-driving vehicles are our best hope to significantly reduce road deaths. And that will create a deficit in organ donors. Ethics are complicated.


Well, self-driving cars will/might make well over a third of the workforce redundant (my wife's brother is a driver, which makes it personal for me). I think this has the potential to create major tensions within society. My point is that innovations have the potential to be applied in different contexts. So what?


You're absolutely right! There's a huge difference. One is a general solution to the general problem of how to move data around in an organized fashion and the other is a specific solution to the specific problem of wanting murderbots.

That said, the approach of some of the commentors here is that the only acceptable approach is to refuse to work on anything that could ever be "weaponized". It's possible that the distinction you've so clearly and correctly pointed to could stand to be more compatible with this stance.

Autonomous murderbots are highly likely to depend on network protocols, among other things.


Really? The killer drones couldn't communicate without said network protocols.


The point being that "killer drones" are immoral by their very purpose. You can hardly find a humanitarian use for them, while network protocols are pretty much amoral. You can use them for bad, but also for good.

Please excuse my manichaeism.


One could say the same of atomic weapons, yet it's likely that nuclear weapons have saved more lives than we can even imagine. The impact of war was accelerating in raw numbers, and increasing in relative numbers, all the way up to WW2. 3-4% of the world's population was killed in WW2. In today's terms that would be ~270 million people. Think about that number for a minute. Let's put it in scales of 9/11s. During 9/11, 2,996 people were killed. Our scaled WW2 had 90,120 times more deaths. That number is so absurdly large it's difficult to grasp. So put another way, that's a 9/11-scale event every single day for 246 years. The United States, as a nation, was founded 242 years ago, to give that number some scale. Just think about that! If each generation has children at 30 years old, then another way of seeing that number would be a 9/11 event each and every day from today until 2 years after your great great great great great great great grandchild is born.
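A quick back-of-the-envelope check of those figures (assuming a 2018 world population of roughly 7.6 billion, which is my approximation; the 3-4% and 2,996 figures are from the paragraph above):

    # Sanity check of the scaling above (assumed: ~7.6 billion people in 2018).
    world_population = 7.6e9
    ww2_death_rate = 0.0355          # midpoint of the 3-4% range cited above
    deaths_911 = 2996                # 9/11 death toll cited above

    scaled_ww2_deaths = world_population * ww2_death_rate    # ~270 million
    ratio_to_911 = scaled_ww2_deaths / deaths_911             # ~90,000 9/11-scale events
    years_of_daily_911 = ratio_to_911 / 365.25                # ~247 years at one per day

    print(round(scaled_ww2_deaths), round(ratio_to_911), round(years_of_daily_911))

which lands close to the ~270 million deaths, ~90,000 events, and ~246 years quoted above.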

But then came the nuke. Nuclear weapons make traditional war between developed and nuclear capable nations basically impossible. Think about what the Cold War really was, or even what the geopolitical 'disagreements' of today are. Those are World Wars 3 and 4, averted because these wars would be unwinnable by any side. Like Einstein said, 'I'm not sure what World War 3 will be fought with, but the 4th will be fought with sticks and stones!'

Did the people developing nukes understand that they would finally create a weapon so immensely powerful that it would deter open, unrestrained war for decades to come? It's possible, but I think their motivation was something more straightforward - gaining military superiority in the present, probably also mixed with a bit of scientific curiosity about the challenge of creating a nuclear bomb.

It's difficult to predict the future. The most awful and awe-full weapon created in the history of mankind ended up creating the most unprecedented period of peace for mankind as well. This is also why I vehemently oppose nuclear disarmament. That's how you get WW3 if people actually disarm, though in reality it would more likely simply result in nations obfuscating their nuclear weapon programs and facilities. It's just a molten salt thorium reactor, guys, come on - breeders are awesome!


Peace for the west != peace for mankind


I don't think you understand the scale of these wars between developed nations. You could kill every single person in Iraq, Syria, Afghanistan, etc., and it wouldn't even begin to compare to the relative death toll that WW2 inflicted. And to be clear, it is not because these countries are not heavily populated. Iraq, for instance, would be about the 7th largest country in Europe, nearly 4 times the size of Sweden.

The loss of life when developed nations fight against each other is literally inconceivable in today's times which is why I went out of my way to try to give measurements that can help you grasp it. The entirety of the world's conflicts today is completely and absolutely negligible compared to WW2. And I'm focusing on WW2. The major point here is the WW3 that we have avoided. The reason for that avoidance is almost entirely because of nuclear weapons.

Finally, it's not just peace for the west. There is unprecedented peace throughout the vast majority of the world. The entirety of Africa and the Mideast account for less than 20% of the world's population.


I did not say the death toll would compare. I'm not arguing for or against nukes. I'm only saying that the world today has not achieved peace for mankind.

The school my friend went to as a child no longer exists because it was bombed (Syria)

Wars are still happening today, just not with nukes. Every nation with nukes knows that it would be suicide to launch one.


I don't think minority populations are impressed by how wonderfully other populations are dying less?


Technology is often dual use - one man's killer drone is another man's lifesaver. A similar device could be used to get injured people out of harm's way, or to deliver aid/medicine/whatever.


Accurately dropping a flotation device onto drowning swimmers for instance: https://www.reuters.com/article/us-australia-drone-rescue/dr...


Would you deny "killer drones" to the armies and civilians fighting the Nazi Wehrmacht?


I don't see droning ISIS terrorists as immoral, sorry.


Reliable and confidential communication is something that decides the outcome of wars. Coordination of forces is one of the most important factors in military operations.


Why? It was created so we could have reliable communications in the event of a total nuclear war - that is, as an enabler.


Nobody said "never make anything for DARPA". The point is just that you have to think through the consequences of your work. Making a network protocol for DARPA is different from making a casualty maximization algorithm for DARPA.


On the networking side of things: interesting that RoCE, InfiniBand, and FC (depending on service class) all have reliability in the link layer, so the telcos did know something about their priorities.


I do not think that this is a simple black and white separation, but rather a dialectical one.

I used to find myself confused most of the time while using technologies whose CEOs and missions I do not agree with. It felt as if I was contradicting myself. But then I realized the dialectical nature of this relationship: if you want to create something better, to improve something, it is only natural that you will do so using the existing tools available in your world.

A great example is Facebook. There are things you might not agree with, you might even hate it, but if you look closer you will see everything from volunteer groups to political campaigns to marches and protests that are organized only via Facebook. One such political activity could one day change Facebook, or the world we live in, drastically.

You can apply this to other cases. You might have an anti-capitalist philosophy podcast hosted on Amazon or Spotify or Apple which in essence contradicts the very existence of such companies. But in the end you are using them to deliver your message, cause change etc.

I am curious about what others think about this and how they feel about looking at it from a dialectical point of view as I am continuously thinking about this and still forming my thoughts.


Whether military AI ends up being closer to a heart valve or a gas chamber, it's going to be built first by some nation.

But there is space for ethical contemplation during an arms race.


Absolutely. As a consequentialist first move, let me suggest that a first-order principle for the US military is "Sweat now or bleed later." Better to work your ass off to develop superior tactics, discipline, and technology now than to die later.

No one hates war more than soldiers.


If soldiers hate war so much why do they join up?


The best way out of a terrible life for many poor people.

If you didn't do great in school and don't have rich parents to pay for college, the military in the US will look great with its education perks and chance to "see the world".


They believe in their country, what it represents, and are willing to sacrifice their life for it. In the US, it's the Constitution, to which they make an oath to upon joining.

Or they're broke, have their back against the wall, and can get good training and education through the military.


Possibly because the one thing worse than war is being invaded; history has shown that.


Predatory sign-up practices in low-income neighbourhoods, and a systemic and relentless propaganda campaign starting in nursery school that instils jingoist rhetoric and military worship.


To play devil's advocate, we (the USA) have a volunteer military. Even if you think it's a "poverty draft" driven by structural unemployment and a dearth of opportunities in left-behind cities and rural areas, it's still not conscription.

How should the military fill its ranks? Would you rather have a lottery?


Same reason most people work on jobs they hate. Gotta work for money to live.


> And it won't be as easy as deciding between a heart valve and a gas chamber

That's the truth.

Did the engineers of the atomic bomb save millions of lives and usher in an era of unprecedented peace? Or did they slaughter millions of innocents and sow the seeds of humanity's destruction?

It's been 70 years; we've had plenty of time to make up our minds.

Many inventions are not so easily classified as innately good or evil.


From the Wikipedia article on Robert Kearns https://en.m.wikipedia.org/wiki/Robert_Kearns

>Kearns was a member of the Office of Strategic Services, the forerunner of the U.S. CIA, during World War II.[1][2][3]

Somehow I don't think that Kearns would have had an issue with writing software to analyze video footage for the military.


I don't think you can ever blame the engineer who designs the weapon, unless it is obviously ethically problematic - like a better iron maiden (as an example; I know it's probably a hoax device).

Merely offering a force projecting capability is not evil.

If you look at world history, the only concrete definition of who has the most political power is this: it's the party with the most capability for projecting violence in a given area.

Small, close-knit groups that live outside of large populations can live independently and do without military force. Once you have a large population with a more or less fixed hearth and home, the party with the most violence-projection capability owns them politically.

This dynamic played out on an almost ridiculously minuscule scale in the Peloponnesian Wars of the Greek city-states of antiquity.

The pathological manifestation of this principle can be viewed, say, in the rise of ISIS and the Somali warlords.

While a modern state is seldom a benevolent actor, at least in the western countries it's the best of known alternatives.

In this framework violence is a key tool of the state, just like good governance, tax collection, etc.

Given that violence is a necessity, in my opinion designing military instruments is not in itself evil. It's not as good as designing new vaccines, so there is some ethical scale in the matter, but I'm damned if I can put it in concrete terms.


The reason Rome was so influential was its military. Ditto for the US.

Are you sure it's a good idea to become less influential on the world stage?


To me, that's the question. And people don't realize a lot of who we are and where we are is because of our military presence. Atomic bomb, anyone?

What gets me now is how much of the US economy and focus is on death and violence. Offensive death and violence. We're bombing people all over the Middle East, and have been for years. Tangentially, I find "March for Our Lives" ironic given the amount of, to me, unjustified violence and death the US exports to children and families around the world.

For self-defense only? In an ideal world, yeah, invent the meanest things possible. That philosophy is still compatible with a strong military. But then again, you have the Gulf of Tonkin, etc., so who knows!?

Lately I've been telling myself this whole wargame is a game whose actual workings I'll never be privy to.


Is having killing machines with no human in the loop a good idea? Because that's the way this is going. We don't need Skynet to have a dystopian future.


Is it OK to not have them and allow an adversary who does develop them to overwhelm your nation's outdated defenses and less efficient tactics, as Germany did with its tank blitzkriegs and diversion around the Maginot Line?


The problem emerges when an actor must become a monster to win, sort of like Mutually Assured Destruction. I think that sci-fi shorts have the duty and the power to show us the paths that certain developments entail. For example:

- Slaughterbots: https://www.youtube.com/watch?v=HipTO_7mUOw

- Last Day of War: https://www.youtube.com/watch?v=IjJmTeBSEzU


The reason the US is so influential is that it has the world's most powerful economy and issues the world's reserve currency. Having a big military is a consequence of that - if you can afford a big military, you get one. And, also, if you have a big economy, you may need some ways to defend it from all kinds of threats - before they become so serious that they can hurt your economy.


> The reason Rome was so influential was its military. Ditto for the US.

I think people often have that twisted. A lot of the U.S.'s expansion is cultural, not military. That is to say: people liked what they were doing, and became a fan. Think: music, movies, and to some extent, literature, and now you might add software and consumer electronics to that list.

And even when looking at Rome, can you imagine how much harder those repeated annexations would have been if the Roman culture wasn't respected?

I'd say that being capable at least of defending oneself, independently, from most individual powers and believable coalitions, creates a certain amount of stability.


Who can project force like the U.S. military? No one. For the most part it's used to enable people to have more say in how they're governed, modulo certain constraints. That's not to say we do it perfectly but we shouldn't underestimate how positively influential that military power is.


In the USSR, before it collapsed, young people dreamt of having jeans (not everybody could get or afford them), listened to Western music, and watched Western movies. Not because the US Army had secretly occupied the USSR, but because Western culture was more attractive to them. Not everything is done with the military.


>Not everything is done with the military.

I'm not claiming it is, nor am I arguing that culture isn't also a factor. I'm saying that military presence shouldn't be discounted, as the parent seemed to be doing.


Agreed. And it's a position that I don't think many people realize.


I think you actually have the relationships reversed. Military power paves the way for projecting culture, not the other way around.

Realpolitik is meaningful; and everyone on the left seems to have forgotten about it (I say this as an unabashed progressive liberal).

The US' prime advantage in the 20th century was the fact that basically the rest of the industrialized world was completely destroyed twice, and then they paid the United States to help them rebuild. It wasn't until the 70s that the US' reliance on oil became heavy enough to exploit strategically, but by then we had built an entire post-industrial economy a decade ahead of Europe and two decades ahead of the rest of the world. We helped Japan do the same because they were basically a puppet state and gave us a foothold in Asia to serve as a buffer with Russia.

Does that sound like cultural expansion driving the military, or the military goals driving the cultural expansion?


Yeah, but how much cultural influence comes back to the military? The Internet? DARPA. Hollywood? The CIA pitches/vets certain scripts. Modern art? A CIA weapon during the Cold War.

The list goes on and on.


American music and movies, until recently, did not originate in any large part from CIA psy-ops. Popular software products like the ones produced by Microsoft, Apple, Google, etc. don't really have much to do with the defense industry, at least beyond the military being the first major customer for computers.

Defense spending is a small parasite on an enormous private market for cultural and consumer products, of which the U.S. is and was a major exporter, respectively.

Added:

Defense agencies drive innovation because they have external constraints which force them to try to procure things which haven't yet been invented. It's similar to the way that automobile races are used to drive auto innovation. Necessity is the mother of invention, and necessity is not unique to defense, but defense necessities tend to produce extreme engineering efforts.


The engineers in both cases are not essential to the outcomes. Engineers, craftsmen, or any other workers execute tasks put to them by the society that gave them birth and education. Any person in the society has some influence, but clearly more influence is in the hands of the leaders and organizers.

But, there is power in organizing, or unionizing, for collective bargaining.


This reminds me of the movie "Cube", which is about a sadistic torture chamber built by accident through a network of government sub-contractors who each built some innocuous part of the thing, which was then assembled through layers of opaque bureaucracy. It's a terrible film, but I always thought that premise was hilarious.

https://en.wikipedia.org/wiki/Cube_(film)


That defense didn’t work at Nuremberg.


It's not a defense. It doesn't absolve anyone of personal responsibility.


> Who designed the gas chambers at Auschwitz? An engineer did that, too.

Well, given how unimaginative a leap the National Socialists took from the national fumigation program to the gas chambers, I doubt there was much more to it than "take bug-poison room tech, use it on humans".

There's no such thing as an "engineer" in the real world, and there's almost always somebody more unscrupulous, devout, or patriotic than you who knows or is willing to learn what it takes to finish the job, and school is not going to get in the way.

Added: Engineering is a practice, not an immutable identity. Even if you publicly shame all the "engineers" away from contributing to an effort you disagree with, replacing them is a matter of learning enough of the practice to complete the task. Even if we were just talking about what is in paper books today, it's hard to say that anyone could have control over who is or isn't capable of engineering efforts.


> ... there's almost always somebody more unscrupulous, devout, or patriotic than you who knows or is willing to learn what it takes to finish the job ...

If you believe that doing something is wrong, the idea that someone more unscrupulous will do it if you do not must not be used as an excuse for doing that thing yourself. Giving in to this notion will prove you to be the one who is unscrupulous.


> If you believe that doing something is wrong, the idea that someone more unscrupulous will do it if you do not must not be used as an excuse for doing that thing yourself.

By involving yourself, you gain the opportunity to do the right thing (given the circumstances). If quitting accomplishes the same thing as sabotage or internal lobbying, then by all means.


In 2018, the following statements are true:

1. If you're on an active battlefield, both sides have already agreed that killing people is morally justified.

2. If the enemy is shooting at you, he is not interested in the ethics of killing someone on a battlefield.

3. If ethics helped wars end, there would be a US Army 27th Ethics Brigade to parachute in, but there's not and probably never will be.

If on a battlefield, given a choice between holding scrupulous ideals and living, most people would choose living.


What's the point of life if you are forced to be evil, unquestionably and irrevocably? Why do so many vets have PTSD and end up doing crazy things?


You have it backwards.

If someone wasn't willing to kill someone else on your country's behalf, most likely you wouldn't have the freedom to ask that question.

Edited to add:

The metaphysical question you posted might be better phrased to ask:

What's the point of life if someone else is willing to extinguish it?


I completely support the employees, but it is criminal that this NYT article does not mention Silicon Valley's extensive roots in Pentagon funding. SV as a whole is substantially a creation of military spending.

Just recently in the AI space:

- Siri was spun out of a Pentagon project -- look up SRI International and CALO. Its purpose: a "soldier’s servant".[1]

- Autonomous driving is a direct evolution of Pentagon-funded efforts -- see DARPA Grand Challenge.

And it's not just funding the early research, it's procurement like this too. Military procurement has also supported the development of technologies when the commercial market couldn't.

Again, I support the employees and hate the fact that in order to develop medical lasers we first have to figure out how to shoot down missiles with them. It's hugely inefficient, could spell our doom, and, if you think about it, is fundamentally undemocratic. (It gives elites more power to direct taxpayer dollars under the rubric of defense.) But this should be a basic part of any story on how Silicon Valley works.

And to think that SV has a large population of supposedly "small government" Libertarian Capitalists... oh, the mental gymnastics in that.

[1] https://en.wikipedia.org/wiki/CALO


I can really recommend Waldrop's The Dream Machine [1]. It is a pretty detailed description of the different driving forces that led to the Internet, personal computers, AI research, etc. Yes, a lot was influenced by the US military and DARPA, starting with radar systems and the need for computation in WW2 and continuing with the Cold War, the Sputnik shock, and increased research spending afterwards. On the other hand, a lot of the folks involved certainly had no militaristic attitude or intentions. A lot of government funding early on sparked an amazing development - the foundations of the things we work with today.

[1] https://www.amazon.com/Dream-Machine-Licklider-Revolution-Co...


> SV as a whole is substantially a creation of military spending.

This is a national budget problem. If you're in the tech world long enough, it becomes pretty clear that the only way to get Big funding is through military affiliation.

This puts a huge selection bias on what kinds of technology projects actually get big funding, and further, it prevents the benefits of those projects from reaching the community for years, because the military overlords demand secrecy and sole use of the technology until it gets superseded.

We need to cut a huge chunk out of the military budget and give it directly to the tech sector, so that big innovative projects are actually possible without having to be military.


>This is a national budget problem

Look up "The Secret History of Silicon Valley". There was no tech world in the bay, and massive funding for radar research post-WW2 bootstrapped what is now Silicon Valley.

It's not a national budget problem, it's just the history of why things happened in SV.


Doesn't the recent (last 10 yrs) VC splash do that? Hard to tell the numbers since military spending has so many routes, but I'd love to see numbers on the two channels.


No, they are not parallel channels. Generally speaking, high tech develops over a multi-decade timeframe. Govt agencies like DARPA play a lead role in the earliest phase (often measured in double-digit years). VCs pick the outputs that have commercial potential and pour money into the sector.

That’s why it’s wrong to just compare the absolute amounts invested — it’s when it’s invested.


It starts well before Silicon Valley; engineering departments get large portions of their revenues through defense programs.

MIT's Lincoln Laboratory alone brought in roughly a billion dollars in 2016, about 27% of MIT's revenue. That's one research center.

http://web.mit.edu/facts/financial.html


"Engineering" as a concept itself has roots in military application. The term "civil engineer" was coined precisely to distinguish the application of engineering techniques outside of the native context of military works.


Yea, Leonardo da Vinci was a military engineer at one point.


I bet that number gets a lot bigger if you're willing to consider donations from the defense industry as well. I seem to remember Raytheon and Lockheed Martin's logos displayed on the donors wall in the Huang Engineering Building at Stanford.


Steve Blank has a great presentation on the "Secret History of Silicon Valley" and the military roots: https://www.youtube.com/watch?v=8uA2bLrl_9Q


The Internet was a creation of DARPA. And that D stands for Defense.

I believe the silicon research and production that started Silicon Valley was for military needs. The military paid for it.

The first computers of the US and Britain? They were funded and created during the war for military purposes.

And what about China and Russia? Will THEY stop AI research just because Google employees demand it?


Neither Ford nor the Wright Brothers were military. Xerox PARC wasn't. Tesla isn't.

Innovation doesn't need defense support; innovation will come anyway. We just choose to allocate much of our resources to the military, so that's where it gets spent. Later we can claim it as a win for defense. A happy surprise benefit of trying to kill each other.

All you need is people, time and money. Profit.

Many years ago I chose not to work on a local defense project. I found out that the project had ties to Chinese defense. I'm fine with my choice because I find their leadership somewhat oppressive.

Another time I saw Iranians on the campus. Another awesome opportunity for interesting work that some might not want to get involved in.

I know someone who did work for Mugabe. Hey, cool project. It's easy when you justify it.


To your knowledge.

Yes, you need time, money, and profit. Agreed. But if those things go against the military's goals, or you're unable to convince them otherwise, you're in trouble. At a certain scale you need to have the nod of those in power. It's a fact of reality on this earth. I'm not saying it's good or bad.

And I don't know how millions being lifted out of poverty, 10,000 miles of high-speed rail in ten years, and so on, in China is necessarily oppressive. Or how the sheer fact of someone being Iranian makes a project bad. I'm getting a lot of black-and-white vibes.


> to your knowledge

Well, yeah, but good luck proving PARC was a secret defense contractor. We can cherry-pick examples of tech or benefits all day. A few good defense ones don't mean defense is good.

You can have an oppressive government and still have a lot of good come out of it. China is amazing. Their people still aren't free. In fact you can have a murderous military and still have benefits to humanity emerge. The military is still there to kill people and I'd rather not be part of that.

Projects were chosen as examples of countries that are American rivals, for effect and the benefit of US readers. Fun, well-funded Chinese research projects might one day kill you. I would be just as unhappy working on a defense project for any country, including my own. The point was that someone on the other side might have the same justifications.

We'd all be better off if the world sent more defense money to other parts of our lives. It's not the defense aspect that makes the projects good or possible, it's that defense has our money. Send the money directly to tech or research, same benefits would emerge.

We can't change our countries' budgets, but we can stop idolising the military as a funder. Sure, DARPA - but they hooked up a few universities. Pics of cats and porn (probably not military, but good luck proving otherwise) took us the rest of the way.

> Black and white vibes

Play the ball not the man


While you are certainly correct that the military has provided the genesis of various valuable technologies, it doesn't have to continue to be that way. We could choose to expend our limited resources on more meaningful, and socially responsible, outlets. This move by Google employees is to me a welcome tug against the hopelessly chaotic military machine, and a push back against continued military involvement in SV affairs. I'm happy to see individuals taking this position.


I'd give such a political movement a 75% chance of being able to disrupt research funding from DARPA and a 5% chance of being able to replace it.


I think that's a great idea. And I support it. But something inside me says it's idealistic and not in touch with a baser human nature of something akin to a 'dog-eat-dog' mentality.


Just because the United States has decided in the past that the only way we are allowed to fund tech R&D is through the military budget doesn't mean that it is the only way it could possibly be done.


No one said it's the only way, but it also doesn't mean military funding should be shunned. As long as it's handled responsibly and has well defined limits.


But the problem is that it cannot be handled responsibly and within well-defined limits.


But it has many times in the past.


The internet itself is an evolution of DARPAnet.


And that means that I have to make anything that DARPA asks for from now until the end of time?


I mean, if you want to go back far enough, many major results in optimization theory and operations research started in one form or another as a project to aid with military logistics. Now those are fundamental tools inside and outside SV.


Nazis gave us Volkswagen. I don't have to support Nazis to own a Porsche in 2018.


Because no part of the product was developed during that time. Just the roots of the parent company are from that time. For SV it's different: the products themselves were funded by the military.


Agreed. The NYT is drivel - table scraps that us plebes get, compared to the actual game being played.

I've called SV the military's generalized R&D department.


Libertarianism is an ideology that relies on mental gymnastics.


Every ideology oversimplifies the world. Economics are complex and humans can't reason about emergent behavior from societies of other humans very accurately.

However, if you think an ideology has logical errors and requires mental gymnastics, you likely don't understand it well enough to intelligently criticize it.


Please don't take HN threads on generic ideological tangents. It leads to generic flamewars which are basically all the same and therefore off topic here.

https://news.ycombinator.com/newsguidelines.html


ORACLE was a CIA contract, and you can imagine that what they did with it would probably make your stomach turn if you knew every detail.


> SV as a whole is substantially a creation of military spending.

That's true, but not really in the general way you argue. Military spending on bay area R&D has been more or less insignificant for the entirety of the internet era. It exists, sure, but it's not driving meaningful revenue for any of the big players. Silicon Valley since the mid 90's has been a consumer thing only.

And even in the genesis of the valley, it was just one serendipitous application (missile guidance systems, which were willing to pay huge sums for early transistors which were literally 100x lighter than vacuum tubes).


> Military spending on bay area R&D has been more or less insignificant for the entirety of the internet era.

This is ridiculously false. SV receives billions of dollars from the Pentagon, the CIA, and a variety of other government agencies annually. Here is just one example:

https://www.bloomberg.com/news/articles/2018-03-09/peter-thi...


And that's just direct grants. The bigger role is that DARPA nurtures core tech in its earliest stages, when commercial viability is still a decade or more off. The "big players in SV" are picking the fruit and bringing it to market, often for nominal technology-transfer fees, if any. It's like passing the baton from taxpayers to VCs -- but guess who keeps most of the profit.


You're having a Dr. Evil moment. "Billions of dollars" doesn't go very far in this context. Apple Computer's revenue alone was two orders of magnitude higher than that. Military spending is noise relative to consumer revenue in the valley.


Sorry, you just aren't informed on the matter. Not only do the military, the CIA, and other agencies pour billions in R&D funding into SV, the CIA has run a hedge fund for almost 20 years that has significant ownership stakes in hundreds of SV companies.

Here is a list of 219 tech companies owned in whole or part by the CIA hedge fund (just the ones they let us know about):

https://www.iqt.org/portfolio/

Here is their logo, which speaks for itself:

https://www.iqt.org/wp-content/uploads/2013/08/About-venn-di...

The CIA hedge fund and billions in R&D money are in addition to an unknown portion of the "black budget" controlled by various spy agencies, which is over $80 billion this year.

https://www.washingtontimes.com/news/2018/feb/28/trump-admin...

The truth is that the tech industry is brimming with money from the military and spy agencies - and this is just what we know about.


Agreed. Thanks for sharing links. There's also an assumption that military necessarily equals evil. Do you think everyone who works for the NSA is out to curb our civil rights? It's not possible. In fact, protecting those rights is what a lot of people who work for the government or military believe they are doing.


There's a debate to be had over the "good" or "evil" that results from government/intelligence/military involvement in tech companies. In order to have a reasoned debate, people have to be aware of the facts. As evidenced in this thread (and elsewhere), many people are unaware of just how entwined our sprawling government and taxpayer-funded quasi-government agencies are with "private" tech companies and others in SV.


I agree with you.

Except I have found this thread to be very novel, thought provoking, and seemingly well informed.


Palantir is just a minuscule part of the SV ecosystem, entirely insignificant in the overall picture.


>And to think that SV has a large population of supposedly "small government" Libertarian Capitalists

Not only that, but when Youtube and Google block content, censor people, or otherwise prevent free speech, the common rejoinder is "they are private companies, they don't have to abide by the 1st amendment". Any company that has received taxpayer money, whether through direct subsidies, grants, partnership, or any other avenue, ceases to be a "private company".

I believe strongly that if you are truly private, and you operate entirely with your own private funds and wholly-owned, privately purchased infrastructure, you are free to say and do what you want - silence any voice or opinion that you don't like. But once you receive public funds, in any context, with that comes a responsibility to the public. Despite being "the way it works" currently, this system is incredibly corrupt and logically inconsistent. In a world where the Constitution was respected, enforced, and held inviolate, a company that received public funds (like Google) would not be allowed to declare itself a private company with total autonomy with one hand, and grab countless millions in taxpayer money with the other.


Where does one draw a line these days among the personal, the moral, the legal, and the political?

The military application in question is legal and is approved by a duly elected government that supports it politically. In earlier days, employees generally would see this as just doing their jobs in developing technology that their employers wanted developed and would not concern themselves about ultimate uses and applications. In other words, doing your job is personal and, as long as you do it honestly and work hard, you should not be faulted for doing it as requested by your employer. That was always the standard.

What then is the new element from which this sort of employee-driven demand arises? Is it morality? In other words, if I help develop A.I. that can be used for all sorts of things, one of which happens to be military-related, is the effort "evil" if the employer for whom I develop it agrees contractually to provide it to the government for a wartime/military use that can kill people? Do I really make a difference for the good if I convince my employer not to do this if all this means is that the company down the street gets the contract and the military gets the same results, albeit from a different vendor?

If this is so, then I assume that you as an employee can make no practical difference in making the world better by insisting that your employer forego this particular form of contracting opportunity. If you succeed, your employer misses an opportunity but the evil you see being released into the world still gets released. It just means that you do not personally contribute to the development effort by which it is made possible.

Of course, it might theoretically be possible to persuade all persons working in the field of A.I. to ban further work that directly helps the military. But that would seem a practical impossibility. Many people in all countries believe that military technology of all kinds is proper, legal, and politically supportable for purposes of self-defense or for some other overriding purpose they deem proper. And certainly, there are bad people throughout the world who are eager to use any technology that comes their way for overtly evil purposes such as misuse of an atomic bomb. Unless and until human nature is fundamentally transformed, that will never change.

So, what is the answer in a country such as the United States where people and companies have the freedom to develop A.I. for any lawful purpose and where some inevitably will do so for a military purpose of which you disapprove?

You are then left only with a political solution: use political means to gain control of the government and the military and apply the force of law to ban the military use of which you disapprove.

So this is either a personal act of futility by the Google employees or it is a case in which they cannot separate the personal from the political and thereby insist that their employer sacrifice particular economic opportunities to ensure that their personal actions do not support a political outcome of which they disapprove.

Even then, does this mean that your employer should cease working on A.I. altogether? For, just as cash is fungible, so too is technology. Every improvement you make in A.I. might have an immediate use of x for your employer but, as humans collectively do this for all sorts of improvements, the results are there for the taking in the future for military applications of all kinds. In other words, you cannot put your improvement in a box or control it so as to limit its future uses (at least not in a free society).

The computing technology of recent decades undoubtedly has bettered many aspects of life but it has also greatly magnified the lethality and utility of military applications so as to make the world far less safe. And this was inevitable unless a supervening agency were to have used forcible and totalitarian means to suppress such technological development from inception. Since no such supervening agency existed or even can exist in a free society, does this mean that all engineers and technical developers have blood on their hands because, ultimately, things they have done were used for applications of which they disapproved? Of course it does not. Nor would people today working on A.I. be held morally or legally responsible for ultimate downstream uses made of their work of which they would not morally approve today.

But this brings us back full circle. In the long run, you cannot stop such uses (or misuses) made from your technical development work. Nor can you be held responsible for them even though you contributed to them in some remote degree through your work efforts. Why then should it make a difference if your direct work efforts for a company like Google are applied to a military application of which you do not approve but which is legal, politically approved by the governing authorities, and will happen anyway regardless of whether Google is involved?

The puritans of old tried controlling the morality of others by shunning and shaming and doing it to an extreme degree. They failed miserably in their efforts because humanity is what it is and followed its own course without regard to external religious constraints.

This sort of effort by Google employees is obviously different in that it is not religiously driven but does it amount to anything more than a shunning-and-shaming method for trying to impose one's sense of morality on others by signaling that this way lies righteousness and everywhere else lies evil?

If this is what "don't be evil" now means, then Google will need lots of help going forward because every cause under the sun can be used in the same way to shun and shame. We then have management by a corporate board as may be swayed to and fro by any organized protest of the moment.

Whatever this is, and however it might be defensible in "sending a message" or whatever, it is a sure way to put a company at a competitive disadvantage while accomplishing nothing practically. It may further political goals but, if those are the goals, better just to try to advance them directly and not by attempting to shun and shame your employer (and your co-workers who may disagree with you) into submission. The personal need not be political. If it does become that way, a new form of puritanism will hold full sway to the detriment of all.


This argument is not only extremely lazy but positively representative of that large quagmire of social and communicative zeitgeist that Alan Kay calls "the bell curve of normality".

"It's hard to define things, so why bother at all?" ... "Attempting to circumscribe the affects of your work is difficult, so why have a moral stance at all?" ... dancing through loaded assumptions like "duly elected" and "democracy", finally concluding with a tired crescendo of capitalist "competitive disadvantage".

I would merely counter: if we are the future, then we can't all be lazy sods, especially those of us empowering the greatest systems, information and power structures on the planet. Give a damn, it's your moral duty. Intelligent people recognize this.


> In earlier days, employees generally would see this as just doing their jobs in developing technology that their employers wanted developed and would not concern themselves about ultimate uses and applications.

This is a false history. In the 1960s the U.S. was full of young people questioning whether they should just do as they were told and be a good, dutiful employee.

Since then there has been a massive campaign to roll back what was called “Vietnam syndrome” — the idea that you should consider the morality of your actions and contributions to society, not mere legality. Hence all the passionate Hollywood WWII dramatizations, the Greatest Generation, etc., portraying war as a tough but noble effort in which we must all unquestioningly sacrifice for the greater good — de-emphasizing much of the horrendous atrocities that have been perpetrated by the U.S. military in Vietnam, Iraq, support for murderous dictatorships in Central America, Indonesia, and so on.

Noam Chomsky has written extensively on this. One essay in particular is called The Backroom Boys — a reference to the chemical engineers at Dow who developed napalm within the relative peace and equanimity of a laboratory.

> Whatever this is, and however it might be defensible in "sending a message" or whatever, it is a sure way to put a company at a competitive disadvantage while accomplishing nothing practically… If it does become that way, a new form of puritanism will hold full sway to the detriment of all.

Whatever your opinion of the anti-war activists of the 1960s, puritans they were not.


>there has been a massive campaign to roll back what was called “Vietnam syndrome” — the idea that you should consider the morality of your actions and contributions to society, not mere legality.

Is that what is typically meant by "Vietnam syndrome"?

https://en.m.wikipedia.org/wiki/Vietnam_Syndrome

Also, isn't it a stretch to say Hollywood movies about WWII propagandize the idea one should only consider the legality of one's actions rather than the morality? If anything I would think most films attempt (in a sappy and trite way) to defend the rightness of the Allied cause.


Didn't say that. WWII was a case where legality and morality overlap more conveniently, which is why it is a favored topic. It's still debatable in many aspects, e.g. Nagasaki, firebombing cities, not bombing the concentration camps, etc., but in comparison to, say, Vietnam, you see the point. You don't see too many films these days celebrating the nobility of unquestioningly doing one's duty to support the righteous fight in Vietnam.


Ok, I understand a little better now. I still think this "legality" line of argument is a bit of a red herring and I'm not sure where the poster you responded to got his ideas about how things were "in earlier days" and what "was always the standard."

I think it's probably fair to say that Vietnam was an eye-opener that shook a lot of people's trust in the wisdom of our society's leadership in general. And that a segment of society nevertheless responded like Kissinger by doubling down and shaming the doubters.


Your discussion of "legality" here is U.S.-centric. Google has offices in Germany, France, Poland, Russia, Turkey, U.A.E., India, and China, to name a few. How would you feel if Google worked on military technology for those countries? Would you point out that it was inevitable that their militaries would seek ways to use A.I.? Would you point out that the Turkish armed forces' activities are permitted under Turkish law? Would you lament the inability of Google employees to separate work from politics?

Google has users in almost all countries, and even our friends in other liberal democracies do not see the U.S. military the same way we do. Perhaps some Google users' family members have even been killed by the U.S. military. This presents a perfectly reasonable business reason (one that has nothing to do with "the personal, the moral, the legal, and the political") for Google to turn down AI drone contracts.


I am from one of the countries you listed above, and it seems perfectly fine to me that Google comply with the existing laws of the respective countries even if they are in contradiction with US laws.


FWIW, I am American, and I would absolutely object if Google were building weapons for the Chinese military. I would stop using Microsoft products if it turned out they supplied weapons for the Russian takeover of Crimea. I would delete my Twitter account if they were found to be building special-purpose propaganda tools to aid Turkey's Erdogan. Etc.

Complying with laws in another country is one thing. Working with a military, which necessarily has implications beyond that country's borders, is another. And of course even here there are different degrees. You can build a general-purpose secure email client and sell it to a country's military, or you can design their bombs. Where the line is I'm not sure, but at some point your activity is inherently violent, inherently adversarial to some fraction of people in the world.


I am more conflicted about this: at least one person I knew committed suicide because IT automation provided by Google eliminated his job. Or, in general, IT/industrial automation wreaking havoc on my highly populated but poor nation. Is that just the price of progress, as people here would say, or should Google employees be held morally responsible for destroying the livelihoods of so many people?


I bet neither China, nor Russia, nor the US would allow foreign citizens in foreign countries to develop any serious military technology for their armies. Even their own citizens would be checked in detail before being allowed to work on it.

If a technology is not under that kind of scrutiny, chances are its military applications are... far-fetched.


Lots of military technology is traded between countries all the time, tanks, planes, guns, boats, missiles, etc. China just bought a bunch of Su35 jet fighters from Russia.

http://nationalinterest.org/blog/the-buzz/chinas-air-force-m...


We used to have a moral military: we championed the banning of weapons such as gas, bio, nuclear, mines, etc., and we established international institutions and protocols to stop their spread. At some point we turned evil and have been destroying the foundations of international cooperation that we built, and we are building and spreading new categories of weapons without concern or moral debate. We are the bad guys now.

We are starting to see the shape of the AI driven future and it is not pretty. Autonomous drones and other robots and surveillance patrolling systems establishing strict inescapable authoritarian control.

The elite employees at the global AI leader are best suited to see the coming dangers and are sounding the alarm bells. The outlook is bleak but moral engineers are going to be one line of defense in this fight. And I hope they keep doing it.


The US has never signed the Ottawa treaty banning the use of land mines. The US does, however, claim to abide by it outside of the Korean peninsula. The US never signed the treaty on cluster munitions either, which have many of the same humanitarian issues that mines possess.


>if I help develop A.I. that can be used for all sorts of things, one of which happens to be military-related, is the effort "evil"

There's a famous quote for this:

It should be noted that no ethically-trained software engineer would ever consent to write a DestroyBaghdad procedure. Basic professional ethics would instead require him to write a DestroyCity procedure, to which Baghdad could be given as a parameter.

https://en.wikiquote.org/wiki/Nathaniel_Borenstein

Of which I believe the meaning is yes, it's evil. It's handing a toddler a loaded gun sort of evil. If you DestroyBaghdad, you've limited the harm your program can do to what is specifically required by the situation. DestroyCity is easily misused in the wrong hands and should be carefully considered by ethical programmers.
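To make the joke concrete, here's a minimal, purely illustrative sketch in Python (the function names come from the quote itself; 'launch_strike' is a hypothetical stand-in, not any real API):

    def launch_strike(target: str) -> None:
        # Stand-in for whatever irreversible action the system performs.
        print(f"striking {target}")

    def destroy_baghdad() -> None:
        # Hard-coded: the harm is bounded by the one case the task required.
        launch_strike("Baghdad")

    def destroy_city(city: str) -> None:
        # Parameterized: "better" engineering, but the blast radius is now an
        # argument that any caller, for any reason, can supply.
        launch_strike(city)

    destroy_city("anywhere")  # the generalization, not the single use, is the hazard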

Doctors solve this by barring unethical members of their profession from legally practicing. Programmers should consider becoming an ethical profession, because depending on others in the field to do the right thing and police themselves hasn't been working out.


It's relatively easy to prevent the unlicensed practice of medicine. But anyone can buy a computer and start programming. There's no practical way to require that all programmers adhere to a code of professional ethics.


I completely agree with the loaded-gun metaphor, but doctors are a very different kettle of fish.

Doctors are healers. The Hippocratic oath - "do no harm" - is the logical conclusion of the practice of medicine. Medicine heals, which is the opposite of causing harm. Avoiding harm is the only consistent metric of success, which explains the oath's persistence for millennia.

Can you think of a consistent, concrete set of ethics that would draw unanimous support among programmers?


I think healing has less to do with it than liability. Snake oil salesmen used to be a thing.

What currently sets programmers apart is the lack of liability. Programmers write their own get out of jail free cards. We call them EULAs.

If a doctor screws up and leaves a clamp inside you after surgery, he is sued. If a programmer screws up and leaves a debugging backdoor in a shipped product, nothing.

>Can you think of a consistent, concrete set of ethics that would draw unanimous support among programmers?

I think if programmers can't come to a consensus on that answer, then legislators will do it for them.

If you look around, we're actually witnessing this happening right now. Populist anger has erupted after Equifax, Cambridge Analytica, and Uber. NYT opinion pieces call for changes in liability law around programming.

https://www.nytimes.com/2017/09/11/opinion/equifax-accountab...

And it's not just talk. Changes have already started. Section 230 was recently modified to make small changes in liability of web hosts. In response, Craigslist went full nuclear option in protest and dropped their Personals section. Almost nobody noticed, which means in the next round, law makers will be much more bold in applying more liability to the businesses of programmers.

Google's "Do no evil" was the closest thing I think we've witnessed to a Hippocratic oath for programmers. That's long gone now. Now it's all jerk tech, exploit your users for content and then demonetize them with no recourse or redress.

I don't think the west can get any wilder, so the pendulum is going to go against us from here on out. Programmers should be getting ahead of this, but like all dumb humans, we will sit stupidly. We will only react to immediately obvious consequences instead of preparing for the storm on the horizon.


There's a lot to unpack here, but it seems that the gist of it is that things that happen in our society are because humanity is some kind of untamable animal, and that we should all just resign to letting it run wild as it does.

Do you not believe that society is only the sum of its parts? Do you not believe that the mathematics of society can be changed, the more parts of the equation object to letting their talents be used for unscrupulous goals?

I would point you towards any cultural shift in modern society, and how it began—usually, as the imbalance of classes further divides, until one class can't tolerate it any further, and uses what power they have to reset the scales. That is what is happening today, and it isn't a fluke of the short attention span of the beast of humanity. It is a conscious, concerted effort of people in this country who are tired of existing in a system devoid of morals. And to frame it as something like embracing the status quo, or becoming a puritanical society, is simply a false dichotomy.


> In earlier days, employees generally would see this as just doing their jobs in developing technology that their employers wanted developed and would not concern themselves about ultimate uses and applications.

You might want to read some history. E.g. history of the development of nuclear weapons.


>In other words, doing your job is personal and, as long as you do it honestly and work hard, you should not be faulted for doing it as requested by your employer.

This sounds remarkably similar to the "superior orders" defense given at Nuremberg [1]

(Those who gave it were hanged by the neck until dead.)

[1] https://en.wikipedia.org/wiki/Superior_orders


I really appreciate your thoughtful comment!

But I also wonder, if your employer asked you to provide some legal cover for something you found unconscionable -- like maybe draw up incorporation documents for organizing sex tourism to a place where it's both legal and likely to involve slavery -- would you subscribe to the same argument about how you should just do your job and not ask questions, how someone else will do it if you don't, and dutifully provide the legal services?

And if you think that hypothetical scenario is meaningfully different from this one, could you describe how? (I don't mean to try to back you into a rhetorical corner -- I'm genuinely interested in your response.)


> The military application in question is legal and is approved by a duly elected government that supports it politically. In earlier days, employees generally would see this as just doing their jobs in developing technology that their employers wanted developed and would not concern themselves about ultimate uses and applications. In other words, doing your job is personal and, as long as you do it honestly and work hard, you should not be faulted for doing it as requested by your employer.

This is basically the "just following orders" defense. Your argument rationalizes doing nothing.

As a moral person I think you basically have two choices: either don't work on things you think can be used for evil or if you do work on those things step up and make sure they are used responsibly.


It can be seen as a message (as you say) to a wider populace that this is all heading in the wrong direction. If the government is hiding from its own populace what it's doing with drones, who should raise the issue? Why not people working on the drone program (regardless of where exactly)? It's much easier for newcomers than incumbents, I'd say, so it makes sense that some Google employees would be the ones to protest.

You never know what will be the initial trigger for change, who will be inspired or whatever. Take this guy: http://www.ecns.cn/2018/01-02/286632.shtml Why should some basketball celebrity and his campaign have so much effect?


Don't forget that Google is a multinational corporation, whereas there is no multinational government with political control by the people.

Google supporting the American military may have negative effects on Google the corporation, especially considering that the rest of the world is a bigger economy than the US.


> Don't forget that Google is a multinational corporation

Google (and its parent, Alphabet) is a US corporation, predominantly controlled by a pair of American individuals, with overseas operations and subsidiaries.

It's “multinational” in much the same way that the CIA is.


There are still people living who, in the middle of the last century, condemned other people to death for only doing their jobs. Of course, a future AI may decide to simulate our reality and torture you for a subjective eternity if you do not do everything in your power in this reality to bring that AI into existence. You plays your cards and you takes your chances, but in general it's simpler if you try to do what you honestly believe is morally right. And if you honestly believe that the right thing to do is pummel our society with shitty advertising, then please do your very best.


Thank god, some sense! It was driving me crazy how many irrational positions people in this thread are holding without considering their orientation to the product (AI) and their relationship to the employer.

Something I would add is that a lot of people don't understand how fundamental military R&D is to the collective progression of knowledge and technology. Take almost any common technology that we use today (computers, gps, rockets, airplanes, cell phones, radios, the internet, etc) and you will find it came from military R&D and use in war.

Since AI and all its related parts are the new technological hotness, to put it mildly, it only makes sense that Google, one of the companies at the forefront of this technology, would work with the government/military to do research and find ways to apply it within their scope.


Part of the reason so much stuff comes out of military research is simply that the military has such a huge budget it can spend it on R&D.

There are definitely useful dual-use technologies. But there's also a lot of military research that is almost strictly for the military. Nuclear weapons research goes beyond the stuff needed for power generation, for example. Money that could have gone into research on more stable, safer power generation instead went into how to make nukes small enough to be used by infantry on the battlefield.

The military helps a lot by being a huge customer for a lot of this tech, but we can also cut out the middle man and just spend on R&D directly in some cases!

By discussing the morality of weapons research in a world where we already have civilization-ending technology, we can maybe reorient ourselves to spending directly on progress in more cases, without needing a military application to justify it.

This happened in the past with the neutron bomb; perhaps it can happen again with tech that could be used to help solidify a police state.


There is no logical connective in your implied argument. Just because A is trying to buy from B does not mean that B should sell things to A, even if A's spending would increase the quality of B's product.


Since we live in a market-driven economy, the government (A) will always find a way to buy what it wants. So if Google (B) doesn't supply the demand, the customer will just go somewhere else. So it is completely illogical for Google to walk away. They are an extremely powerful business and are subject to the rules of business.


This may be an uncomfortable fact, but people have surprisingly short memories: the military funded the majority of the early advances in systems, networking, and cryptography (and, especially in the latter subject area, invested heavily in fundamental, theoretical research). Not saying that I disagree with the employees' opinion, just that DoD/Pentagon involvement in artificial intelligence research shouldn't be viewed as a necessarily bad thing. Many other major powers have heavily invested in AI across all fronts (including military applications), and it would be stupid for the US not to have one of its largest strategic assets be part of the process.


My memory isn't short.

You made a point of talking about cryptography. The US government also classified crypto as munitions in order to control its usage and export: https://en.wikipedia.org/wiki/Export_of_cryptography_from_th...

This had very negative effects on cryptography in general (see the FREAK exploit: https://en.wikipedia.org/wiki/FREAK )

Also, why wouldn't this be employed to better monitor Stingray (https://en.wikipedia.org/wiki/Stingray_phone_tracker) systems?

I don't think the government (and specifically, the military) should be viewed as a non-partisan actor when it comes to technology.


The classification of cryptography as munitions was an interesting choice. The first amendment may protect domestic encryption capabilities as stated by the wikipedia article. From a legal standpoint, does encryption being classified as munitions also put it into second amendment territory? As information becomes weaponized, it's an interesting thought. The first amendment is arguably more vulnerable these days, so I wonder if perhaps a second amendment argument could be made in support of domestic encryption. It certainly would help with marketing the argument to a certain segment of the population.


IANAL but I’m pretty sure the second amendment only applies to guns; you can’t walk around with an RPG on your shoulder.


It depends how you look at it. What was the purpose? Presumably, after fighting in the Revolutionary War, the goal was not to give people hunting privileges; they had something else in mind. From that point of view, maybe RPGs should be allowed, but so should attack helicopters, surveillance capabilities, private spy satellites, etc.

> IANAL but I’m pretty sure the second amendment only applies to guns; you can’t walk around with an RPG on your shoulder.

Surprisingly, perhaps, you can buy a military-grade flamethrower complete with a napalm package (https://throwflame.com/). It's not even classified as a firearm, and it's illegal in only two states, I think.


This has always been my way of looking at the 2A. It's clearly intended to give citizens the rights and tools to fight wars, otherwise it wouldn't mention it along with militias. Thus, it does seem logical that it extends to armaments of war, such as knives, firearms, artillery, bombs, etc (perhaps even hacking tools).

I'm a gun owner myself, and I've (long ago) taken the stance that the 2A isn't really compatible with modern life, as the people who originally wrote it could not have anticipated the weapons militaries deploy today. It makes sense in my mind that it is Congress' responsibility to regulate what weapons are reasonable for civilized society.


Congress does regulate those weapons.

There's a reason you can't just walk into a Wal-Mart and walk out with a machine gun, or a rocket launcher.

That said, I don't much care for the fact that the requirement to have legal access to machine guns is "be rich enough to buy your way around the law".

Personally, I do think we need a complete re-work of the way we treat weapons to reflect the realities of the modern world.

(1) We have four hundred million weapons in the United States. It is horribly negligent that safe weapon handling is not taught in schools, at multiple levels. Also a good touchpoint to get police interacting positively with the community.

(2) The current background check system should be replaced with a Swiss-style one; e.g., you get a code, valid for say a week or two, that any seller can use to verify that you have passed a background check.

This doesn't create a de-facto registry, so no problem getting gun owners (or the NRA) to go for this one.

(4) A red-flag law that allows immediate family members and healthcare professionals to temporarily bar an individual from purchasing weapons, with safeguards in-place to keep bad actors from abusing the system (e.g., a psychiatrist that just reports all of his patients to the system because he opposes gun ownership).

(5) Removing gun rights only in the case of violent crime. Right now, in most states, misdemeanor assault won't cost you your gun rights, but a felony for tax evasion will. That's ass-backwards.

(6) Machine guns, grenades, rocket launchers, and the like should be available with a license. You really need to be able to show that you can handle these things safely. Right now, it's just "be rich".

(7) Treat short-barreled rifles and shotguns as normal firearms, and silencers like any other accessory.

(8) Concealed-carry should be the same for police as it is for normal citizens. Same licensing requirements: a one-day course, including a test on safe weapon handling. This also ends gun-free zones: if you have a license, you can carry wherever you want.

(9) Open carry needs to work differently for urban and rural areas. The above permit will allow you to open-carry a pistol in an urban area. Rifles, no. Put it in a case. In the countryside, open carry just makes sense.

(10) Actually prosecute straw sellers. Buy a gun for somebody that can't legally buy one on their own, you permanently lose your gun rights, plus whatever punishment makes sense.

(11) Secure storage requirements. If one of your weapons is stolen and used for a crime, you are legally an accessory if you can not demonstrate that you took reasonable measures to store the weapon securely (e.g., a gun safe or lockable case). This doesn't require police inspections, but provides a strong incentive for personal responsibility.

All of this would enjoy massive support from gun owners, and address a lot of existing problems.


Seriously. Congress should just concentrate on this list. Just put language around this, and I think they'd have a very reasonable set of gun laws.


These all seem well thought out and reasonable. Thanks for sharing


That flamethrower is awesome. What use does a flamethrower have though besides burning buildings, that a normal gun wouldn't?


Clearing brush. It's considered an agricultural tool IIRC.


Also for preburning (ahead of forest fires) IIRC.


Are you saying that literally fighting fire with fire is a thing?

I may have missed my calling . . . .


I mean... yes?

That is where the saying comes from.


Never really thought about it. But it makes sense to cut a "gap" in the forest using fire, to prevent a fire from crossing the gap. Once burned, something aint gonna burn again...


> What use does a flamethrower have though besides burning buildings, that a normal gun wouldn't?

It can be more effective at clearing bunkers and closed rooms? The military were using these in WWII for trenches, bunkers, and clearing brush. My friend's grandpa in the US operated one in the Pacific during WWII.

I just brought it up as an example of how something potentially more deadly than guns or RPGs is not much regulated, and how laws are not very rational sometimes.


> It depends how you look at it.

I recently purchased the Federalist Papers so I could have a better understanding of "how" to read it, what the context was, and what types of discussions to anticipate having with those who'd like to look at it differently.


And what have you learned for those of us who haven’t read the federalist papers?

Also, why do they need to be purchased? Isn’t that information in the public domain? Surely enough time has passed and Alexander Hamilton’s family don’t need to continue to profit from it?


They don't need to be purchased (http://www.gutenberg.org/ebooks/1404). Some people still read physical books, though, which generally requires purchasing if they wish to keep the book.


Can’t speak for the parent, but I buy public domain books because it tends to be cheaper than the paper and toner required to print it. And it looks nicer.


IANAL, but I'm pretty sure the second amendment is actually really vague in terms of what it refers to, and has actually been tested in court less often than one might think.


As it's written, I personally think you should be able to buy any weapon available on the open market. Or at least any weapon that could realistically be used by an infantryman (as the wording is "A well regulated Militia, being necessary to the security of a free State").

I'd rather have a new amendment than have judges be able to decide what the definition of arms is. Slippery slope. They really could one day decide that 'arms' means only muskets, or even just knives.


Broadly speaking, that's been the holding of the courts since 1939.

U.S. v. Miller loosely established the test (later solidified in 2008's Heller v DC) that weapons commonly used in militia service are inherently deserving of second amendment protections.

Miller, a known mobster who was caught with a sawed-off shotgun, lost his case because

a) the military testified that they had never used sawed off shotguns, so they were not useful to a military, ergo a militia (which was a lie -- they had used them, and found them useful for trench-clearing)

b) Miller's attorney was not very good, and didn't even challenge that testimony, much less so by totally disproving it -- I offer a little sympathy here as they didn't have Google at the time

Also noteworthy, Miller was actually dead when the decision came down, as he'd been murdered, and because /shrug, the trial kept going, but the defense (for obvious reasons) quit trying. He was sentenced in absentia.


You say solidified, but it was a 5-4 ruling, to determine whether handguns could be banned by the city, over an 'arm' that has been used by infantry forces for generations.

Which I find just insane. People say no one wants to take Americans' gun rights away, but we were literally one vote from effectively doing so.


Whenever someone says "no one wants to take your guns away", now you can just point them at this and see whether they maintain that position:

https://www.cbsnews.com/news/illinois-town-votes-to-ban-assa...


Except you can own an RPG in the US... It's an NFA classified destructive device. You can also own cannons, tanks, or attack helicopters if you have the money.


Isn't the 2a just a "well armed militia"? Maybe an RPG is more than well armed, but encryption?


"well-regulated militia," but it's pretty much an open question how much that actually has to do with the right described in the second clause. Gun rights activists tend to down-play that first bit, as it implies a collective perspective that they find conceptually incompatible with unlimited individual gun ownership rights.


My state constitution clarifies it as "The right of the individual citizen to bear arms in defense of himself, or the state..."


Your state constitution doesn't particularly get a say in what the US constitution means (so says the 10th amendment).


That's not necessarily true. The Supreme Court has ruled that the 2nd amendment only prohibits the federal government from "infringing" on your right to bear arms, and not the states.

https://en.m.wikipedia.org/wiki/Second_Amendment_to_the_Unit...


> The Supreme Court has ruled that the 2nd amendment only prohibits the federal government from "infringing" on your right to bear arms, and not the states.

This was true only between 2008 and 2010. In 2010, the Supreme Court clarified that it was incorporated against the states by the 14th Amendment.


How is it then that different states infringe on the right to bear arms to different degrees?

And it was indeed the case well before 2008-2010. See United States v. Cruikshank (1875), Presser v. Illinois (1886), Miller v. Texas (1894).


See DC vs Heller, 2008.


Does it really say ‘himself’?


Yes, it likely does. With English lacking a gender-neutral pronoun, "he" or "him" or "his" historically did not necessarily imply the male sex, depending on context. As "man" can refer to all of humankind, again it's a matter of context.


Yet the actual second amendment seems to avoid the problem as far as I can see. Edit: And of all the second amendment issues that could be argued, I somehow picked this one.


I wish this was the only surprising bit in the constitution of Alabama. The section outlawing interracial marriage was only removed in 2000, and it still contains a section mandating segregated public schools.


> "it implies a collective perspective that they find conceptually incompatible"

So says you. It's not clear to me at all that militia members don't support collective action.


The militia in that sentence IS the collective action. How many gun owners are part of a well-regulated militia?

Even the NRA removed references to that first clause from their material.


To be fair, militias of the day weren't generally standing militias, and needed to be mustered.

We do have definitions of what militias are, informed by the writings of the founders, the Federalist Papers, previous drafts of the second amendment, and, failing that, codified by law in 10 US Code § 311. The definition of militia there ("consists of all able-bodied males at least 17 years of age and, except as provided in section 313 of title 32, under 45 years of age who are, or who have made a declaration of intention to become, citizens of the United States and of female citizens of the United States who are members of the National Guard") will likely have been expanded by our recent advances in military equality.


Some context into the mindset and beliefs of the people who penned the original amendment is helpful in this case, but you find very little in support of the level of control that the anti-2A people want.

Gun rights antagonists tend to downplay that bit, as it implies an individual perspective that they find conceptually incompatible with tightly-controlled gun ownership permissions.


They also downplay that in every case of a gun being used in some criminal way, numerous laws and regulations are already being broken. So what difference will more regulations make? The only thing that could possibly give the gun-control people what they really want is total confiscation of all guns, and even if that were somehow possible there would be a civil war over that if it were to be seriously attempted.


> The only thing that could possibly give the gun-control people what they really want is total confiscation of all guns

Strawmen are so unhelpful in reasonable debate.

Not all gun control advocates are either anti-second amendment, nor in favor of eliminating all guns. To say otherwise, creating a false black/white dichotomy in the gun debate, does a massive disservice to both gun advocates and gun control proponents.

Ironically, by making the choice "all guns" or "no guns", gun advocates themselves are forcing the "no guns" option to the center of the debate. As a wave of frustrated anti-gun youth become voters and reasonable political moderates look at options to "protect the children", I really think it's in gun proponents' best interests to provide a better alternative than "do nothing" on one side and "civil war" on the other.


It ceases to be a strawman when it's on a sign held by protestors acting in good faith[1].

At the end of the day, these stances, even the "moderate" ones you mention, are irreconcilable all the way down to first principles. If you are for gun control, you are necessarily for measures that will restrict the right to bear arms as it is recognized today, some more, some less, but restrictions on the right all the same.

There's no real evading that.

[1] https://pics.me.me/yes-i-do-want-to-take-away-youur-guns-you...


Of course some gun control advocates want a "we've come to collect your guns" law. I never said that no one holds extreme positions.

But that does not mean that everyone holds extreme positions, which is what you claimed.

Arguing that the extremes are the only options is a problem.

How productive would health debates be if the only two options presented were veganism and paleo? If the only sex ed options were abstinence or polyamory?

There must be room for compromise, or there is no debate, only argumentation.


The problem is that the "compromise" touted by gun control advocates is entirely one-sided - more restrictions, no concessions. No quid-pro-quo, only further restrictions on the right.

That's not compromise, that's capitulation.


This leads to an honest question on my part:

What concessions would gun rights advocates accept in order to allow some restrictions? What's there left to give on this issue that wouldn't undermine any controls?

Say, for example, I wanted gun owners/users to be required to be as responsible as car users, i.e. pass a test, maintain a license and registration for weapons and weapon users, and hold liability insurance to cover damage either intentional or accidental (that would obviously scale with the likelihood and amount of damage the gun can do). What can gun control advocates give that will get that done?

I think part of the reason the gun debate can be one sided from the "control" side is that the US already is quite far to the "rights" side of the spectrum, relative to the rest of the developed world. It can be difficult to see where we could plausibly move further in that direction without causing more of the problems we're (hopefully) all trying to solve: unnecessary bloodshed.


In your example, since we now have a mandatory testing/licensing scheme and insurance, I see no reason why CCW's countrywide shouldn't become shall-issue. There should also be a stipulation that licensing is a thing you grant to people, not individual weapons.

Another example I floated in previous threads is surfacing psychological issues in NICS checks (stuff like certain diseases or involuntary holds) and granting access to that system to everyday people rather than just retailers.


agar said that not all gun control advocates want to repeal the second amendment, not that there aren't any gun control advocates who want to repeal the second amendment.


There's a school of thought that the word "arms" in the Constitution might have referred mainly to sidearms, not to cannons, catapults, warships, etc. The theory is that there were other words like "artillery" to describe larger weapons.


>"Justice Scalia also wrote: “It may be objected that if weapons that are most useful in military service — M-16 rifles and the like — may be banned, then the Second Amendment right is completely detached from the prefatory clause. But as we have said, the conception of the militia at the time of the Second Amendment’s ratification was the body of all citizens capable of military service, who would bring the sorts of lawful weapons that they possessed at home to militia duty. It may well be true today that a militia, to be as effective as militias in the 18th century, would require sophisticated arms that are highly unusual in society at large. Indeed, it may be true that no amount of small arms could be useful against modern-day bombers and tanks. But the fact that modern developments have limited the degree of fit between the prefatory clause and the protected right cannot change our interpretation of the right.”

[1] https://takingnote.blogs.nytimes.com/2015/12/11/justice-scal...

I think for a weapon to be 'compliant' with the US's second amendment, a couple of criteria must be met:

a) must be in current wide use in the military

b) must be capable of being carried by a single person (which is why tanks and fighter jets will not qualify); this follows from the 'individual' focus of the Bill of Rights

c) must be capable of being aimed at a single person (which is why explosives or RPGs would not qualify); this follows from the notion that the Bill of Rights, in general, does not condone collateral damage or collateral effect, as this is an individual's right and therefore presumes individual responsibility

I also find this linguistic analysis of what 'bear arms' meant in 1791 interesting/educational: [2] http://languagelog.ldc.upenn.edu/nll/?p=255


This school of thought only makes sense if you throw out the entire section that mentions a well regulated militia. A militia would be expected to have artillery.


The 2nd amendment doesn't apply to foreigners. Classifying encryption as a munition meant that I have never truthfully answered all the visa waiver questions on entering the US.


Fischer v Massachusetts[1] (and other cases won by the Second Amendment Foundation) holds that resident aliens are counted amongst "the people" for purposes of second amendment protection (and other rights too!)

[1] - http://ia800500.us.archive.org/20/items/gov.uscourts.mad.135...


resident alien != foreigner

I believe GP was referring to times s/he's entered the US on a visitor visa of some sort.


Then there may be a de facto restriction against the exercise of such a right by such a person, as one needs to have a permanent address to comply with the laws as they are currently interpreted.


It does. A permanent resident alien can purchase and carry firearms in most states, and nonresident aliens can get a license to do so as well.


> does encryption being classified as munitions also put it into second amendment territory?

It doesn't much matter. Even if somebody made the case that "crypto is a form of 'Arms', subject to 2A protection," the 2A doesn't grant an unregulated right to own whatever "arms" one chooses. All the rights granted by the Constitution are subject to reasonable regulation.

So, Congress could simply legislate a restriction on crypto, much as they have done with machine guns, grenades, and nukes.


It could be argued that the 2nd amendment was put in place to ensure a means of the citizenry to overthrow the government (as an absolute check on an unjust government). Unfortunately it's now just a vestige of an older time and the 2nd amendment only serves to protect the rights of gun hobbyists.

The guns that a person are allowed to bear are nowhere near sufficient to take on the US military, and the modern arms such as encryption and freedom from surveillance are not even guaranteed by the 2nd amendment.


>The guns that a person are allowed to bear are nowhere near sufficient to take on the US military,

You're getting into spherical cow[1] territory if you think the US citizenry couldn't defeat the US military.

The US military will not be fighting the US citizenry in a hypothetical clean room, it'll be fighting in America. Who are you going to bomb, who are you going to strafe, when your opposition can quietly fade into the populace that provides your material?

1. https://en.wikipedia.org/wiki/Spherical_cow


> The guns that a person are allowed to bear are nowhere near sufficient to take on the US military,

I'm tired of reminding HNers about Afghanistan.


IEDs were far more of a problem in Afghanistan than small arms.


Yep. And who do you think has the easier time getting ingredients for an IED, the person with a gun or the person without a gun?

Who has the better chance at ambushing and getting away with the enemy's weapons: the one with the gun or the one without a gun?


You can do a lot of damage with IEDs, but you can’t overthrow the government with IEDs.

Look at Turkey. The primary tool was encrypted communications and information dissemination. That’s modern power that citizens should have. Right now power comes from information and the preservation of speech and private communications, not a few handguns and rifles (or IEDs).

When the 2nd amendment was written a gun was relatively way more powerful than it is in today’s world. If you had a group of men with guns you were basically on par with the government military.


IMO Turkey has mostly failed now.

However, I'm not saying I'd want them to have a civil war, just that it failed despite seemingly doing everything "right".


That's an away match.


Can you elaborate?


I believe he's referring to the Taliban's ability to hold off two modern armies (USSR in the 80s, USA in the 00s) with a much more basic arsenal than either super-power brought to bear.


380 million Americans, 2 million members of the military.

Unless the US were to nuke itself, given the geography, a motivated citizenry could easily take on the US military. Not a scenario I want to imagine, but the US military has troubles in Afghanistan and Iraq — imagine that kind of conflict in the US. Just look at rebels in Syria to see this on a smaller scale.


Where does it say you can regulate my rights?


WRT the 2A, it's right there in the text: "well regulated".

WRT other rights... Explicitly? Not always in the Constitution. But, if you think you can cry "Fire!" in a theater, you would be mistaken.

SCOTUS has upheld reasonable restrictions on rights over and over. It's a settled matter.

(edited - cleaned up my thoughts a bit)


"Well regulated" in it's historical context doesn't mean anything close to what we think of as "regulation" today. The phrase means "functional" or "functioning". E.g. “If a liberal Education has formed in us well-regulated Appetites and worthy Inclinations.”, or "The equation of time … is the adjustment of the difference of time as shown by a well-regulated clock and a true sun dial.”


Stop quoting that case. That was SCOTUS upholding the right of the government to clamp down on speech that encouraged draft dodging or could even be marginally interpreted as disagreeing with it during wartime.

https://www.popehat.com/2012/09/19/three-generations-of-a-ha...

It ain't Dred Scott dumb, but it's up there in stupidity and retarded law.

Well regulated doesn't mean the feds get to control the details of what the militia can use either. The militia is every able-bodied 18-47 year old. The regulated part implies a command structure which the National Guard provides.


>The regulated part implies a command structure which the National Guard provides.

That's completely wrong. Each of the first ten Amendments - the Bill of Rights - enumerates rights of the individual against government power. It's absolutely absurd to argue that the 2nd amendment somehow refers to a government-commanded and organized military body and not to the right of citizens to remain armed to prevent the government from getting out of line.


I'm not sure what case you're thinking of, because I didn't explicitly mention one.

Chaplinsky v. NH was in 1942 and had nothing to do with draft dodging.

Since then, the courts have ruled inconsistently on the matter. The only common thread is the courts mostly agree that some restriction on speech is permissible - where that line lays is very much up for debate.

WRT the militia - the government disagrees with your assertion. A random citizen cannot go into the gun store and buy a machine gun, or grenades. Special (and expensive) permitting is required. If your contention is that these restrictions are unconstitutional, I'm not really sure we should bother debating the point.


They were responding to the mention of "fire in a crowded theater," a reference to Schenck v. United States.


Regulated does not imply a command structure; look up the 18th-century definition of that word, please. Also, to be clear, the National Guard IS the government. The purpose of the 2nd A was to provide a check against the government — it wasn't a protection for the states — it was a protection for individuals — the Bill of Rights was a declaration of individual liberties, not protections for the states. The tenth amendment is an outlier, but even that was part of a coordinated effort to limit the size, scope and power of government.


It does not, because the 2nd amendment protects arms, not munitions (for example, bombs).


Here is the entire text of the 2nd Amendment:

"A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed."


What's the difference? The 2A certainly applies to cannons. Are those arms or munitions? Grenades are legal to own, just need the tax stamp.


Arms are munitions. Not all munitions are arms.

More information: https://en.wikipedia.org/wiki/United_States_Munitions_List


I think the distinction is between "arms" and "ordnance". Reading through the comments here, it seems "munitions" is a superset of both.


Nuclear weapons are munitions. A U235 bomb like Hiroshima’s is technically a gun even. Do you have a right to carry one of those?


If you can carry a Hiroshima bomb, I’m ok with letting you.


Novel thought. I could definitely see an argument for it falling under 2A.


This is actually a question that was featured in an xkcd comic[1].

[1]: https://www.xkcd.com/504/


We also know that independent researchers often invented new cryptographic techniques in parallel with the military.


This ITAR business had a huge chilling effect on the formative years of internet protocol design. Public key was available at the time. There were a lot of loud voices in the IETF clamoring for strong foundational security. The US position made it highly impractical to use, and may have cost us an effective and ubiquitous security/identity model.



They have a clear agenda, but that doesn't mean they should be shut out; the investment they make can help push new ideas forward.


Alternatively, there is an abundance of private sources of funding.

One of the founding principles of American government was the freedom from state surveillance and intrusion into private homes. Nowadays, the federal government of the USA can legally use any technology that is patented, so why should they be allowed to restrict inventors from disclosing or selling IP that inhibits surveillance?


We should embrace the development of a surveillance state because it could spin-off some cool tech? Hmmm.


It worked when they created the internet. The original purpose was not what the internet became.


Yes, they made some bad choices, and then they corrected them. They also funded the research that created the field. Things can be morally mixed. On balance though, the US military funding for cryptography has been pretty clearly a good thing.


I love how defenses of US military history so often boil down to, “sure they did terrible things in the past, but we definitely learned from them.”

I’m sure people have been saying that for decades if not centuries. Terrible things just get declassified slowly so we only learn about them later.


You realize that the US military is the reason the world isn't ruled by Hitler and/or Stalin, right?


Oh man, is this why I have to complete all this encryption and export compliance bullcrap every time I have an iOS release?


Yes


Why don't I have to do it for Android?


Maybe you are only distributing your app to people in the US? Or maybe Android apps only use crypto built into the OS and don't actually ship with any.
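
Fwiw, a minimal sketch of how this usually gets handled on iOS, assuming your app only uses exempt encryption (e.g. HTTPS via the OS's TLS): declare it once in Info.plist so App Store Connect stops asking on every release.

    <!-- Assumption: the app ships no non-exempt encryption of its own -->
    <key>ITSAppUsesNonExemptEncryption</key>
    <false/>

That key is Apple's documented way to answer the export compliance question up front; if your app bundles its own crypto, the answer (and the paperwork) is different.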


I hear you, and I want to make a "times change" argument.

This is my perspective, because I read too much scifi: I can conceive of a couple possibilities in 300 years.

1. Humanity is gone or stone-aged. Either because it failed to colonize before being wiped out by disease, because it nuked itself, or because it implemented AI in a way that got itself killed.

2. Humanity has turned peaceful, formed a global community (if not wholly, then at least nearly so regarding scientific resources), and colonized.

I draw such a sharp line because I don't see how the resources to colonize can be mustered without justifying it with an arms race unless peaceful cooperation is established. Without this, I think somebody is going to put a nuke in orbit around Mars and call themselves King/Queen of the inner solar system.

Therefore, I think we should be working towards the cultural changes that I believe necessary immediately. This is why I'm a staunch proponent of universal education, universal healthcare, gun control, etc. I think there's no better time than the present to "be in the race together" globally. Given climate change, the death of bees, superbacteria, and narcissists with penis-size complexes holding fingers over red buttons, I see a ticking clock.

Avoid (1) by tackling culture, is my theory, and I think that's what these Google employees are doing here. (I post this very much looking forward to being challenged on all points)


> Either because it failed to colonize before being wiped out by disease, because it nuked itself, or because it implemented AI in a way that got itself killed.

If a kind of disaster is probable enough to wipe out Earth, the same kind of disaster is likely to wipe out the handful of colonies we could hope to build, too. At astronomic scale, the speed of light is quite slow.


Potentially. I think the possibility of quarantine is much higher between planets, though.


If you're so worried about the world destroying itself with its own weapons, how are you going to enforce your weapons ban without using weapons?


But if all the guns were gone we wouldn't need any anymore, ergo world peace, cats and dogs living together, and mcdonalds becomes vegan. /s


The same way Japan "enforces" around violent and petty crime - by removing the need for it, and establishing a culture opposed to it.

It's not nearly as easy as that but that's the basic summation of my philosophy around the idea.


I worry that culture doesn't help at the top of leadership.

Once you get to the top of an organization, only a few people need to listen to you. As long as you keep their paychecks and power, they will listen to you.

Dictators don't care if war is unpopular as long as it's beneficial.

I think nukes may be obsolete because war is no longer beneficial, but leadership of a company may be 100% set on finding AI by any means necessary.


Can you clarify your second line, "Once you get to the top of an organization, only a few people need to listen to you. As long as you keep their paychecks and power, they will listen to you." Are you saying that everyone has to do what you say because you pay them? If so, I don't think that point applies to this situation - we're talking Google engineers, they shouldn't be hard pressed to pick up their things and find a job elsewhere that aligns better with their values.


Capital interests won't go away quietly, but I hope you're right.


I doubt they will go away peacefully, either. I cannot think of an instance in history where they have though I’d love to be wrong.


I honestly cling to the hope daily that the internet has changed what people's reactions to power will be in the future. It's one of the few deep things that's changed.


there's a dead post from @Ataturk who nails my arguments against your point, but doesn't do so very nicely.

History should be a guide here.

Presuming the penis-size conversation is anything more than an intentional dig by a chauvinist powermonger probably isn't helpful here.


I don't see the dead post, though I've looked. I'm happy to hear the points.


After removing insulting remarks, ataturk's comment looks like this (which I personally agree with):

Globalism == lack of agency, the end of voting and elections.

Universal education == total indoctrination and ideological consistency of the collectivistic variety.

Universal healthcare == the end of individual choice, individual responsibility, all new extensive taxing regimes, the destruction of the individual on many levels.

Gun Control == People control. Then total domination by the state and a monopoly on the use of force. Which, by the way, has never worked out well for any citizen anywhere.


Thank you for taking the time to find and share the points. I'd like to rebut:

>Globalism == lack of agency, end of voting and elections

If federalism didn't lead to this, why would globalism? What even is globalism? I am skeptical of this because "globalism" seems to be used as a catch-all alt-right bogeyman lately.

>universal education == total indoctrination, ideological consistency

Universal access to education doesn't have to mean homogeneous curriculum. It doesn't have to mean lack of education choices. It doesn't even have to mean requirement to receive education.

Another thing I find curious - the very people I find rejecting education "collectivism" are often people who wouldn't blink before forcing their own "indoctrination" upon others. We fight this fight in Texas far too often - should schools teach facts (evolution), or give "fair credence" to falsehoods (intelligent design) because of religion? Whoops, out go the falsehoods as soon as the satanists get involved and also flex their constitutional rights; never mind, we'll take evolution!

>universal healthcare == the end of individual choice

I don't see why this has to be true, it isn't in any of the current implementations of universal healthcare.

>end of individual responsibility

Are you presently responsible for your water being clean? Are you presently responsible for the quality of your roadways? Are you presently responsible for the guarantee that your medication contains advertised active ingredients that do as it says on the label?

>Extensive taxing regimes

Shift military funding, close ultra-rich tax loopholes.

>destruction of the individual

Fail to see how this is true. Unsupported by argument.

>gun control == people control

False equivalence.

>then, total domination by the state and a monopoly on the use of force

Currently working just fine in countries with gun control laws, which, by the way, completely nullifies the false "has never worked out well" absolute you ended with. Furthermore, the USA has right now a monopoly on the use of force. The US and its laws have sovereign control over whether you are allowed to use force or not. You have almost no say in the matter. Furthermore, this isn't 1776. The gap in armament between civilians and a martial state is so large as to be hilarious. Were the need to arise, US combined armed forces could drone strike, Tomahawk, artillery, AC130, or just door to door ground and pound any organized militia to dust. Or, you know, nuke it.

Are you, private US citizen, allowed to build a nuke?

I see this kind of response a lot to "the globalist threat."

What's the end game, for people that believe these things? Endless culture war? A galaxy-spanning human civilization that still has their "I'm an American and you're Chinese" safe little lines? Is there a fear that the "wrong" culture will persist into the future?


Not the author of the dead post, and definitely not a fan of the alt-right, but the term globalism immediately resonated with me when I first heard it:

> What's the end game, for people that believe these things? Endless culture war? A galaxy-spanning human civilization that still has their "I'm an American and you're Chinese" safe little lines?

If we didn't have several superpowers on this planet, but only one unified government instead, what would whistleblowers do? Right now Snowden can fly to Russia, and Chinese dissidents can voice their opinions in the West. In a truly globalized world, there'd be nowhere to run.

We try to maximize competition and avoid monopolies and cartels in our economy. Why should we aim straight for the opposite when it comes to our governments?


The funding comes from the DoD because every other appropriate entity is financially starved; the DoD gets so much money. The latest budget provides them:

$700,000,000,000

Or as Obama said in the 2016 State of the Union:

"We spend more on defense than the next 8 nations COMBINED."

Keep in mind, most of those 8 are allies.

I struggle to see why we should be praising a self-fulfilling prophecy.


Naming it 'defense' is already a huge stretch. I realize every country does this but it is as much or more about projecting power all across the globe as it is defense.


Fwiw, it used to be called the far more accurate Department of War.


George Carlin had a fantastic bit about this:

https://www.youtube.com/watch?v=hSp8IyaKCs0


> I realize every country does this but it is as much or more about projecting power all across the globe as it is defense.

I disagree that "every country" does this. Many countries in Europe and Asia do not use their defense forces as a tool for attacking other countries and getting involved in regional conflicts that were only affecting people in those countries. And countries that do it are usually just "helping" US-lead invasions of countries (an example would be Australia -- the single reason we were ever in the Vietnam War was because of the US).

(While I may be biased, given that America bombed my home country and engaged in a "peacekeeping mission" when I was a child, for a civil war that had been going on for many years and had nothing at all to do with the US, I never understood how Americans can see invasions of other countries as being anything other than that.)


> I disagree that "every country" does this.

I meant that they call their armies/airforces/navies 'defense'.

> Many countries in Europe and Asia do not use their defense forces as a tool for attacking other countries and getting involved in regional conflicts that were only affecting people in those countries.

I wish that were true, but almost all deployments of EU troops over the last 4 decades fall into that category. And there have been a lot of those.

> And countries that do it are usually just "helping" US-lead invasions of countries (an example would be Australia -- the single reason we were ever in the Vietnam War was because of the US).

Ditto Iraq, Afghanistan.

> While I may be a biased given that America bombed my home country and engaged in a "peacekeeping mission" when I was a child, for a civil war that had been going on for many years and had nothing at all to do with the US, I never understood how Americans can see invasion of other countries as being anything other than that.

Agreed, they are invasions, and the worst ones are the ones under some kind of pretext.

Where are you from originally?


There were a lot of EU and NATO forces involved in the Balkans not that long ago.


It's not Defense. But I was trying to keep it simple.


Spending on its own isn't the only measure. The US is incredibly wasteful with its military spending where other countries with better political systems are more savvy. Plus the US has to invent 80% of the tech in the first place, something that costs 10x what it does to replicate it.


While the US military is wasteful, I don't think you'll find a military that is enormously less wasteful. After all, the largest line item in the DOD budget by far is salaries.


Cough Nimrod, cough L85, cough "the age of the manned fighter is over": plenty of countries have inefficient or screwed-up military procurement.


Just imagine if we had stopped spending on defense in the 80s when the Cold War ended and had to deal with today's Russia with decades-old technology.


I think we would be fine with a smaller military. Nobody is talking about completely stopping military expenditures. The question is whether it really needs to be as big as it is right now.


Overturning so many regimes in petrol filled countries is a messy business. I think a big chunk of the spending goes to intelligence and misinformation.


Personally I think a big chunk goes to not well thought out missions like Syria, Afghanistan or Iraq. It seems nobody is willing to admit that there is no path to winning or ending things so they just keep going and spending a lot of money and energy. And the defense contractors are certainly happy with this.


Or how about having to deal with the rapidly ascendant Chinese military that is annexing vast territory in Asia right now? There's nothing they respect except for strength, they're not even subtle about that fact.

Or if the US hadn't spent over a trillion dollars defending South Korea from the China, USSR, North Korea axis across decades. South Korea would not exist as we know it today. It'd more likely look like North Korea.

There's an exceptionally strong argument for the US working with regional military allies in Europe and Asia on defense and having a very powerful military to match its $20 trillion economy. Should the spending be more like $450 billion or $730 billion - that's the primary debate.


The Cold War ended in the 90s, not when the Wall came down. Now how would we possibly handle all those T-72s with just M1 Abrams? If anyone still fought conventional wars.


We aren't going to go to war with Russia, despite the propaganda that everyone is spreading.

Wars between nuclear countries don't happen any more.


The reason we aren't going to war with Russia is because Russian leaders know that it is impossible to win a war against the current US military.

The reason it's impossible to win a war against the current US military is because the US has spent an obscene amount of money building it for decades.

If the US military were not capable of defending Eastern Europe, Ukraine, Poland and the Baltics would be under Russian rule today.


No, it is because of the nukes.

Nukes prevent any war from happening between any powerful nation.

If we spent 1 tenth the amount on our military, we'd still have the power to blow up the entire world X times over.

I just think that if you have the ability to destroy the entire world, then that's a powerful enough military and you don't need anything more for defense.


You actually believe that the US would use nuclear weapons against Russia, ensuring the deaths of 10s of millions of Americans and sparking global nuclear war, if Russia invaded Estonia?

The only thing preventing Eastern Europe from being invaded is the conventional military. The US will never risk its own existence for another country. The Russians know this.


I don't want the US to be involved in proxy wars with other countries.

So no, I don't believe that the US would do that and I am happy about that.

The US would only use Nukes if there was an attack on the US.

I want the US to stay out of all wars that aren't directly related to attacks on US soil and US citizens.

If the EU is worried about invasion, then they should have their own military to defend themselves.


I suspect conventional war is a thing of the past.

Cyber-warfare is going to be where it's at. You spend years infiltrating adversary networks, backdooring critical systems/infrastructure and exfiltrating intelligence so when the time comes for conflict, you already know what cards they're going to play, where all the pieces are on the map and you can shut it all down with a few commands. Then you start your blitzkrieg unopposed.

It's telling that Russia, China and North Korea so heavily encourage cyber-espionage and malware development. Meanwhile in the US, we've got idiots passing laws criminalizing mere infosec research. We're handicapping ourselves against a threat our adversaries are all too eager to employ.


India and Pakistan went to war briefly in 1999 and both of them had nuclear weapons at the time (and still do, I think). I wouldn't rule out the possibility of future wars between them.

https://en.wikipedia.org/wiki/Kargil_War


That 700 billion seems small compared to the 2 trillion that we spend on Social Security + Medicare + Medicaid.


>the military funded the majority of the early advances in systems, networking, and cryptography

For the purposes of being better at waging war. I am not disagreeing with your statement, but I think leaving out the reason why is treating the military like it's "NASA with a few guns laying around" and implying that one can work on weapons systems without having any moral culpability for how they are used.


It's perfectly possible to remember every technological invention the military has ever been involved in and still object to participation in military research.

Possible reasons, using Wernher von Braun's rocket development as an example:

- He, or somebody else, would have invented rockets anyway at some point

- That 'some point' would have been much earlier if the money spent on the military (research & some of the rest) had instead been put into rockets in the first place

- Even if the two mechanisms above were not true: I'd probably be willing to forgo the benefits of the space age if it meant undoing WW2 and the Holocaust

- In most ethical frameworks except utilitarianism, it's strongly discouraged to accept some evil for some perceived greater good. Meaning: even if rockets have saved far more lives than WW2 vanquished, participating in the project is morally dubious at best

Specifically to Google's project:

- This seems to actually invert the usual idea of the military funding basic research that later gets adapted for civilian use. They are building on the fundamental research into AI already done by Google and others, and using it very specifically to kill people. Any advances useful for other purposes would seem to be even more coincidental than usual.

- While "minimising collateral damage" i. e. civilian casualties may seem to be a worthy cause, it cannot be ignored that such advances are likely to result in greater use of the technology. Drones are actually the best example of this effect. Just look at the Obama administration's willingness to expand the use of drone strikes, which was a direct result of the technology appearing to be the lesser evil compared to manned missions on the ground.


You are saying that military R&D is important no matter what they are involved in. This incredibly stupid reasoning can be used to justify the development of viruses that could eradicate the human species.

> DoD/Pentagon involvement in artificial intelligence research shouldn't be viewed as a necessarily bad thing.

Let's be specific: they are using current AI algorithms to track everyone's movements from airborne imagery. It will be used to kill people in drone strikes and for population control.


As opposed to using dumb bombs which are far more likely to have collateral damage.


I'll take this over massive world wars.

I'd rather have neither, but I'm realistic.


I hardly think that this kind of technology will be useful in stopping a world war.


A guided Tallboy on Hitler's bunker would have.


One thing is acknowledging that if you pour lots of money into getting better at anything (space exploration, waging war, etc) you'll get cool technological advances applicable in other areas.

Another thing is to specifically choose to dedicate resources to one of those areas instead of another. Why can't Google bid on AI projects for better detection of geological landmarks on other planets? (which I'm guessing is a similar endeavor to what the article mentions - and already has ongoing research)


Steve Blank talks a lot about this history of Silicon Valley and how most of the original technology scene was started by DoD projects.

https://steveblank.com/secret-history/


Well the project "uses artificial intelligence to interpret video imagery and could be used to improve the targeting of drone strikes."

That's a direct application of tech to killing people, not some abstract network or compiler tech. There's a big difference.


The flip side is that better targeting should result in fewer collateral deaths.


Broken window fallacy. Just because government and military funded some technology does not mean that technology would not be in an advanced state otherwise. Especially if you consider the countless failures and unnecessary excessive spending that took the resources indirectly from people and inventors. Besides, brains that worked on military and government funded projects would perhaps find much better things to work on otherwise.


Military funding is used as a sleight of hand. The legislature scares the public, the public hands over money, and then the military is used to invest the money in long-term goals to build a future technological society that the private sector is wholly unwilling to pursue because of its short-term profit focus.

This also has the benefit of getting war averse liberals to say nice things about the large standing army and bloody foreign interventions of the US empire, most of which have failed to produce anything other than blowback and civil war across the globe.

https://archive.org/details/NoamChomsky-ScienceMilitaryFundi...


have you watched "Good Kill" or maybe even are familiar with the work of the Rosa-Luxemburg-Stiftung's research on US military drone program at Ramstein airbase? I guess not, because what you're proposing are further crimes against humanity. If the US wants AI/autonomous weapons then they should agree to have their troops held accountable at the The Hague International Court of Justice.

- The Global Assassination Grid: The Infrastructure and People behind Drone Killings: https://media.ccc.de/v/33c3-8425-the_global_assassination_gr...

- The Drone Papers: https://theintercept.com/drone-papers/

- Graphs, Drones & Phones: The role of social-graphs for Drones in the War on Terror: https://media.ccc.de/v/32c3-7259-graphs_drones_phones


> If the US wants AI/autonomous weapons then they should agree to have their troops held accountable at the The Hague International Court of Justice.

Why would US agree to that if Russia or China couldn't care less about it? That'd be just handicapping the military.


because it'd be less hypocritical when playing "world-police"


But that'd cripple the world-police and China or Russia would have much more leverage.


The military also funded advances in airplanes, trucks, and medicine, but that doesn't mean that doctors should have no qualms about working on biological weapons.

Also, many people, including myself, consider drone strikes to be state-sponsored terrorism. Just because some part of the military may be justifiable, does not mean all of it is.


Arguments from utility do not form a solid basis to argue about morality.


As Carl Jung might say, just because a thing has base origins, it doesn't mean that it has to be a base thing. Google should not shrug and say "from violence we came to violence we will return." I'm moved by the people speaking out, as much as I am disappointed to hear of Mr. Schmidt's involvement with the Pentagon.


They consumed a great deal of R&D budget for their role in the development of IT technology, and the larger part of that development came from non-military research, from academia and companies like Bell Labs.

The Google workers urging their CEO to pull out of Pentagon projects are the ones who do not have such short memories as to be distracted from the blatant deceptions and disasters carried out in recent history, with no sign of relenting.


This idea of justifying military buildup by its beneficial trickle-down into civilian technology always struck me as very odd. It seems like an absurdly inefficient and roundabout way of funding R&D. If the government wants to invest in research, it should just invest in research.


Whilst agreeing in part, the history of martially-motivated technical development through all of human history is remarkable. I'd argue it's been a vastly greater influence than so-called market motivations.

As an evolutionary selective pressure, war and military necessity are unparalleled.

Re-reading Samuel B. Griffith's translation of The Art of War, I noticed that he mentions, in 1963, Joseph Needham's Science and Civilisation in China, which had only begun delivering volumes (the work remains in progress). That would include not only metallurgy, but entire volumes devoted to both martial and marine technologies.


It still is. It is not necessary for world peace to help the Pentagon to build better killing machines.


The Pentagon and its killing machines are the #1 reason why there's now peace here in the Balkans, for 18 years and counting.

I don’t think people of ex-Yugoslavia would stop killing each other without being forced to.


'build better killing machines'. They are able to kill people well enough already.


Unless you're the one getting shot


You'd rather they had "worse" killing machines


Citation needed


Sure, I'll bite. Just falsify it: the Pentagon's killing machines have never brought about world peace, and world peace has never occurred while the Pentagon received killing machines.


You are blatantly embedding a false premise here; that the goal is or should be world peace.

That is an absolutely inane stance to take. How about instead of the obviously unattainable and unrealistic goal of "world peace", we just go for "more peace", or "less violent death" or similar goals. It is very hard to argue that American hegemony has not significantly reduced conflicts around the world. Not all of them, not equitably, and generally in pursuit of economic rather than ideological interests, but there is basically nothing to be gained by opening up the US to actually being threatened by competing powers.


You say that as if “world peace” was an attainable goal in any realistic timeframe. War doesn’t exist because people “hate peace”.


I didn't bring world peace into the thread, but at any rate there's also the saying that "All war represents a failure of diplomacy," which I tend to agree with. I also think "are indifferent toward" should replace "hate" there.


People also seem to forget that the Google self-driving car was initially developed by the team that won the DARPA challenge. The government, especially the military, has been funding all the risky tech that Google profits from. Even Google itself was funded by a government grant. Google's logo has been on the side of rockets launching spy satellites for years. Google Maps was originally built by a government contractor. Google the company knows who butters their bread.

What the google employees need to do is push for a civilian government research and science funding effort, like ARPA was back in the day. Unfortunately that sort of funding is politically dead, while military funding is always in style.


Idealists on the left would apparently prefer China getting a complete monopoly on military AI and expanding their dictatorship globally instead of being pragmatic.

Reminder: if you work in tech you are collecting blood money, because everything we use was initially funded by the military for military application. If you are going to attack Google for this and don't leave the industry, you are a hypocrite.


> This may be an uncomfortable fact but people have surprisingly short memories: the military funded the majority of the early advances in systems, networking, [...]

And what do we have to show for it? Wikipedia is great and Youtube is entertaining, but the rest of the internet is horrible. I say that as a software developer.

Due to the wonderful internet our children now have attention spans far below the time a typical class lesson takes. The smartest brains on Earth are not building spaceships or curing disease, but rather implementing systems to get you to click on ads. Our bodies and habits are being tracked, our opinions manipulated, and our privacy is being handed over willingly for a chat and photo-sharing platform.

Military tech _often_ has implications far beyond what the original scientists and engineers intended. Sometimes for the best. But as you liken AI to the internet, don't forget how much damage the internet is doing to us even as we enjoy it and profit from it.


While that may be the reality of things, it surely can't be a justification in any way. If anything, I'd be afraid to revert back to those dark times where technological advancements were driven mainly by military needs.


Can you elaborate more on why you consider those to be "dark" times?


I strongly disagree with this comparison. In the prior, successful cases, the military funded research which was open, while in this case, the direction of information is reversed and the outcome is secret.


Or maybe they remember it perfectly, but don't consider it a valid argument to say, "X supported something you like 40 years ago, therefore X is beyond reproach"?


What's theoretical about drone strikes?


The military was also responsible for the miserable state of nuclear power in the United States.


Yes, but the weapon systems may eventually be sold or transferred to countries like Israel or Saudi Arabia. How would you view their use against protesters or civilians, or would there be international laws that would prevent such use?


What makes you think the signatories of the google letter have forgotten that?


Completely agree. If it isn't Google it'll be someone else.

I think the DoD/Pentagon put up the funding for what later became Google Maps. Not to mention the GPS constellation we all enjoy using.

It's a shame progress has to be made through these channels, but it has proved very effective in the past.


If it isn't Google it'll be someone else.

The obvious reply is: great, then we're not needed, let someone else do it!


This way you end up with someone unskilled taking the job, and instead of 1 drone you have to send 10 because only 1 will hit the target.


If our war machine kills fewer people, that sounds like a win to me.


Not if these people are terrorists or enemy combatants.


Worse aim often means you’ll kill more people.


> Completely agree. If it isn't Google it'll be someone else.

That's an excuse people without any morals often use. It's up to each one of us to do the right thing.

After all: “The only thing necessary for the triumph of evil is that good men should do nothing” - or I would add that evil also rises when "good men" actually do the evil thing "because otherwise the bad men would do it anyway."

Google likely has the most advanced AI tech in the world right now. That means that if they allow the US government to use it, they will be directly responsible for accelerating the progress of autonomous killer robots or making them real in the first place. After all, I don't see too many other companies with AI that can learn the game of Go within days and beat world champions.


Exactly. China has just turned into a dictatorship, and has already installed AI facial recognition and will soon install voice recognition software for its police. This could allow China to better manage its Uyghur concentration camps and kidnap freedom fighters in Hong Kong. Along with its ambition to have everything 'Made in China' by 2025, and to invade Taiwan, China is looking like an AI-enhanced Hitler Germany 2.0.

We need weapons to fight an enemy like this, with help from US technology companies.


I am OK with weapons development. What I don't like is that these weapons get turned into big business selling them to unstable regions and in the case of the NSA the technology gets used against its own citizens.


This depiction of China is blown way out of proportion and it's basically thanks to US propaganda which is used to justify the kinds of military spending and activities it undertakes.


There is no inherent reason that these technologies need to be developed by the military; they could just as well be developed for civilian use given the funding.

Also, it is not surprising that these things are invented by the DoD when you're spending ~600-700 billion a year on warfare and intelligence operations.

That last sentence of yours rubs me kinda the wrong way; that is a dangerous premise that could set us back ~30 years to when the cold war and the arms race were alive and well, but this time we'd have robots and AI.


China is currently heavily funding genetically engineering geniuses and AI. The US can either wake up or sit around and wait to be subjugated via an autonomous drone swarm developed by 250IQ Chinese scientists


I... hope this is sarcasm?
