
I'm torn between the awesomeness and the morality of working on such a project.


"But don’t let that distract you; it was designed to kill people."

https://www.calebthompson.io/talks/dont-get-distracted/


I wonder what kind of military technology is acceptable in today's standards of morality. This is just a drone. It's not autonomous. It's a remote control car with weapons or radars mounted on top, controlled by a human looking through a camera. The military use case would be to reduce the possibility of human casualties.

Not sure how better to word this. I literally do not understand what kind of military technology is acceptable. Is the common position that no war should exist, no guns should exist, no conflict should exist, and that anyone anywhere in the world who wants to hurt other people should be allowed to do so without repercussion? Is the idea that if we don't invest in military technology, countries like Russia, China, North Korea, and Saudi Arabia will just make batteries and toys? What is the position I'm supposed to be aligned with?


Even granting that conflict is inevitable, the idea that I had designed a device to take human life would probably weigh on my conscience.

That's not to say I haven't considered it.


More accurately, it’s to reduce casualties for the military that uses them. It’s intended to increase casualties for the enemy.

Edit: it’s hard to reply when you keep editing your comments. It’s now quite different to your original post, though this comment does mostly make sense.


But that's what war is. I would imagine it's easier, and less of a risk to all humans involved on both sides, to create drones with non-lethal capabilities than to send non-lethal human forces into warzones.

Edit: I added the second paragraph to my original comment within 3 minutes of posting it, as I noticed there wasn't much hope of getting a response without fleshing out the question. I now have to choose between adding an additional reply to contend with the idea that I "keep editing my comments", or to edit this comment to deal with that additional claim.


I wonder though if the future will turn into the Star Trek episode where people are "killed" by an AI and then voluntarily submit to suicide. If war is too non-lethal it doesn't solve the reason for the war in the first place and eventually will lead to lethality anyway. As Sherman said "War is cruelty. There is no use trying to reform it. The crueler it is, the sooner it will be over." Perhaps more effort should go into eliminating the reason for war in the first place, which is of course easier said than done.


Hopefully it would be more like... eventually technology becomes so dominant on the battlefield that without it it’s just suicide. So as soon as one side’s drones are wiped out, they just surrender.

Although I suppose many countries would continue to resist in a Gandhi-like moral appeal to the victor’s population to stop the slaughter.


Pfff, that will never happen. Western people no longer understand the nature of war.

Even if one man with a nuke can effortlessly kill a million, a man with an axe can kill him just as easily if he manages to sneak close.


#dontbetheenemy


There's nothing immoral about weapons development.

Warfare and its plagues have always been a human problem.

Unless it violates personal beliefs, as with any conscientious objector.

I say this on a technology forum because, while I am a technologist now, I spent a past career doing weapons tech with bright and moral people. Weapons tech development has always been a scary realm because any force/battlefield superiority almost ALWAYS results in its use... and subsequent loss of life, combatant and otherwise.

It's a best-of-the-worst kind of scenario.


You assert that there's nothing immoral about weapons development, and then go on to almost directly contradict this, noting that "force/battlefield superiority" tends to lead to the use of these weapons, killing people. I really don't see what your point is; are you trying to argue that an occupation that leads to unnecessary human deaths is moral?


Unnecessary human death is a function of the policy makers. The weapons development is an enabler, but the 'why' behind the use of these weapons is where the moral obligation resides.


(Genuinely curious about how different people become confident in their moral and ethical convictions.)

Weapons development is not immoral because we have always waged war?


There's a great paradox with weapons. Imagine a world where somehow we agreed to stop development of any and all things that could be even somewhat effectively used as weapons, all the way down to the butter knife. The problem you've created now is that if even a single person decides to go against this and start developing weapons, he would be able to forcefully impose his will as he saw fit, with very little that could be done about it. And this is even worse when you take the analogy from individuals to nations. Imagine one nation goes this direction. Suddenly that one nation, which is probably a bit below par on the ethics meter, would be able to forcefully impose its will on the entirety of the world.

This means that it is a paradoxical imperative for ethical nations to maintain and develop the most powerful weapons they can, even when they have absolutely zero intent of ever using them. Otherwise, they risk falling victim to the lowest common denominator that does intend to use them. In reality this all gets really messed up because, as the person you were responding to said, weapons almost invariably end up being utilized, often amorally, regardless of the player considered. So you're left in a scenario where if you don't develop weapons, you risk conceding the world to the least ethical player there is, yet if you do develop weapons, you are all but accepting that they will probably be used in an unethical fashion at some point.

Two pretty awful decisions, but I think one is worse than the other.


Since I am an American citizen, I am particularly interested in this question with respect to my country of citizenship.

Would you consider the United States an ethical nation?

Which nations do you consider to be ethical when it comes to this question?

Your argument makes sense in the abstract. However, I see the potential for it to break down when applied to "real" nation states.


I'm not OP, but from my point of view, the United States has been one of the nations most likely to use its weaponry on others (by my count, it's currently involved in 7 military operations [1]), and it has been involved in dozens more over the last few decades. It has formulated dubious reasons, and in some cases outright fabrications, in order to justify some of these military actions, and in doing so has destroyed countless lives, many of them innocent. That's not ethical.

The reality of the world is that yes, there are actors out there who will do others harm if they can, and one of the duties of a nation is to protect its citizens. I am not opposed to a true defense force, to protect against invasion. But once that military power is projected outwards in order to fulfill political and economic goals unrelated to national defense, then a line has been crossed and that behaviour can hardly be considered ethical any more.

I should note that I don't consider my nation, Australia, to be ethical on these matters either based on past behaviours (sometimes being involved in these very same conflicts).

[1] Afghanistan, Iraq, Syria, Somalia, Libya, Niger, Yemen



