
> There are no common ethics codes to determine how lethal autonomous weapons and systems that are developed for the military should be used once they end up in the hands of civilians.

It's interesting to me that this just presumes developing these autonomous weapons systems in the first place is ethical. I understand there is a difference of opinion on this ethical point, but it immediately frames the discussion pretty far away from the Hippocratic oath's requirement to abstain from causing harm.



"It should be noted that no ethically-trained software engineer would ever consent to write a DestroyBaghdad procedure. Basic professional ethics would instead require him to write a DestroyCity procedure, to which Baghdad could be given as a parameter."

- Borenstein


> pretty far away from the Hippocratic oath's requirement to abstain from causing harm

So do abortion and euthanasia, and probably plenty of other practices as well. Both of those are without doubt harm-causing practices, and the controversy around them primarily revolves around whether the harm caused is worthwhile when the alternative is a potentially greater harm.

Putting aside the fact that the Hippocratic oath is not actually a relevant part of modern medicine (modern doctors are accountable to comprehensive, codified codes of ethics), the lack of any common set of ethics by which people choose to live their lives kinda points out the futility of this idea.

One person could say developing weapons is bad because they cause harm, another could say it’s good because they can be used to reduce harm that would have otherwise been caused. Who’s right? Neither of them. That’s just two people with different opinions. I would personally suggest that establishing moral authorities like this can often be harmful, because lacking any objective truths, it’s a topic people should generally be left to make up their own minds about.

Am I right or wrong? Who’s to say? I’m just a person with an opinion, and so is anybody who would want to agree or disagree with me.


I think a different problem is that it's not so clear why these weapons are being made or used.

I think the main motivator in almost all of those things is money.

The reason for wars is money; they just get justified by "the greater good".

Same for all the involved technologies.


> So do abortion and euthanasia, and probably plenty of other practices as well.

A less controversial example would be something like chemotherapy. In fact, a lot of treatments for terminal and chronic ailments are pretty harmful.


All 3 of those examples are actually covered by the “original” Hippocratic Oath (which probably wasn’t written by Hippocrates, incidentally).

> Neither will I administer a poison to anybody when asked to do so, nor will I suggest such a course. Similarly I will not give to a woman a pessary to cause abortion.

Chemo is obviously a bit different though, because the potential harm caused by denying abortion or euthanasia is (generally speaking) the potential to deny somebody the right to exercise a form of personal agency over their body/life. The controversy isn’t really a medical one.

The Oath also doesn’t really address treatments that have potentially harmful side effects, and it’s debated whether the oath even allows doctors to perform surgery. It’s basically not fit for purpose in 2020. If you wanted to suggest that software engineers adopt a code of ethics similar to that of doctors, what you’d really be suggesting is something like “We need an AMA Code of Medical Ethics for software engineers”. Which obviously doesn’t have the broad appeal and simplicity of an oath.


That was essentially my point. With euthanasia and abortion you will find plenty of people who would call them unambiguously harmful and think they should be banned outright, regardless of context.

You'd be hard pressed to find people who want to abolish the entire field of oncology on the grounds that the treatments are horrible.

It's absurd to take the above-stated "requirement to abstain from causing harm" as a hard restriction out of context, without taking into account the main point of the profession, which is to help the sick.

The logical conclusion of considering "do no harm" as inviolable above all else is that doctors would have to restrict their treatments to homeopathy and compassionate smiles.


I like the way you challenge the framing, and I agree: the right first question is "should we develop lethal autonomous weapons at all, and if so, what kinds are OK and where is the limit?"

The way it's asked looks like an attempt to shift the Overton window until autonomous weapons of all kinds are treated as a mundane inevitability not worth worrying about, with just the niggling details subject to ethical questioning.

But big shifts like that are exactly the sort of thing serious ethical codes should be used to watch out for. Not the niggling details afterwards.


Well, for one, autonomous weapons are already here. Landmines, for example. Even back in the stone age, a snare, meaning literally /rope/, was an autonomous weapon.

There is no human in the loop (no pun intended for snares). The weapon decides when to strike using physics, and the answer is always "yes" once it is triggered.

What makes the new "autonomous" weapons different is that they attempt target differentiation. Mobility only becomes useful when a weapons system can say "no" to a presented target, since even the Military Industrial Complex, purveyor of unneeded bullshit that wantonly takes lives, would find it impossible to sell a drone that goes around shooting missiles at every target it encounters after launch.



