Hacker News new | past | comments | ask | show | jobs | submit login

The responses on HN to this facial recognition technology versus China’s facial recognition technology are mind-boggling. Commenters saw the Chinese tech as dystopian (rightly so), yet see this technology as a way “to make sure we're getting the right people”, allowing only that we might want to think about how its use could eventually go “too far”.

If China’s facial recognition system is currently “too far”, how is this tech not also already too far? I guess if a technology is only used to recognize and assassinate foreign nationals, and not surveil citizens (which it will eventually be used to do), most Americans are okay with it. Some commenters are critical of this research, but the level of concern in these comments is way less than on posts about similar Chinese systems.

The point isn’t that this tech could increase accuracy and kill somewhat fewer civilians than U.S. drone and air strikes around the world currently do. The entire basis for this activity - firing missiles into civilian areas thousands of miles from home in endless wars - is the issue. That the military sees a use for this kind of technology is the core of the problem: no matter how well it works, it will only make the assassinations performed by the U.S. military more efficient, not abolish them.




[flagged]


> Technology is morally neutral

This argument needs to stop being used. Developed knowingly, a race-condition-full Therac-25 machine is objectively bad, and a side-effect-free cancer cure is objectively good. It's along the same lines as "guns don't kill people, people kill people": a gross simplification of morality and ethical theories, yet touted on the internet far too often. If anything, it highlights the need for an ethics class as a requirement for any engineering degree.
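For readers who don't know the reference: the Therac-25 overdoses traced in part to a race between fast operator edits and the machine's beam-setup tasks, so that beam energy and shielding could end up configured for different modes. A deterministic toy sketch of that failure shape (all names and structure here are invented for illustration; this is not the actual Therac-25 code):

```python
# Toy model of a check/latch race: two setup steps each read a shared
# prescription, and an operator edit landing between them leaves the
# machine half-configured for each mode.

class Therac:
    def __init__(self):
        self.prescription = "xray"
        self.beam_energy = None
        self.turntable = None

    def setup_beam(self):
        # Step 1: latch beam energy from the prescription.
        self.beam_energy = "high" if self.prescription == "xray" else "low"

    def setup_turntable(self):
        # Step 2: position the target/shield from the prescription.
        self.turntable = "target" if self.prescription == "xray" else "open"

    def fire(self):
        # High-energy beam with nothing in its path = massive overdose.
        if self.beam_energy == "high" and self.turntable == "open":
            return "OVERDOSE"
        return "ok"

m = Therac()
m.setup_beam()               # energy latched for x-ray mode
m.prescription = "electron"  # operator edit lands mid-setup
m.setup_turntable()          # turntable now positioned for electron mode
print(m.fire())              # -> OVERDOSE: the two halves disagree
```

The fix, of course, is to make the whole configuration atomic with respect to operator input - which is exactly the kind of decision a human engineer makes or negligently fails to make.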


> Developed knowingly, a race-condition-full Therac-25 machine is objectively bad

You've gone and proven my point with that "developed knowingly" disclaimer, because rather than merely a technology, you're describing a deceptive and harmful act committed by a human being.


“Knowingly” is not a disclaimer. Whether the developer intended it doesn’t change the fact that the technology is immoral if it is ever used. And it is certainly not a proof of, or an excuse for, moral relativism.


I'm not advocating any sort of moral relativism. The act of intentionally building faulty medical equipment is morally wrong. Even the act of negligently building faulty medical equipment is morally wrong. But the faulty medical equipment itself is an inanimate object and it's a category mistake to ascribe moral judgment to it.

> Whether the developer intended it doesn’t change it’s an immoral technology if ever used.

Knowingly or negligently using unsafe medical equipment to treat patients is morally wrong, yes. You're again describing an act performed by a person and not the technology itself.



Who gets to decide who is a terrorist and who is not? And based on what evidence? For very good reasons we have due process for that in our judicial system. Moreover, the killings carried out by the US fall outside established international norms, and sometimes even treaties.


Doing evil is generally enabled by such delusions of righteousness. Chinese state media undoubtedly frames their actions as justified in much the same way USian state media drones on about "terrorists".

I'm not equating the two entities - they certainly have drastic differences in aims, scope, chosen targets, etc. Rather, evil is evil, and should be called out directly instead of excused for not being as severe as some other, especially more removed, evil.


That's fine. I don't think it's evil to kill terrorists. I especially don't think it's evil to develop more effective methods of identifying potential terrorists before killing them.

If anything's a delusion of righteousness, it's pacifism and isolationism.



