
> Firstly, it's about time we stop pretending that people wait until technology does something better than humans do before they deploy it.

I think we need to stop pretending that humans are perfect, or even acceptably good.

Computers get better as time goes on. This technology wasn't a thing ten years ago. Now it's questionable. In ten years it'll be better than it is today. Humans will be exactly as good in ten years as they are today, and exactly as good as they were a hundred years ago. And what they are today is not good enough.

> identifying people who do not want to be found, at long range, for the purpose of assassination

I think we need to stop pretending that humans are paragons of virtue. They do things we should be concerned about even when they can't execute them effectively.

Long-range identification and assassination of people who don't want to be found is a capability that already exists, and in fact has existed for centuries - for a given value of "exists". Sniper teams, helicopter gunships, and artillery spotters have precisely this role and they make mistakes and kill the wrong people all the damn time.

And on top of that... when was the last time you heard about some US soldiers committing war crimes, in person, with their bare hands? No technology involved at all. No technology needed. Stop blaming it for human failings.




> Computers get better as time goes on. This technology wasn't a thing ten years ago. Now it's questionable. In ten years it'll be better than it is today. Humans will be exactly as good in ten years as they are today, and exactly as good as they were a hundred years ago. And what they are today is not good enough.

Okay. Automated call systems have been deployed for decades now and continue to increase their market penetration. They are still not nearly as useful or as good at what they do as a human would be, but that has made no significant difference to the speed and rate at which they were deployed. Your argument is that, abstractly, at some point they should be better than humans. Maybe so, but that's not what I was arguing--I was arguing that the conditions for a technology's deployment are only distantly correlated with how good it is compared to humans performing the same task, not making some philosophical point about how the machines will not replace us or whatever.

> And on top of that... when was the last time you heard about some US soldiers committing war crimes, in person, with their bare hands? No technology involved at all. No technology needed. Stop blaming it for human failings.

I'm not really going to bother responding in depth to the rest of what you said, since it seems to be responding to points I did not make (when did I ever say people were paragons of virtue, hadn't killed people before, didn't make mistakes, or didn't use technology to kill people?). I am merely pointing out that technology is not value neutral; the particular technology we are talking about is explicitly designed to do pretty awful things. Responding that the real problem is people is missing the point; it's another iteration of the "guns don't kill people" argument.



