
I'm not arguing for a ban, but I am trying to argue for a more nuanced view about technology and responsibility.

No, LLMs are not guns. That said, any responsible and conscientious designer of LLMs should know they carry risks of their own: risks to public discourse and democracy through misinformation, and risks to economic stability, copyright law, and norms around plagiarism.

You could try to weigh these risks against the benefits, but the fact of the matter is that the responsible way to develop these technologies is to show you are explicitly accounting for the obvious dangers as well. Very few companies actually do that (because they choose to optimize for profit instead).

My main point is just that the user isn't the only person who bears responsibility when it comes to technology. Designing a tool does not thereby absolve you of all moral responsibility for it.

I think the reasonable approach is regulation—it is the only thing that counters the sheer profit motive and forces companies to shoulder at least some of this responsibility.
