
The difference is everything. It doesn't understand intent, and it has no motivation. This is no different from what fiction authors, songwriters, poets, and painters do.

It isn't the model's fault that people assume everything it produces must be real just because some of it is. That lies with the people who uncritically accept what they are told.

> That lies with the people who uncritically accept what they are told.

That's partly true. Just as much fault lies with the people who market it as "intelligence" to those who uncritically accept what they are told.


This is displayed directly under the input prompt:

ChatGPT may produce inaccurate information about people, places, or facts.


That's a good start. I think it needs to be embedded in the output.
