Hacker News
AI Chatbot Joins Mushroom Hunters Group, Encourages Cooking Dangerous Mushroom (gizmodo.com)
21 points by ColinWright 15 hours ago | 5 comments

How is it still news that LLMs hallucinate?

Especially on shrooms

More importantly, who put it in the group?

> A moderator for the group said that the bot was automatically added by Meta and that “we are most certainly removing it from here.” Meta did not immediately respond to a request for comment.

Whatever; AI encourages eating a dangerous mushroom, and there's also the AI-written book that poisoned people: https://news.ycombinator.com/item?id=41269514

Logically, it's doing this in every conversation.

Millions of incorrect facts being pushed to people, drawn from billions of disconnected person-hours of data to get wrong.

It's not stuff like "gypsum is good for clay soil" or "weed tea helps plants grow". Those are old wives' tales, a religion-like culture. Still structured, and as a gardener you can process them.

With AI there is no culture; it'll tell you something wrong that no one else will ever hear. It has no meaning. You're not in the "plant when there's a full moon" club. It'll be a "s'mores Pop-Tarts help tomatoes grow" club of one.

It also gives you the balls to just go do stuff. So, tomato, tomahto. It's no doubt convinced a few people to get cancer checks. But less nihilism would be nice.



