
It's interesting that the author is so disparaging of ChatGPT, when he himself had misremembered the title of the story as containing 'Michaelmas' and the importance of the goblins. What are these but hallucinations?

Obviously, it's no good that tools offer useless results and make searches harder. But how you use the tool also matters: asking GPT to one-shot recall something that may not have been in its training data carries an obvious risk of hallucination, while using AI to assist a web and archive search might have produced the same end result as what actually happened, a smart librarian kindly doing some searching.

More fundamental breakthroughs are probably needed in how AIs 'know' things. But the gap between how humans and machines produce obscure, half-remembered knowledge doesn't seem that big.



A tool needs to complement its user. A tool that has the same weaknesses as its user isn't useful.


He even emphasizes this in the piece — he extols the librarian as having the skills to recognize patterns in how people recall books, which includes imperfect memories.


I've found a great deal of use in working with other people, and they share my weaknesses far more than an LLM does. Even if I worked with exact duplicates of myself, I expect I (the original, not the collection of clones) would still be more productive than if I worked alone: I routinely improve my own work when I reread it later, after having lost context, so the context-free mes should be able to help in the same way.


It's worse than that. A hammer that shatters its weak handle after normal use is still a somewhat useful tool to the homebuilder. A hammer made out of painted cheese only delays construction of the house.



