
This is like saying that a gun which appears safe but can easily backfire unless handled by experts is completely fine. It's not an issue with the gun; the user should simply be competent.

Yes, it's technically true, but in practice it's extremely disingenuous. LLMs are being marketed as the next-generation research and search tool, and they are superbly powerful in the hands of an expert — an expert who doesn't blindly trust the output.

However, the public is not being educated about this at all, and it may not even be possible to educate the public this way, because people are fundamentally lazy and want to be spoonfed. But GPT is not a tool that can safely spoonfeed results, because it ends up spoonfeeding you a whole bunch of shit. The shit is coated with enough good-looking, good-smelling stuff that most of the public won't be able to detect it.

It does not appear safe. It clearly says at the bottom that you should double-check important facts.

I have in my kitchen several knives which are sharp and dangerous. They must be sharp and dangerous to be useful — if you demand that I replace them with dull plastic because users might inadvertently hurt themselves, then you are not making the world a safer place; you are making my kitchen significantly more useless.

If you don't want to do this to my physical tools, don't do this to my info tools.


I attempted to respond by extending the knife analogy, but it stops being useful for LLMs pretty quickly, since (A) the danger of a knife is obvious to users and (B) the damage is immediate and detectable.

Instead it's more like lead poisoning. Nobody's saying that you need a permit to purchase and own lead, nor that you must surrender the family pewter or old fishing sinkers. However, we should be doing something when it's being marketed as a Miracle Ingredient in colorful paints and cosmetics, and in the dust and fumes of cheap gasoline.


Ah, right — because some small print saying "cigarettes cause cancer" is all that's needed to educate people about the dangers of smoking, and it's not a problem at all if you enjoy it responsibly?

I'm talking about the industry and its surrounding crowd of breathless sycophants who hail these models as the second coming of Christ. I'm talking about malign statements like "Our AI is so good we can't release the weights, because they'd be too dangerous in the wrong hands."

Let's not pretend that there's a strong and concerted effort to educate the public about the dangers and shortcomings of LLMs. There's too much money to be made.

