"What LLMs are currently capable of producing is industrially scaled, industrial-grade bullshit. That’s troublesome for many reasons, not least of which is that humans have enough trouble discerning the age-old artisanal variety. Every human is required to make a zillion tiny decisions every day about whether some notion they’re presented with should be believed, and rarely do they have the opportunity or desire to stop, gather all the relevant information, and reason those decisions from first principles. To do so would pretty much halt human interaction as we know it, and even trying would make you pretty annoying."