
The issue (and I think what's behind the thinking of AI skeptics) is previous experience with the sharp edge of the Pareto principle.

Current LLMs being 80% of the way to fully useful doesn't mean there's only 20% of the effort left.

It means we got the lowest-hanging 80% of utility.

Bridging that last 20% is going to take a ton of work. If the easy 80% of utility took only 20% of the effort, closing the gap could take roughly 4x the effort that getting this far required.

And people also overestimate the utility of a solution that's randomly wrong. It's exceedingly difficult to build reliable systems when you're stacking a 5% wrong solution on another 5% wrong solution on another 5% wrong solution...
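A minimal sketch of the compounding (illustrative numbers, assuming each layer is independently correct 95% of the time, which real systems won't be):

    # Illustrative sketch: end-to-end reliability of stacked components,
    # each assumed independently correct 95% of the time.
    for layers in (1, 3, 5, 10):
        reliability = 0.95 ** layers
        print(f"{layers:2d} layers -> {reliability:.0%} chance the whole stack is right")
    # prints roughly: 1 -> 95%, 3 -> 86%, 5 -> 77%, 10 -> 60%

The errors aren't actually independent in practice, but the direction is the point: small per-step error rates compound into an unreliable whole.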




Thank you! You have explained the exact issue I (and probably many others) are seeing when trying to adopt AI for work. It is because of this that I don't worry about AI taking our jobs for now. You still need some foundational knowledge in whatever you are trying to do in order to get that remaining 20%. Sometimes this means pushing back against the AI's solution, other times it means reframing the question, and other times it's just giving up and doing the work yourself. I keep seeing all these impressive toy demos, and my experience (Angular and Flask dev) seems to indicate that it is not going to replace any subject matter expert anytime soon. (And I am referring to all three major AI players, as I regularly and religiously test all their releases.)

>And people also overestimate the utility of a solution that's randomly wrong. It's exceedingly difficult to build reliable systems when you're stacking a 5% wrong solution on another 5% wrong solution on another 5% wrong solution...

I call this the merry-go-round of hell mixed with a cruel hall of mirrors. The LLM spits out a solution with some errors; you tell it to fix the errors; it produces other errors or totally forgets important context from one prompt ago. You then fix those issues, and it introduces other issues or messes up the original fix. Rinse and repeat. God help you if you don't actually know what you are doing: you'll be trapped in that hall of mirrors for all of eternity, slowly losing your sanity.


And here we are, arguing for internet points.


Much more meaningful to this existentialist.



