I'm at the point where I'm quite happy to let them screw themselves with it. I'm very good at cleaning up disasters and get paid a hell of a lot for it, because the deciding factor isn't your ability to use an LLM but knowing what the hell you are doing. We've had quite a few disasters from both inexperienced and experienced people throwing stuff into an LLM and assuming that what comes out has any veracity or authority.

I tried warning people at first and pushing for validation, but I was pooh-poohed as a spoilsport luddite with what was basically a faith argument. Not my fucking funeral!