Hacker News

LLMs are, at least at present, precisely the kind of thing where trying to use an abstraction without understanding what it actually does is exactly what's going to create a mess in the long run.
