You really have to try hard to make a model hallucinate when asked to summarize an email. I think they didn't mention it because they can't guarantee it 100%, but it's virtually a non-issue for such a task.


