Hacker News

Perplexity does a pretty good job on this. I find myself reaching for it first when looking for a factual answer or doing research. It can still make mistakes, but the hallucination rate is very low; it feels comparable to a Google search in terms of accuracy.

Pure LLMs are better for brainstorming or thinking through a task.
