It’s possible to prove.

Use an LLM to do a real-world task that you should be able to achieve by reasoning.




> Use an LLM to do a real-world task that you should be able to achieve by reasoning.

Such as explaining the logical fallacies in this argument and the one above?


Take anything and see how far you get before you have to really grapple with hallucination.

Once that happens, your mitigation strategy will end up being the proof.
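
Something like this minimal sketch is what I mean (assuming the openai Python client and an OPENAI_API_KEY in the environment; the questions and model name are just placeholder examples, not a recommendation):

    # Give the model a task, then verify its answers against ground truth
    # instead of trusting them. The verification step is the mitigation.
    from openai import OpenAI

    client = OpenAI()

    # Tasks with independently known answers (placeholders).
    tasks = {
        "What year was the first transatlantic telegraph cable completed?": "1858",
        "What is 17 * 24?": "408",
    }

    for question, expected in tasks.items():
        response = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": question}],
        )
        answer = response.choices[0].message.content
        # You end up relying on this check, not on the model's own confidence.
        verdict = "OK" if expected in answer else "CHECK BY HAND"
        print(f"{verdict}: {question} -> {answer!r}")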


I mean, I know you're joking, but yes, it would be able to do that.



