
Not true if you tell it to first explain step by step (chain of thought) and only then answer.
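A minimal sketch of the "explain step by step, then answer" prompting pattern described above, assuming the OpenAI Python SDK (openai >= 1.0) with an API key in the environment; the model name and the exact wording of the instruction are placeholders, not anything stated in the thread:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    question = (
        "A bat and a ball cost $1.10 in total. The bat costs $1.00 "
        "more than the ball. How much does the ball cost?"
    )

    # Direct prompt: the model answers immediately.
    direct = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": question}],
    )

    # Chain-of-thought prompt: ask for step-by-step reasoning first,
    # and only then the final answer.
    cot = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": question
            + "\n\nFirst explain your reasoning step by step, "
              "and only then give the final answer on the last line.",
        }],
    )

    print(direct.choices[0].message.content)
    print(cot.choices[0].message.content)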



I disagree; these kinds of models don’t do logical reasoning. What they do is predict the next word.

You can get it to give you its reasoning, but it’s bullshit dressed up to be believable.



