Hacker News
cypress66 on March 14, 2023 | on: GPT-4
Not true if you tell it to first explain step by step (chain of thought) and only then answer.
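The technique the comment refers to is chain-of-thought prompting: instruct the model to lay out its reasoning step by step before committing to an answer. A minimal sketch of such a prompt, assuming an OpenAI-style chat message format (the question and instructions are illustrative, and no API call is made here):

```python
# Sketch of a chain-of-thought prompt: ask the model to reason step by
# step first, and only state the final answer at the end.
def build_cot_messages(question: str) -> list[dict]:
    return [
        {
            "role": "system",
            "content": (
                "First explain your reasoning step by step. "
                "Only after that, state the final answer on its own line."
            ),
        },
        {"role": "user", "content": question},
    ]

messages = build_cot_messages(
    "If I have 3 apples and buy 2 more, how many do I have?"
)
# These messages would then be passed to a chat-completion endpoint,
# e.g. client.chat.completions.create(model="gpt-4", messages=messages)
```

The key detail is ordering: the model generates the reasoning tokens before the answer token, so the answer can condition on them.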
slashdev on March 15, 2023
I disagree; these kinds of models don't do logical reasoning. What they do is predict the next word.
You can get it to give you its reasoning, but it's bullshit dressed up to be believable.