AYHL, 11 months ago | on: Reasoning models don't always say what they think
To me, CoT is nothing but lowering the learning rate and increasing the number of iterations in a typical ML model. It basically forces the model to take a small step at a time and try more times to increase accuracy.
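The optimization side of this analogy can be sketched with plain gradient descent on a toy quadratic: a step that is too large overshoots and diverges, while many small steps converge. This is only an illustration of the learning-rate/iterations trade-off the comment invokes, not a claim about how CoT actually works internally; the function and hyperparameters are arbitrary choices for the demo.

```python
def gradient_descent(lr, steps, x0=5.0):
    """Minimize f(x) = x^2 (gradient 2x) starting from x0."""
    x = x0
    for _ in range(steps):
        x -= lr * 2 * x
    return x

# One big step repeated a few times: overshoots the minimum and blows up.
big_steps = gradient_descent(lr=1.1, steps=20)

# Many small steps: each iteration makes modest progress, but the
# final answer lands very close to the true minimum at x = 0.
small_steps = gradient_descent(lr=0.05, steps=200)

print(abs(big_steps))    # large — diverged
print(abs(small_steps))  # tiny — converged
```

The analogy maps "small step at a time" to the small learning rate and "try more times" to the larger iteration count: the per-step change shrinks, but total accuracy improves.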