sabas123 | 1 day ago | on: Modern-Day Oracles or Bullshit Machines? How to th...
If you build an LLM whose design goal is to state "I do not know" for any answer that is not directly in its training set, then all of the above statements don't hold.
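A minimal sketch of what I mean, assuming you approximate "not in the training set" with a confidence threshold on the model's own answer (generate_with_confidence and the 0.8 cutoff are hypothetical placeholders, not any particular model's API):

    from dataclasses import dataclass

    @dataclass
    class Draft:
        text: str
        confidence: float  # e.g. exp(mean token log-prob), in [0, 1]

    def generate_with_confidence(prompt: str) -> Draft:
        # Hypothetical model call: returns an answer plus a confidence score.
        raise NotImplementedError("plug in your model here")

    def guarded_answer(prompt: str, threshold: float = 0.8) -> str:
        # Refuse rather than guess when the model is not confident enough.
        draft = generate_with_confidence(prompt)
        if draft.confidence < threshold:
            return "I do not know."
        return draft.text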