On: Are OpenAI and Anthropic losing money on inference...

doctorpangloss | 18 days ago
I’m pretty sure input tokens are cheap because the providers want to ingest the data for training later, no? They want huge contexts to slice up.
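For concreteness, here's the arithmetic behind "input tokens are cheap": major providers price input tokens several times below output tokens. A minimal sketch of that asymmetry; the per-million-token prices below are hypothetical placeholders, not any provider's actual rate card.

  # Hypothetical rate card illustrating the input/output price gap;
  # these numbers are placeholders, not real provider pricing.
  INPUT_PRICE_PER_MTOK = 3.00    # $ per million input tokens (assumed)
  OUTPUT_PRICE_PER_MTOK = 15.00  # $ per million output tokens (assumed)

  def request_cost(input_tokens: int, output_tokens: int) -> float:
      """Cost of one request under the hypothetical rate card above."""
      return (input_tokens * INPUT_PRICE_PER_MTOK
              + output_tokens * OUTPUT_PRICE_PER_MTOK) / 1_000_000

  # Long-context request: huge prompt, short answer.
  print(request_cost(input_tokens=100_000, output_tokens=500))   # ~$0.31
  # Generation-heavy request: short prompt, long answer.
  print(request_cost(input_tokens=500, output_tokens=100_000))   # ~$1.50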
awwaiid | 17 days ago
AFAIK all the large providers flipped the default to contractually NOT train on your data. So no, collecting training data isn't a factor in input-token pricing.