jgeralnik | 2 days ago | on: Prompt caching for cheaper LLM tokens
Anthropic requires explicit cache markers but will "look backwards" some amount, so you don't need to land the marker on the exact split point to get cached tokens.
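A minimal sketch of what such an explicit marker looks like in Anthropic's Messages API: a `cache_control` breakpoint on a content block marks the end of a cacheable prefix. The model name and helper function here are illustrative, and the payload would be sent via the `anthropic` SDK or a raw HTTP call.

```python
def build_cached_request(big_context: str, question: str) -> dict:
    """Build a Messages-API payload with a cache breakpoint placed
    after the large, stable context block."""
    return {
        "model": "claude-sonnet-4-5",  # illustrative model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": big_context,
                # Marks everything up to and including this block as a
                # cacheable prefix; per the comment above, the API also
                # checks earlier positions, so cache hits don't require
                # landing on exactly the same split every call.
                "cache_control": {"type": "ephemeral"},
            }
        ],
        "messages": [{"role": "user", "content": question}],
    }
```

With the `anthropic` SDK this payload would be passed as `client.messages.create(**payload)`; the cached prefix is then reused on subsequent calls that share it.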