
> I assume if their consumer facing AI is using Claude at all it would be a Sonnet or Haiku model from 1+ versions back simply due to cost.

I would assume quite the opposite: it costs more to support and run inference on the old models. Why would Anthropic make inference cheaper for others but not for Amazon?




