Hacker News
EVa5I7bHFq9mnYK on April 24, 2024 | on: Snowflake Arctic Instruct (128x3B MoE), largest op...
I've seen estimates that training GPT-3 consumed about 10 GWh, while inference by its millions of users consumes about 1 GWh per day. At that rate, cumulative inference energy overtakes the one-time training energy in roughly ten days, so inference CO2 costs dwarf training costs.
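The arithmetic behind that claim can be sketched quickly. These figures are the comment's rough estimates, not measured data:

```python
# The comment's estimated figures, not measured data.
TRAINING_GWH = 10.0           # estimated one-time energy to train GPT-3
INFERENCE_GWH_PER_DAY = 1.0   # estimated daily inference energy across all users

# Days until cumulative inference energy equals training energy.
break_even_days = TRAINING_GWH / INFERENCE_GWH_PER_DAY
print(break_even_days)  # 10.0

# After one year, inference energy exceeds training energy many times over.
yearly_ratio = (INFERENCE_GWH_PER_DAY * 365) / TRAINING_GWH
print(yearly_ratio)  # 36.5
```

On these estimates, a single year of inference uses over 36x the training energy, which is why the comment treats training as the smaller contributor.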