Hacker News
Single medium-sized GPT-5 response can consume near 40 watt-hours of electricity (tomshardware.com)
4 points by myaccountonhn 4 days ago | hide | past | favorite | 2 comments




Says "medium" is 1,000 tokens. If I assume that takes 50 seconds: 40 Wh / 50 s = 2,880 watts.

Seems a bit much? Even if I assume a third of that is the amortised cost of the training runs and a sixth is the cooling system, that still leaves a mean inference draw of 2.4 times the H200's 600 W maximum TDP*.
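The arithmetic above, as a quick sketch (the 1,000-token / 50-second figures are this comment's assumptions, not from the article):

```python
# Sanity-check of the headline figure, under the comment's assumptions:
# a 1,000-token "medium" response taking 50 s, at the claimed 40 Wh.
energy_wh = 40.0
gen_time_s = 50.0
power_w = energy_wh * 3600 / gen_time_s  # 40 Wh over 50 s = 2880 W mean draw

# A third for amortised training plus a sixth for cooling leaves
# half of the draw attributed to the inference hardware itself:
inference_w = power_w / 2                # 1440 W
h200_tdp_w = 600.0                       # H200 NVL maximum TDP
ratio = inference_w / h200_tdp_w
print(ratio)                             # 2.4
```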

Semi-related to the main theme: why are the links so weird? For me, at least, "the published findings" links to chrome-extension://efaidnbmnnnibpcajpcglclefindmkaj/https://arxiv.org/pdf/2505.09598v2 — and I opened this in Safari.

Going to the corresponding arxiv.org page manually, that's not even "the published findings" for the 40 Wh claim; the paper pre-dates GPT-5.

And the dashboard they link to doesn't even seem to claim 40 Wh for a "medium"-length GPT-5 response. Unless I'm misreading it somehow, it's claiming 19.7 Wh for a "medium"-length query: https://app.powerbi.com/view?r=eyJrIjoiZjVmOTI0MmMtY2U2Mi00Z...

And I can believe 19.7 Wh. Same maths as above (including the training and cooling discount): if that takes 50 seconds to get an answer, then it's about 1.2 × (H200 TDP).

* https://www.techpowerup.com/gpu-specs/h200-nvl.c4254


Also, for comparison: 3,000 watts is roughly 2 washing machines or 3 microwaves, i.e. close to most of the devices in my kitchen running at the same time.


