
A weird question, but since there's another article on HN right now about programming language energy efficiency (https://news.ycombinator.com/item?id=24816733): any idea whether going from 9fps to 1840fps consumes the same power, roughly 200x the power, or somewhere in between?



Great question; now I wish I'd recorded power consumption for all these experiments. Judging from cumulative hours of watching nvidia-smi output, I've definitely seen a roughly linear relationship between utilization and power draw (with a non-zero floor of 30-40 W).
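
For what it's worth, here's a rough way to estimate it yourself (just a sketch, assuming a machine with nvidia-smi on PATH; the query fields are the standard --query-gpu ones, and the run length and interval are made up):

    import subprocess, time

    # Sample GPU power draw (watts) once per interval via nvidia-smi and
    # integrate it into an energy estimate in joules (1 W * 1 s = 1 J).
    def measure_energy(seconds=60, interval=1.0):
        total_joules = 0.0
        for _ in range(int(seconds / interval)):
            out = subprocess.check_output([
                "nvidia-smi",
                "--query-gpu=power.draw",
                "--format=csv,noheader,nounits",
            ]).decode()
            watts = float(out.strip().splitlines()[0])  # first GPU only
            total_joules += watts * interval
            time.sleep(interval)
        return total_joules

Run it once while the 9fps version renders and once while the 1840fps version does, then divide each total by the number of frames rendered to compare energy per frame rather than raw power.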


I see Rust is almost equal to C, if not better, in the graph. However, I think with equally skilled programmers in either language, the Rust programmer would spend more 'energy' programming and iterating than the C programmer, though one could then argue that the C program will use more 'energy' downstream if bugs slip in. In any case, it's an eye-opening metric on what I, and I'm sure many others, take for granted. Cool.


EDIT: I think Zig would come out pretty well here too:

https://twitter.com/andy_kelley/status/1317586767260774400



