A weird question, but since there's another article on HN right now about programming language energy efficiency (https://news.ycombinator.com/item?id=24816733): any idea whether going from 9fps to 1840fps consumes the same power, 200x the power, or somewhere in between?
Great question, now I wish I'd recorded power consumption for all these experiments.
Judging from cumulative hours of watching the output of nvidia-smi, I've definitely seen a linearish relationship between utilization and power draw (with a non-zero idle floor of 30-40W).
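If anyone wants to log this properly rather than eyeball it, here's a minimal sketch of the kind of polling loop I mean. It assumes an NVIDIA GPU with nvidia-smi on the PATH; the query fields (timestamp, utilization.gpu, power.draw) are standard nvidia-smi query-gpu fields, and the file name is just a placeholder.

    # Poll GPU utilization and power draw once a second and append to a CSV,
    # so the utilization/power relationship can be plotted afterwards.
    import subprocess
    import time

    QUERY = "--query-gpu=timestamp,utilization.gpu,power.draw"

    with open("gpu_power_log.csv", "a") as log:
        while True:
            # One sample per call; nvidia-smi prints a single CSV row.
            row = subprocess.run(
                ["nvidia-smi", QUERY, "--format=csv,noheader"],
                capture_output=True, text=True, check=True,
            ).stdout.strip()
            log.write(row + "\n")
            log.flush()
            time.sleep(1)

(nvidia-smi can also do the looping and file output itself with its -l and -f flags; the script is just easier to post-process alongside fps numbers.)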
I see Rust is almost equal to C, if not better, in the graph. However, I think that with equally skilled programmers in either language, the Rust programmer would spend more 'energy' programming and iterating than the C programmer, though one could then argue that the C program will use more 'energy' downstream if bugs slip in. In any case, an eye-opening metric on something I, and I'm sure many others, take for granted. Cool.