If a typical neuron requires ~10^7 ATP per activation, if it takes 30.5 kJ/mol to charge ATP, and if a typical neuron makes ~100 synaptic connections, each contributing one FLOP per activation, then I _think_ a human neuron is about 500 times as efficient as a GV100, at roughly 200,000 GFLOPS per watt.
1E7 ATP = 1 activation = 100 FLOP
30.5 kJ = 1 mol ATP = 6E23 ATP
1 kJ = 0.28 Wh
Neuron:
2E5 GFLOPS = 1 W
GV100 "Tensor Core":
4E2 GFLOPS = 1 W
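The arithmetic above can be checked with a few lines of Python (all inputs are the assumptions stated in this thread, not measured values):

```python
# Back-of-envelope check of the neuron-vs-GV100 estimate above.
# Assumptions (from the text): 1e7 ATP per activation, 100 FLOP per
# activation, 30.5 kJ per mole of ATP, ~400 GFLOPS/W for GV100 tensor cores.

AVOGADRO = 6.022e23                 # ATP molecules per mole

atp_per_activation = 1e7
flop_per_activation = 100
kj_per_mol_atp = 30.5

joules_per_atp = kj_per_mol_atp * 1e3 / AVOGADRO
joules_per_flop = joules_per_atp * atp_per_activation / flop_per_activation
neuron_gflops_per_watt = 1 / joules_per_flop / 1e9   # FLOP/J == FLOPS per watt

gv100_gflops_per_watt = 4e2
advantage = neuron_gflops_per_watt / gv100_gflops_per_watt

print(f"neuron: {neuron_gflops_per_watt:.3g} GFLOPS/W")   # ~2e5
print(f"neuron / GV100: ~{advantage:.0f}x")               # ~500x
```

Note how sensitive the result is to the inputs: with 10^6 ATP per activation instead of 10^7, every output shifts by a factor of 10.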
It's more useful to look at synaptic operations. 1 FLOP per synaptic activation is a vast underestimate of what happens computationally in the brain; synaptic activations also have many homeostatic side effects.
Nor can one discount the energy cost of the memory needed to feed the tensor cores.
* memory isn't accounted for
* FLOP per activation is estimated at 100
* neurons can vary by a factor of ~1000 in how much energy they require to activate, based on their size and speed
* ATP: I'm not really sure what to make of the energy here; I used 30.5 kJ/mol. Wikipedia says "The hydrolysis of ATP into ADP and inorganic phosphate releases 30.5 kJ/mol of enthalpy, with a change in free energy of 3.4 kJ/mol." And what's the efficiency of our digestive & recharging systems -- maybe 5%-30%?
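To see how much that last caveat moves the bottom line, here's a rough sensitivity sketch. The ~500x baseline is the earlier estimate, and the 5%-30% metabolic-efficiency range is the guess above; both are assumptions, not measurements:

```python
# Rough sensitivity check: how the ~500x neuron advantage shifts if the
# neuron is charged for metabolic conversion losses (food energy -> ATP),
# using the 5%-30% efficiency range guessed above. Illustrative only.

baseline_advantage = 500            # neuron vs GV100, measured at the ATP level

for metabolic_efficiency in (0.05, 0.30):
    # Measured at food-energy input rather than at ATP, the neuron's
    # advantage shrinks in proportion to the conversion efficiency.
    effective = baseline_advantage * metabolic_efficiency
    print(f"efficiency {metabolic_efficiency:.0%}: ~{effective:.0f}x")
```

Even at the pessimistic 5% end the neuron still comes out ~25x ahead of the GV100 on this accounting, so the sign of the conclusion survives the uncertainty.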
I think you have a much more sophisticated understanding of neurons than I do!
This Mead paper suggests "...a factor of 10 million more efficient than the best digital technology that we can imagine" -- which I interpreted as a gross approximation at the chip level, assuming a 10nm process, not considering any particular architecture, and not including the roughly 2-3 orders of magnitude of reduced efficiency at the system level.
Efficient Processing of Deep Neural Networks: A Tutorial and Survey
Vivienne Sze, Yu-Hsin Chen, Tien-Ju Yang, Joel Emer