Let us assume that 100% of all mining rewards go to electricity (and none to buying mining equipment, paying employees, servers, or networking), and let us further assume the electricity costs $0.01/kWh, which is cheaper than one can buy it basically anywhere.
From flippening.watch, BTC paid miners $20,797,200 yesterday, which at that price buys 2,079,720,000 kWh. This translates to approximately 87 GW of electricity, drawn 24/7. 87 GW is approximately the output of 22 Palo Verde nuclear plants; Palo Verde, in Arizona, can output up to 3,937 MW according to the US Energy Information Administration: https://www.eia.gov/tools/faqs/faq.php?id=104&t=3
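The arithmetic above can be sketched in a few lines. A minimal sketch, using only the comment's own figures (the flippening.watch payout and the deliberately low $0.01/kWh assumption):

```python
# Sketch of the revenue-to-power conversion above. All inputs are the
# comment's own figures; $0.01/kWh is an intentionally low assumption.
daily_payout_usd = 20_797_200     # BTC paid to miners yesterday (flippening.watch)
price_usd_per_kwh = 0.01          # assumed electricity cost

kwh_per_day = daily_payout_usd / price_usd_per_kwh  # ~2,079,720,000 kWh/day
avg_power_gw = kwh_per_day / 24 / 1e6               # kWh/day -> kW -> GW

palo_verde_mw = 3937              # Palo Verde capacity, per the EIA
plants = avg_power_gw * 1000 / palo_verde_mw        # equivalent plant count

print(avg_power_gw, plants)       # about 86.7 GW, i.e. roughly 22 Palo Verdes
```

The key step is dividing daily energy by 24 hours to get average power; skipping that division (treating kWh as kW) is exactly the kind of units slip the thread is about.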
In other words, Bitcoin is a drop in the bucket. Let's put this myth of massive energy usage to rest and, at the risk of being too snarky, ask our journalists to learn basic math.
edit - I made a math mistake, which I have corrected. The overall point, while I was off by several orders of magnitude, still holds.
edit 2 - I retract the entire comment. It appears the article is true, and that, ironically, I made the same mistake I accused the journalist of. I'm going to leave the comment up for posterity, but I retract it.
Your contention that the Guardian article made an "outrageously large" estimate of bitcoin electricity consumption is contradicted by your own estimate, after you fixed the units-conversion mistake in it.
My calculator and Google (search for "2079720000 kWh per 24 hours to MW") say it's more than 86,000 MW (so ~86 GW).
edit: corrected units from TW to GW, as pointed out by philipkglass
EDIT: This is too perfect. 1053r calls out journalists for making a basic math mistake. In doing so, he makes a basic math mistake. I ask him about it, then get downvoted by HN.