What we can do is be neutral about how it's used, and instead make sure that electricity is appropriately priced to cover its externalities. That means a carbon tax. The same solution is the right one for Bitcoin as it is for the other silly computer uses mentioned in this article. If it becomes too expensive to deliver Snapchat when electricity is properly priced, then Snapchat won't exist. And so on.
So the point of the tax (reducing consumption) unfairly hits the poor.
And all this also avoids the core issue of actually deciding whether the unpaid externality of a unit of carbon is $200, $20, or $0.02; if you can't even get people to universally agree that there is a crisis in the first place, then how on earth are you going to get them to agree on the monetary value of said crisis?
A carbon tax would be regressive for the American middle class. On the other hand, it would in fact be beneficial for the global south/poor. That is one of the difficult things here: a carbon tax would hit the middle class (note I didn't say working class) in the US, which has quite a bit of power globally through its ability to elect the US president, so implementing any sort of carbon tax without giving them some deal will be moot politically. It would hit the working poor in the US pretty hard as well, although a good fraction of them don't own a car, so they wouldn't face a tax on the largest source of GHGs in the US.
About 19% of people who earn less than $25K per year (yes, very poor) don't own a personal vehicle.
Since many US cities had public transit a century ago (and don't today), I wonder if you mean technically viable, economically viable, or politically attainable. I'm guessing the latter, and it's sad.
This seems like the most sensible route, rather than trying to dictate what is 'ok' and what is not.
That point is debatable, because Bitcoin and Snapchat are different in kind.
For Bitcoin to function, it must be deliberately very computationally expensive. Generally speaking, other software limits deliberately costly operations to certain functions (e.g. password hashing), and inefficiencies caused by developers uninterested in performance can be reworked. Even the resource draw of high-end gaming for a few (or many) hours a week and a 24/7 miner are distinguishable in both raw and cost/benefit terms.
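To make the "deliberately expensive" part concrete, here's a toy proof-of-work sketch in Python. The difficulty numbers are made up for illustration; real Bitcoin mining uses double SHA-256 over block headers at vastly higher difficulty.

    import hashlib
    import itertools

    def mine(block_data: bytes, difficulty_bits: int) -> int:
        # Find a nonce so sha256(block_data + nonce) falls below a target.
        # Each extra bit of difficulty roughly doubles the expected work --
        # the cost is the point, not an accident of lazy engineering.
        target = 2 ** (256 - difficulty_bits)
        for nonce in itertools.count():
            digest = hashlib.sha256(block_data + str(nonce).encode()).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce

    print(mine(b"example block", 16))  # quick
    print(mine(b"example block", 20))  # ~16x slower on average

Contrast that with password hashing, where the expense is confined to one rare operation (a login attempt) rather than being repeated globally around the clock.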
Technically I could (and probably would) still use my old Galaxy Nexus if I had a secure browser/media player and only browsed sites at the "complexity" of HN, had a chat app that wouldn't randomly hang the UI for seconds when task-switching (it would probably help to have no emote packs, videos disguised as GIFs, or perfectly viable 720p pictures as the inline default), and the 3G network around here were a little better... And yeah, I (and a lot of other people) wouldn't miss a thing. GOOG and co. might have a little less cash in store, and the craft of UX designer might pay a little worse. (BUT: all of this would have catastrophic consequences for the service-oriented growth economy of the US, which might not even hit Joe Doe very much if properly managed, but would remove a little power from all the epsteins of this world, which of course can't happen...)
^ this much (not to scale).
Yes, cloud computing sucks power (which is then vented to the outside world; it could be recycled _and_ cut latency),
but you know what pushes out more CO2?
Poor house insulation.
I'm not unsympathetic to the need to run computing from renewable energy, but this article seems like a stretch, especially when it tries to link in government surveillance, etc.
And you know what emits more CO2 than poor insulation? Transport! And what emits more CO2 than transport? China!
With that logic, we'll never cut emissions anywhere.
For household applications, virtually any kind of efficiency gain that computerization gives us would offset the energy consumed in computation.
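A quick back-of-envelope check (all numbers here are rough assumptions, not measurements):

    controller_watts = 5                  # assumed always-on smart-home hub
    controller_kwh = controller_watts * 24 * 365 / 1000  # ~44 kWh/yr

    heating_kwh = 10_000                  # assumed annual heating energy for a house
    savings_fraction = 0.05               # assume a modest 5% gain from smarter control
    saved_kwh = heating_kwh * savings_fraction            # 500 kWh/yr

    print(saved_kwh / controller_kwh)     # ~11x: savings dwarf the compute cost

Even a small percentage gain on heating swamps the controller's own draw by an order of magnitude.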
The problem is not the presence of computing, it's that fossil energy does not pay for its externalities.
Which isn't to say that computation-intensive ads aren't the worst thing ever... And it does seem that some of the most hype-driven applications of digitization rely on exceedingly (and often intentionally) inefficient algorithms (blockchain and deep learning), but overall digitization can often be done with very little energy usage for the same benefit. I'm reminded of this cellphone that works on about 4 milliwatts (although, granted, it harvests its power wirelessly): https://www.wired.com/story/this-cell-phone-can-make-calls-e...
1) digitization (which includes tracking objects in the real world with IoT tech) increases efficiency, which results in moving fewer real objects around in the world, thus reducing total energy consumption. Think of Netflix vs manufacturing and shipping DVDs to stores then driving to the store to buy them.
2) data centers are far from the biggest carbon creators. Transportation and construction are, by far, #1
3) data centers can run off green energy. Combustion engine cars cannot.
4) data centers can be built in far away places that provide the most efficient and greenest energy
2) Sure, but data center consumption is added on top of transportation, construction...
3) Electric cars can run on green energy too... that does not change point 2).
4) You're still only trying to compensate for 2). And even data centers running on green energy need resources and have an impact.
The idea is not to get rid of digitization, but to keep its consequences in mind and not treat it as a magic thing that will solve all our problems and have no bad consequences.
It'd be far more effective to have a Renewables revolution which pushes the scale of wind, solar, and batteries much higher. That would be far more cost-effective than a Luddite revolution which would ban computationally and data intensive practices in all the most valuable companies.
Electron arguments aside, computers need matter and a lot of that matter in them is rare, which requires things like strip mining and other non-Earth-friendly activities. If we manage to pull off a walking/talking/joke cracking AI, it and its little AI buds will likely demand every erg of power they can get. Sorta like my dog at dinner time.
Also, the amount of compute we are creating is accelerating at a non-linear rate, which is why there was that discussion about Bitcoin using so much energy by such-and-such a point in the future.
The speed of networking is not increasing at the same rate as compute.
Actually this is a completely inaccurate representation. The network to understand a human face can be transferred to a computer in milliseconds. The network to understand a human face gets baked into a human over many many months. Developing that network took a computer millions of images and several hours. Developing that network took biology many eons and billions of lifeforms.
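To put rough numbers on the "milliseconds" claim (the model size and link speed here are assumptions for illustration, not figures from the article):

    model_bytes = 100 * 1024**2       # assume a ~100 MB face-recognition model
    link_bits_per_s = 10 * 10**9      # assume a 10 Gbit/s datacenter link

    transfer_s = model_bytes * 8 / link_bits_per_s
    print(f"{transfer_s * 1000:.0f} ms")   # ~84 ms to copy the trained network

Training took hours over millions of images; copying the result is nearly free, and it can be copied to a million machines at once. Biology has no equivalent handoff.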
EDIT: Following the general request to write substantive comments, let's extend this.
The problem is that the author sees the cost of computers in other applications, but doesn't see the benefits. For example, using machine learning to detect faces has a cost, but it can be applied to many things: from trivial applications like decorating your face on the phone, to security applications like face recognition of fugitives from justice, to more realistic faces in movies, to deepfakes. You may like some of them and dislike others, but for some people the new technology provides benefits that are hopefully greater than the cost.
For a journalist, probably the most important use of a computer is to write an article. Perhaps also to keep track of the small pieces of info that must go into the article, and to do research on the internet instead of traveling to a library. And a modern cellphone is quite a powerful computer. You could perhaps replace the cellphone with a landline phone, and use a lot of cards to store contacts and information. But I think the journalist will immediately see the benefits of using a computer to write articles instead of a typewriter.