As little as 2 years ago, $50-$75 was about the cheapest I could put a project on the internet, wired or wireless. Today it's about $5. You can buy the whole little computer in the form of a TP-Link router running OpenWRT for about $22.
I have seen the future and it's made of wifi.
How do you do it for $5 (or, if that was an exaggeration, at the cheapest possible cost)? I would like to do this.
If you want the whole computer instead of just wifi, try this:
OpenWRT preinstalled for your convenience.
That Ralink USB dongle looks like a great way to hook the Raspberry Pi up for internet, too.
One plot point is about "smart dust" being used for total surveillance (everyone's facial expression, heart rate, etc. being monitored for irregularities at all times) and then subverted via a backdoor thousands of years old that's activated by the right sequence of body movements...
Just because it is easy for you to imagine such a scenario doesn't mean that it is likely.
If the "horrifying...total surveillance" remark put you off, that was not at all meant to imply "ubiquitous networking must be bad because of this story, so let's prohibit it!".
In fact, in the story the smart dust, via the backdoor, provides the ability for the hero to break the (very, very nasty) oppressive regime.
For a more near-future example, there's the network Vinge presents in Rainbows End, especially given the work that's been done with wifi meshes. Granted, I haven't heard any news about the idea lately, but as I recall, Google has one in Mountain View delivering free wifi to everyone in the area.
I believe this part (or something very much like it) is actually already feasible today.
In 1985, the physicist Richard Feynman calculated that the energy efficiency of computers could improve over then-current levels by a factor of at least a hundred billion (10^11), and our data indicate that the efficiency of computing devices progressed by only about a factor of 40,000 from 1985 to 2009.
So this exponential trend is not gonna last long term... A decade at most.
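A quick back-of-the-envelope on what those two quoted figures imply, taking them at face value (a rough sketch only; both numbers are approximate):

    import math

    # Using only the figures quoted above (both approximate).
    feynman_headroom = 1e11  # Feynman's 1985 bound: at least a 10^11 improvement possible
    realized_gain = 4e4      # observed efficiency improvement, 1985-2009

    remaining = feynman_headroom / realized_gain
    print(f"remaining headroom: ~{remaining:.1e}x")            # ~2.5e+06x
    print(f"doublings left:     ~{math.log2(remaining):.0f}")  # ~21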
"Theoretically, room‑temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media."
At the Landauer limit, that would put a piece of memory changing at 1 Gbps at only 2.85 trillionths of a watt of power.
Past that, modern CPUs are still horrendously inefficient with respect to how much data they're shuffling around to perform computations, since they're optimised for getting maximum performance out of a single thread.
A chip operating near the Landauer limit and efficiently computing useful results on top of that makes our current CPUs look like abacuses.
1 billion bit flips per second at 2.85 trillionths of a watt
-> a 1-billion-transistor CPU uses 2.85 trillionths of a watt at 1 Hz
-> a 1-billion-transistor CPU uses 2.85 thousandths of a watt at 1 GHz
-> a 1-billion-transistor CPU uses ~1/100 of a watt at 3.5 GHz
So that puts us a factor of 10,000 away (billion-transistor, 3.5 GHz CPUs use ~100 watts at peak), assuming every transistor switches every cycle. However, they don't: most of the chip is cache, most of the logic isn't in use at any given moment, etc. Call it 5 orders of magnitude, or 100,000x.
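For anyone who wants to check that arithmetic, here is the same scaling written out from the Landauer bound of kT·ln 2 per bit flip (a sketch only; room temperature is assumed, and the transistor count and clock rates are just the round numbers used above):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0           # room temperature, K

    # Landauer bound: minimum energy dissipated per bit flipped/erased.
    e_bit = k_B * T * math.log(2)  # ~2.87e-21 J

    def landauer_power(bit_flips_per_second):
        """Minimum power (watts) for this many bit flips per second at the Landauer bound."""
        return e_bit * bit_flips_per_second

    print(landauer_power(1e9))          # ~2.9e-12 W: a billion flips/s (the "trillionths of a watt" figure)
    print(landauer_power(1e9 * 1e9))    # ~2.9e-03 W: 10^9 transistors each flipping at 1 GHz
    print(landauer_power(1e9 * 3.5e9))  # ~1.0e-02 W: the same chip at 3.5 GHz

Against a real ~100 W part, that last number is the factor of roughly 10,000 mentioned above, before discounting for transistors that aren't actually switching every cycle.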
The paper you're referencing seems to ignore leakage current and the like, but I don't think you'd want to count that against it in a practical analysis of the field, since those other factors are obviously going to improve as well.
Add in how inefficient our modern computation is, in terms of how many bits we're flipping around for the result we're computing, and I stand by my comment: theoretically optimal computing devices are going to make our current CPUs look like abacuses. Though we'll have to break out of our x86 and even CPU model to get there.
Furthermore, the things that have 'changed everything' in the last few years are all user-interface related. As we get better at recognizing and generating human speech, that's when things will truly change.
However, one thing I do have to ask those who know better than me: Does the Landauer Limit imply that we won't be able to optimize any further, at a certain point, in terms of hardware capability? And would this include optimizing for size? In other words, if we hit that limit with something the size of Watson, could we reasonably still expect it to reach the size of a cell phone, half a century after that?
Do we really have chips that update a billion bits per second for only a few trillionths of a watt?
Also, reversible computing would take things much further without violating Landauer.
But if I'm right and "economic effect" translates roughly into broadly perceived wealth (which I'm not sure about), then energy efficiency isn't rising in any meaningful way.
Which computations? Computations to render another monster on the screen of someone playing Doom 3? Near-zero. Computations run on a quant's computer to hint at a market trend? Potentially massive.
It isn't a question you can answer in general. I doubt it's even a question you can approximate an answer to in any sane fashion.
You could compare the hardware requirements of today's best-selling 3D shooter games to the hardware requirements of the original Doom. One hour's worth of gameplay uses x CPU instructions and y kWh of energy, and earns the maker of the game z dollars. You could do something similar to compare today's best quant trading desks to those of 10 or 15 years ago.
Actually, you could just as well ignore CPU instructions entirely and just compute dollars earned per kWh for a few computing-heavy industries. Obviously that's very crude and there are lots of missing variables, but I don't think it's meaningless.
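As a sketch of what that crude metric might look like in practice (every number below is a made-up placeholder, purely to show the shape of the calculation, not real data):

    # Hypothetical "dollars earned per kWh of computation" comparison.
    # All figures are invented placeholders for illustration only.
    industries = {
        # name: (revenue attributable to the computation in $, energy used in kWh)
        "3D shooter (modern)":   (1_000_000, 50_000),
        "3D shooter (Doom-era)": (  100_000,  2_000),
        "quant trading desk":    (5_000_000, 80_000),
    }

    for name, (dollars, kwh) in industries.items():
        print(f"{name:24s} ~${dollars / kwh:,.0f} earned per kWh")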