

The Computing Trend that Will Change Everything - Brajeshwar
http://www.technologyreview.in/business/40016/

======
noonespecial
Ubiquitous wireless networking does more to create a nano-sensors-everywhere
future than cheap supercomputing does. Microcontrollers have had more than
enough computing power to tell me the temperature in the back yard on battery
power for decades now. It was sending the data anywhere useful that was hard.

As little as 2 years ago, $50-$75 was about the cheapest I could put a project
on the internet, wired or wireless. Today it's about $5. You can buy the whole
little computer in the form of a TP-Link router running OpenWRT for about $22.

I have seen the future and it's made of wifi.

~~~
india
> As little as 2 years ago, $50-$75 was about the cheapest I could put a
> project on the internet, wired or wireless. Today its about $5.

How do you do it for $5 (or, if that was an exaggeration, at the cheapest
possible cost)? I would like to do this.

~~~
noonespecial
Here you go. How about $1.46? This assumes, of course, that your project has
USB and you know how to handle the Ralink chipset. If not, just buy the
TP-Link 703 and hook it to your Arduino.

[http://www.amazon.com/SANOXY-
USB2-0-Wireless-802-11-Adapter/...](http://www.amazon.com/SANOXY-
USB2-0-Wireless-802-11-Adapter/dp/B004I8B8Z6/ref=sr_1_4?ie=UTF8&qid=1333957173&sr=8-4)

If you want the whole computer instead of just wifi, try this:

[http://www.volumerates.com/product/tp-link-tl-
wr703n-openwrt...](http://www.volumerates.com/product/tp-link-tl-
wr703n-openwrt-compatible-pocket-11n-150m-3g-mobile-wireless-broadband-router
--blue-ship-with-openwrt-pre-installed-upon-customers-request-103048)

OpenWRT preinstalled for your convenience.

~~~
pyrhho
Cool! Just bought a TP-Link WR703N for use as an Arduino wifi bridge (24 USD).
We'll see how it goes. Thanks for the tip!
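For anyone trying the same thing, here is a minimal sketch of one common way to set up the bridge, assuming OpenWRT with a USB-serial kernel module matching the Arduino's serial chip and the ser2net package; the TCP port and baud rate are arbitrary placeholders and must match your sketch:

```shell
# On the WR703N (OpenWRT shell), install USB-serial support and ser2net.
opkg update
opkg install kmod-usb-serial ser2net

# Expose the Arduino's serial port as a raw TCP socket on port 2000
# (classic ser2net.conf syntax: port:state:timeout:device:options).
echo "2000:raw:600:/dev/ttyUSB0:57600 8DATABITS NONE 1STOPBIT" >> /etc/ser2net.conf
/etc/init.d/ser2net restart
```

Anything that can open a TCP connection to the router on port 2000 then talks straight to the Arduino's serial port.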

That Ralink USB dongle looks like a great way to hook the Raspberry Pi up for
internet, too.

------
brazzy
For a fascinating (and also horrifying) view of how truly ubiquitous computing
could look, read "A Deepness in the Sky" by Vernor Vinge (published 1999!).

One plot point is about "smart dust" being used for total surveillance
(everyone's facial expression, heart rate, etc. being monitored for
irregularities at all times) and then subverted via a backdoor thousands of
years old that's activated by the right sequence of body movements...

~~~
VMG
I think you are falling victim to the availability heuristic here [1].

Just because it is easy for you to imagine such a scenario doesn't mean that
it is likely.

1: <http://en.wikipedia.org/wiki/Availability_heuristic>

~~~
Dysiode
One could argue that applies to sci-fi as a whole, but somehow Vinge's vision
of omnipresent dust motes seems more likely (certainly given several hundred
years of cyclical progress, as in the story) than teleporters or even flying
cars. Certainly the conditions in the novel were extremely conducive to that
type of network, but such a ubiquitous, space-filling network could be useful
in plenty of other situations (rescue operations, or something as simple as
adding a puff of motes to a high-traffic area to bolster a network, for
example). Is it likely? Probably not. Certainly not without wireless power,
but it's probably more likely than flying cars as our primary mode of
transportation.

For a more near-future example, Vinge presents the network in Rainbows End,
especially given the work that's been done with wifi meshes. Granted, I
haven't heard any news about the idea lately, but as I recall, Google has one
in Mountain View delivering free wifi to everyone in the area.

~~~
brazzy
In the book, wireless power was provided externally via microwave bursts
(thus requiring the government to be convinced to deliberately deploy it).

I believe this part (or something very much like it) is actually already
feasible today.

------
sbierwagen

      In 1985, the physicist Richard Feynman calculated that the energy 
      efficiency of computers could improve over then-current levels by a 
      factor of at least a hundred billion (1011), and our data indicate that 
      the efficiency of computing devices progressed by only about a factor of 
      40,000 from 1985 to 2009
    

Copy-editing fail. Presumably he wanted 10^11, or a <sup> on the 11, there.

~~~
sparky
CSS fail, unfortunately, which is likely not the author's fault. The included
style.css sets "vertical-align: baseline" on pretty much everything, including
sub and sup.

------
dchichkov
There is a physical limit on the efficiency of computation: the Landauer
limit. Modern chips are already within 3 orders of magnitude of that limit.

So this exponential trend is not going to last long term... a decade at most.

~~~
reitzensteinm
We're way further than 3 orders of magnitude away.

"Theoretically, room‑temperature computer memory operating at the Landauer
limit could be changed at a rate of one billion bits per second with only 2.85
trillionths of a watt of power being expended in the memory media."

That would put a piece of memory changing at 1 Gbps at 2.85 billionths of a
watt of power with current technology.
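The quoted figure follows directly from Landauer's principle, E = kT ln 2 per irreversible bit operation. A quick check of the arithmetic, assuming room temperature of about 298 K:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K
T = 298.15        # room temperature, K

# Landauer limit: minimum energy dissipated per irreversible bit operation
e_bit = k * T * math.log(2)  # joules per bit

# Power needed to change one billion bits per second at that limit
power = e_bit * 1e9  # watts

print(f"{e_bit:.3e} J/bit")  # ~2.85e-21 J
print(f"{power:.3e} W")      # ~2.85e-12 W, i.e. 2.85 trillionths of a watt
```

So the "2.85 trillionths of a watt" in the quote is just kT ln 2 times a billion bit flips per second.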

Past that, modern CPUs are still horrendously inefficient with respect to how
much data they're shuffling around to perform computations, since they're
optimised for getting maximum performance out of a single thread.

A chip operating near the Landauer limit and efficiently computing useful
results on top of that makes our current CPUs look like abacuses.

~~~
dchichkov
From doi:10.1038/nature10872: "From a technological perspective, energy
dissipation per logic operation in present-day silicon-based digital circuits
is about a factor of 1000 greater than the ultimate Landauer limit".

~~~
reitzensteinm
I ran the numbers, and while you're way closer than I was, it still seems to
me it's way more than 1000.

1 billion bits at 2.85 trillionths of a watt

-> 1 billion transistor CPU uses 2.85 trillionths of a watt at 1 Hz

-> 1 billion transistor CPU uses 2.85 thousandths of a watt at 1 GHz

-> 1 billion transistor CPU uses ~1/100 watt at 3.5 GHz

So that puts us a factor of 10,000 away (billion-transistor, 3.5 GHz CPUs use
~100 watts at peak), assuming that all transistors are switching. However,
they're not nearly all switching: most are cache, most are logic that's not in
use at any given moment, etc. Call it 5 orders of magnitude, or 100,000x.

The paper you're referencing seems to ignore leakage current, etc., but I
don't think you'd want to do that for a practical analysis of the field, since
obviously those other factors are going to be improved as well.

Add in how inefficient our modern computation is, in terms of how many bits
we're flipping for the result we're computing, and I stand by my comment:
theoretically optimal computation devices are going to make our current CPUs
look like abacuses. Though we'll have to break out of our x86 and even CPU
model to get there.

------
fauigerzigerk
The question is: what is the economic effect of a given number of
computations? I believe it is decreasing at least as fast as energy efficiency
is increasing, though I don't have any data to back that up.

But if I'm right, and "economic effect" translates roughly into broadly
perceived wealth (of which I'm not sure), then energy efficiency isn't rising
in any meaningful way.

~~~
derleth
> The question is, what is the economic effect of a given number of
> computations?

Which computations? Computations to render another monster on the screen of
someone playing Doom 3? Near-zero. Computations run on a quant's computer to
hint at a market trend? Potentially massive.

It isn't a question you can answer in general. I doubt it's even a question
you can approximate an answer to in any sane fashion.

~~~
fauigerzigerk
I don't think it's that difficult, and not even that different between
industries, because it's all relative to what is possible. Both 3D shooter
games and quant traders have always needed to be in about the same place on
the technology curve in order to succeed economically.

You could compare the hardware requirements of today's best-selling 3D shooter
games to those of the original Doom: one hour's worth of gameplay uses x CPU
instructions and y kWh of energy, and earns the maker of the game z dollars.
You could do something similar to compare today's best quant trading desks to
those of 10 or 15 years ago.

Actually, you could just as well ignore CPU instructions entirely and just
compute dollars earned per kWh for a few computing-heavy industries. Obviously
that's very crude and there are lots of missing variables, but I don't think
it's meaningless.

