The Computing Trend that Will Change Everything (technologyreview.in)
65 points by Brajeshwar on Apr 9, 2012 | 24 comments



Ubiquitous wireless networking does more to create a nano-sensors-everywhere future than cheap supercomputing does. Microcontrollers have had more than enough computing power to tell me the temperature in the back yard on battery power for decades now. It was sending the data anywhere useful that was hard.

As little as 2 years ago, $50-$75 was about the cheapest I could put a project on the internet, wired or wireless. Today it's about $5. You can buy the whole little computer in the form of a TP-Link router running OpenWRT for about $22.

I have seen the future and it's made of wifi.


> As little as 2 years ago, $50-$75 was about the cheapest I could put a project on the internet, wired or wireless. Today it's about $5.

How do you do it for $5 (or, if that was an exaggeration, at the cheapest possible cost)? I would like to do this.


Here you go. How about $1.46? This assumes, of course, that your project has USB and you know how to handle the Ralink chipset. If not, just buy the TP-Link 703 and hook it to your Arduino.

http://www.amazon.com/SANOXY-USB2-0-Wireless-802-11-Adapter/...

If you want the whole computer instead of just wifi, try this:

http://www.volumerates.com/product/tp-link-tl-wr703n-openwrt...

OpenWRT preinstalled for your convenience.
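
For a flavor of how the router-plus-Arduino bridge works in practice, here's a minimal Python sketch of the idea, assuming pyserial and requests are available on the router; the device path and the URL below are made-up placeholders, not a definitive setup:

  # Illustrative only: read sensor lines from the Arduino's USB serial
  # port on the router and POST each one to a (hypothetical) server.
  import serial    # pyserial
  import requests

  port = serial.Serial("/dev/ttyUSB0", 9600, timeout=10)  # placeholder path
  while True:
      line = port.readline().decode("ascii", errors="ignore").strip()
      if line:
          # http://example.com/readings is a placeholder endpoint
          requests.post("http://example.com/readings", data={"value": line})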


Cool! Just bought a tp-link wr703n for use as an Arduino wifi bridge (24 USD). We'll see how it goes. Thanks for the tip!

That Ralink USB dongle looks like a great way to hook the Raspberry Pi up for internet, too.


For a fascinating (and also horrifying) view of what truly ubiquitous computing could look like, read "A Deepness in the Sky" by Vernor Vinge (published 1999!).

One plot point is about "smart dust" being used for total surveillance (everyone's facial expression, heart rate, etc. being monitored for irregularities at all times) and then subverted via a backdoor thousands of years old that's activated by the right sequence of body movements...


I think you are falling victim to the availability heuristic here[1].

Just because it is easy for you to imagine such a scenario doesn't mean that it is likely.

1: http://en.wikipedia.org/wiki/Availability_heuristic


Where did I say that anything in particular is likely?

If the "horrifying...total surveillance" remark put you off, that was not at all meant to imply "ubiquitous networking must be bad because of this story, so let's prohibit it!".

In fact, in the story the smart dust, via the backdoor, provides the ability for the hero to break the (very, very nasty) oppressive regime.


One could argue that applies to sci-fi as a whole, but somehow Vinge's vision of omnipresent dust motes seems more likely (certainly given several hundred years of cyclical progress, as in the story) than teleporters or even flying cars. Certainly the conditions in the novel were extremely conducive to that type of network, but such a ubiquitous, space-filling network could be useful in plenty of other situations (rescue operations, or something as simple as adding a puff of motes to a high-traffic area to bolster a network, for example). Is it likely? Probably not. Certainly not without wireless power, but it's probably more likely than flying cars as our primary mode of transportation.

For a nearer-future example, Vinge presents the network in Rainbows End, especially relevant given the work that's been done with wifi meshes. Granted, I haven't heard any news about the idea lately, but as I recall, Google has one in Mountain View delivering free wifi to everyone in the area.


In the book, wireless power was provided externally via microwave bursts (thus requiring that the government be convinced to deliberately deploy it).

I believe this part (or something very much like it) is actually already feasible today.


But still, it's a fantastic book :)


See also: Vinge's Rainbows End for a near future example of ubiquitous networking (amongst other likely visions).


  In 1985, the physicist Richard Feynman calculated that the energy 
  efficiency of computers could improve over then-current levels by a 
  factor of at least a hundred billion (1011), and our data indicate that 
  the efficiency of computing devices progressed by only about a factor of 
  40,000 from 1985 to 2009
Copy-editing fail. Presumably he wanted 10^11, or a <sup> on the 11, there.


CSS fail, unfortunately, which is likely not the author's fault. The included style.css sets "vertical-align: baseline" on pretty much everything, including sub and sup.


There is a physical limit on the efficiency of computation: the Landauer limit. Modern chips are already within 3 orders of magnitude of that limit.

So this exponential trend is not gonna last long term... A decade at most.


We're way further than 3 orders of magnitude away.

"Theoretically, room‑temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media."

That would put a piece of memory changing at 1 Gbps at 2.85 billionths of a watt of power with current technology, if we were really only 3 orders of magnitude away.

Past that, modern CPUs are still horrendously inefficient with respect to how much data they're shuffling around to perform computations, since they're optimised for getting maximum performance out of a single thread.

A chip operating near the Landauer limit, and efficiently computing useful results on top of that, would make our current CPUs look like abacuses.


From doi:10.1038/nature10872: "From a technological perspective, energy dissipation per logic operation in present-day silicon-based digital circuits is about a factor of 1000 greater than the ultimate Landauer limit".


I ran the numbers, and while you're way closer than I was, it still seems to me it's way more than 1000.

1 billion bits per second at 2.85 trillionths of a watt

-> a 1-billion-transistor CPU uses 2.85 trillionths of a watt at 1 Hz

-> a 1-billion-transistor CPU uses 2.85 thousandths of a watt at 1 GHz

-> a 1-billion-transistor CPU uses ~1/100 of a watt at 3.5 GHz

So that puts us at a factor of 10,000 away (billion-transistor, 3.5 GHz CPUs use ~100 watts at peak), assuming that all transistors switch every cycle. However, they don't: most are cache, most are logic that's not in use at any given moment, etc. Call it 5 orders of magnitude, or 100,000x.
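
Here's the same arithmetic as a quick Python sanity check (it assumes, as above, that all billion transistors flip once every clock cycle, which real chips don't do):

  # Back-of-the-envelope check of the factor-of-10,000 estimate.
  import math

  k_B = 1.380649e-23                # Boltzmann constant, J/K
  T = 298.0                         # room temperature, K
  e_bit = k_B * T * math.log(2)     # Landauer limit: ~2.85e-21 J per bit

  transistors = 1e9
  clock_hz = 3.5e9
  floor_watts = e_bit * transistors * clock_hz
  print(floor_watts)                # ~0.01 W
  print(100.0 / floor_watts)        # ~10,000x gap vs a 100 W CPU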

The paper you're referencing seems to ignore leakage current, etc., but I don't think you'd want to do that for a practical analysis of the field, since obviously those other factors are going to improve as well.

Add in how inefficient our modern computation is, in terms of how many bits we're flipping around for the result we're computing, and I stand by my comment: theoretically optimal computation devices are going to make our current CPUs look like abacuses. Though we'll have to break out of our x86 and even CPU model to get there.


True, but right now programmers often aren't even using what we have available to us very efficiently. There are a few (a select few) who are making supercomputing systems more efficient, but by and large we have a long, long way to go.

Furthermore, the things that have 'changed everything' in the last few years are all user-interface related. As we get better at human speech recognition and synthesis, that will be when things truly are wholly changed.

However, one thing I do have to ask those who know better than I do: does the Landauer limit imply that we won't be able to optimize any further, at a certain point, in terms of hardware capability? And would this include optimizing for size? In other words, if we hit that limit with something the size of Watson, could we reasonably still expect it to reach the size of a cell phone half a century after that?


From the wiki entry: "Theoretically, room‑temperature computer memory operating at the Landauer limit could be changed at a rate of one billion bits per second with only 2.85 trillionths of a watt of power being expended in the memory media."

Do we really have chips that update a billion bits per second for only 2.85 billionths of a watt?

Also, reversible computing would take things much further without violating Landauer.
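
To illustrate why reversible computing sidesteps Landauer: no information is ever erased, so the per-bit erasure cost never applies. A toy Python example using the Toffoli (controlled-controlled-NOT) gate, which computes AND while remaining invertible:

  # The Toffoli gate flips c iff a and b are both 1. No inputs are
  # discarded, so it can be run backwards; in fact it is its own inverse.
  def toffoli(a, b, c):
      return a, b, c ^ (a & b)

  assert toffoli(*toffoli(1, 1, 0)) == (1, 1, 0)   # self-inverse
  assert toffoli(1, 1, 0)[2] == 1                  # AND(1, 1) when c = 0
  assert toffoli(1, 0, 0)[2] == 0                  # AND(1, 0) when c = 0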


You are right, reversible and quantum computing could take things further.


Even just a decade of this exponential trend might very well "change everything" as the title claims.


The question is: what is the economic effect of a given number of computations? I believe it is decreasing at least as fast as energy efficiency is increasing. I don't have any data to back that up, though.

But if I'm right, and "economic effect" translates roughly into broadly perceived wealth (which I'm not sure it does), then energy efficiency isn't rising in any meaningful way.


> The question is, what is the economic effect of a given number of computations?

Which computations? Computations to render another monster on the screen of someone playing Doom 3? Near-zero. Computations run on a quant's computer to hint at a market trend? Potentially massive.

It isn't a question you can answer in general. I doubt it's even a question you can approximate an answer to in any sane fashion.


I don't think it's that difficult, and it's not even that different between industries, because it's all relative to what is possible. Both 3D shooter games and quant traders have always needed to be in about the same place on the technology curve in order to succeed economically.

You could compare the hardware requirements of today's best-selling 3D shooter games to the hardware requirements of the original Doom. One hour's worth of gameplay uses x CPU instructions and y kWh of energy and earns the maker of the game z dollars. You could do something similar to compare today's best quant trading desks to those of 10 or 15 years ago.

Actually, you could just as well ignore CPU instructions entirely and just compute dollars earned per kWh for a few computing-heavy industries. Obviously that's very crude and there are lots of missing variables, but I don't think it's meaningless.
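
For instance, a crude Python sketch of that dollars-per-kWh comparison; every number in it is an invented placeholder, not real data:

  # Crude dollars-earned-per-kWh comparison; all figures are made up.
  def dollars_per_kwh(revenue_usd, watts, hours):
      return revenue_usd / (watts * hours / 1000.0)

  # hypothetical: Doom (1993) on a 60 W PC vs a modern shooter on a 350 W rig
  print(dollars_per_kwh(3.0, 60, 1.0))    # ~$50 earned per kWh
  print(dollars_per_kwh(1.0, 350, 1.0))   # ~$2.90 earned per kWh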





