

A Future Of Pipes - gwern
http://www.overcomingbias.com/2013/11/the-bright-future-of-pipes.html

======
cscheid
I recently scored a copy of Maxwell's Demon, a book collecting (among
others) Landauer's and Bennett's papers. As I was skimming them, I
couldn't help but wonder: if there is a correspondence between heat and
information erasure, can that correspondence be used to move "heat" in a
fundamentally different way?

The idea would be that you do some computation at point A and the related
information erasure at point B, so the heat associated with the
computation gets "transported" faster than fluid in pipes could carry it.
The hope is that sending the bits along something like an optical network
would beat laminar fluid flow.
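
As a rough sanity check (my own numbers, so take them with a grain of
salt): Landauer's principle puts the minimum heat released per erased bit
at k_B * T * ln 2, which caps how much heat a given bit stream could
"carry" to point B. A quick back-of-the-envelope in Python, assuming a
hypothetical 1 Tbit/s link:

```python
# Back-of-the-envelope: heat "moved" by erasing bits at point B, using
# Landauer's bound of k_B * T * ln(2) joules of heat per erased bit.
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # temperature at the erasure site, K

e_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J per erased bit

bit_rate = 1e12                     # assumed 1 Tbit/s optical link
heat_moved = e_per_bit * bit_rate   # watts delivered at point B

print(f"{e_per_bit:.2e} J/bit -> {heat_moved:.2e} W at 1 Tbit/s")
# ~2.9e-09 W: even a terabit link relocates only nanowatts this way.
```

If that's right, the scheme moves heat quickly but in vanishingly small
quantities compared to what a pipe carries.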

I'm sure HN's more knowledgeable folks will tell me exactly where I'm wrong,
but it seemed like an intriguing possibility.

~~~
tree_of_item
There's a comment on the post by Tim Tyler talking about "digital heat", which
seems to be close to your idea. He's written more about it here:
[http://cell-auto.com/reversible/](http://cell-auto.com/reversible/)

------
beefman
Hanson needs to brush up on his thermodynamics. Reversible computers need not
generate any heat. But they do need to grow larger as they are operated (their
size scales with the number of operations performed). This means SPACE = TIME,
which is a pretty sad state of affairs and probably the reason why all
practical computers and known lifeforms are irreversible machines.
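
A toy sketch of why (mine, using the naive keep-a-history construction,
not the most space-efficient one known): to stay reversible, the machine
can never erase anything, so it must bank enough information to undo
every step, and storage grows with the operation count.

```python
# Toy reversible computation: nothing is ever erased (so no Landauer
# heat), but the history needed to undo each step grows linearly with
# the number of operations performed -- SPACE = TIME.

def reversible_run(x, steps):
    history = []                  # grows by one entry per operation
    for i in range(steps):
        delta = (i % 3) + 1
        x += delta                # forward step
        history.append(delta)     # keep enough info to invert it
    return x, history

def reversible_undo(x, history):
    while history:
        x -= history.pop()        # run the computation backwards
    return x

x, h = reversible_run(0, 1000)
print(len(h))                     # 1000: storage scales with step count
print(reversible_undo(x, h))      # 0: fully unwound, nothing erased
```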

~~~
pyre
> all practical [...] known lifeforms are irreversible machines.

Not necessarily true. There is that 'immortal' jellyfish that basically
reverts back to a polyp and then grows again.

[http://en.wikipedia.org/wiki/Turritopsis_dohrnii](http://en.wikipedia.org/wiki/Turritopsis_dohrnii)

~~~
beefman
That's an interesting animal, but its unusual life cycle in no way negates the
fact that its metabolism is irreversible (both at the cellular and ensemble
level).

------
mtdewcmu
It's not a good bet that data centers will need to build out rapidly and
indefinitely to keep up with demand. A lot of the work in servers in
recent years has been about improving utilization of available resources,
and I expect there is still a lot of headroom for improvement. It's
probably safe to say that across all data centers, the vast majority of
heat is generated while servers are mostly idle, with too little to do.

I don't think that it's safe to anticipate reaching some kind of fundamental
limit on power efficiency, either. Nobody has come up with a viable
replacement for the silicon transistor, despite decades of trying, and current
CPU technology has practically stopped generating any improvements for the
past several years. I think the important quantity to measure is the
energy that goes into powering the CPU for one clock cycle, because clock
cycles are what drive computation forward, and their individual energy
costs basically just add up. For decades after the invention of the
transistor, this energy dropped rapidly as the wires shrank. However, the
wires reached a point where their capacitance stopped falling with
decreasing size, and there have been no significant energy improvements
since. Solving that problem
requires either a revolutionary discovery about the physics of conductivity, a
completely novel mode of doing computation that isn't based on electronics
(photons?), or a fundamentally novel principle of organizing a CPU that
somehow avoids being clock-bound (I heard about an effort over ten years ago,
but I have not heard of it since). Any of those improvements could have
surfaced at any time in the past few decades and been eagerly pursued, but
none did, so the odds don't favor the miraculous appearance of one of them in
the foreseeable future.
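
To put the stalled quantity in first-order terms (numbers below are
illustrative, not measurements): the dynamic energy of a clock cycle is
roughly the switched capacitance times the supply voltage squared, and
both factors have stopped shrinking.

```python
# First-order CMOS dynamic-energy model: each clock cycle dissipates
# roughly C_eff * V_dd^2 across the chip's switched capacitance.
# All values here are illustrative assumptions.

C_eff = 1e-9     # effective switched capacitance per cycle, farads
V_dd = 1.0       # supply voltage, volts (largely flat for years)
f_clk = 3e9      # clock frequency, Hz

e_cycle = C_eff * V_dd**2       # joules per clock cycle
p_dynamic = e_cycle * f_clk     # watts of dynamic power

print(f"{e_cycle:.1e} J/cycle, {p_dynamic:.0f} W dynamic")
# 1.0e-09 J/cycle -> 3 W; halving V_dd would cut energy 4x, which is
# exactly the knob that stopped turning.
```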

~~~
sigstoat
> a brand new model of computation that somehow avoids being clock-bound

like asynchronous circuits?

CPUs based on them have been around, and more can certainly be done. they're
just a lot more complicated.
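
for flavor, here's a sketch (mine, not from any shipping design) of the
muller c-element, the classic asynchronous-logic primitive: its output
changes only when all inputs agree, so completion signals, not a clock,
pace the circuit.

```python
# Muller C-element sketch: output adopts the inputs' value when they
# agree and holds its previous state otherwise. Chains of these form
# the handshakes that let async circuits run without a global clock.

class CElement:
    def __init__(self):
        self.out = 0

    def step(self, a, b):
        if a == b:          # inputs agree: adopt their common value
            self.out = a
        return self.out     # otherwise hold the previous output

c = CElement()
for a, b in [(0, 0), (1, 0), (1, 1), (0, 1), (0, 0)]:
    print(a, b, c.step(a, b))  # output flips only on (1,1) and (0,0)
```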

~~~
mtdewcmu
The fact that the idea has been around a while and not gone anywhere is a
pretty good indication that it's a technological dead end.

~~~
kryptiskt
It says nothing. OLEDs are 50 years old now, capacitive touch likewise.
Electric cars were around 120 years ago.

~~~
mtdewcmu
What you're saying is that it's impossible to accurately predict the future. I
agree. But you can make informed judgments about what's likely.

------
femto
Is it "SPACE = TIME", or is the limitation actually set by the surface area of
the volume enclosing the machine? In space-time coding for communications, the
limitation on achievable spectral efficiency is related to the surface area of
the volume enclosing the antenna array, in units of wavelength. It would
seem sensible if this were reflected in a fundamental property of
information.

Maybe a fundamental limit of the universe is that the minimum surface area of
a volume containing a subatomic structure is related to the information
required to define that structure? My maths is a bit fuzzy, but isn't there a
fundamental relationship between a volume and its boundary, involving a Fourier
Transform? That could then mean that a limitation on surface area is
equivalent to a limitation of the (4-dimensional?) volume occupied by some
other quantity?

The above is speculation on my part, but is there an HN reader out there
who can say whether such thoughts are garbage, or whether they have some
basis?

~~~
falcor84
I think that the relationship you're referring to is the holographic principle
-
[https://en.wikipedia.org/wiki/Holographic_principle](https://en.wikipedia.org/wiki/Holographic_principle)

My physics abilities are far from what I'd want them to be, but it does
sound to me like there may be some insight in what you're saying.
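
The quantitative version of that intuition (writing it from memory, so
worth double-checking) is the Bekenstein-Hawking bound from black-hole
thermodynamics: the maximum entropy of a region scales with its boundary
area, not its volume,

\[ S_{\max} \;=\; \frac{k_B\, c^3\, A}{4\, G\, \hbar} \;=\; \frac{k_B\, A}{4\, \ell_P^2}, \qquad \ell_P^2 = \frac{G \hbar}{c^3}, \]

i.e. one quarter of the area in Planck units, which matches the idea that
the information a volume can hold is set by its surface.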

------
michaelwww
> _Thus in future cities crammed with computer hardware, roughly half of the
> volume is likely to be taken up by pipes that move cooling fluids in and
> out_

A lot of this post reads like nonsense.

~~~
mtdewcmu
Presumably those would be domed cities like you see in futuristic predictions
from the 1960s. That's why you can measure their volume and predict the
percentage that will be dedicated to pipes to carry waste heat from computers.

------
csense
The author assumes the demand for CPU cycles is elastic in the long term (i.e.
as prices of cycles drop, people will continue to use more and more of them).

General-purpose desktops and laptops today often don't come close to using all
of the CPU horsepower available to them, and many server applications are
limited by memory, network bandwidth, and storage.

It seems conceivable to me that at some point, before reaching thermodynamic
limits, CPU cycles will stop being scarce for many application spaces, and
future computing investments will mainly be in networking, memory, storage and
software.

------
mrcactu5
> The future of computing, after about 2035, is adiabatic reversible hardware.
> When such hardware runs at a cost-minimizing speed, half of the total budget
> is spent on computer hardware, and the other half is spent on energy and
> cooling for that hardware. Thus after 2035 or so, about as much will be
> spent on computer hardware and a physical space to place it as will be spent
> on hardware and space for systems to generate and transport energy into the
> computers, and to absorb and transport heat away from those computers.

The computer almost sounds alive...

~~~
lisper
Not almost. It is alive. The only fundamental difference between computers and
"life as we know it" is that the computer's genetics are encoded on magnetic
media rather than DNA.

------
walshemj
Didn't Seymour Cray say that HPC was mostly about plumbing - i.e.,
managing the heat?

Hmm, maybe I ought to dust off those old steam tables, eh.

------
tmallen
There's a long way to go before building permanent structures on the Moon is a
possibility (meteoroid detection/destruction, understanding seismic activity,
etc.), but its far side would be a great place for building data centers.

~~~
ndonnellan
Space is a terrible heat transfer fluid.

~~~
brudgers
But a top notch heat sink.

~~~
gcr
Not really. (Disclaimer: I'm not an expert at all, so much of what I say
may be wrong.)

Consider your CPU's heat sink. The CPU gets hot, but it can transfer that heat
to your metallic heat sink (thermal conduction?), which in turn can dissipate
that heat to its surrounding air (thermal convection?).

If you're in space, the only way to get rid of that heat is by radiating
it away as light (thermal radiation, the same way metal glows white-hot),
but that's much less efficient. You can't hand the heat off to another
mass, because there isn't any other mass in a vacuum.
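
Rough numbers (mine, and simplified -- ignoring sunlight and view
factors): radiative output follows the Stefan-Boltzmann law,
P = e * sigma * A * T^4, so a room-temperature radiator sheds only a few
hundred watts per square meter.

```python
# Stefan-Boltzmann estimate of radiative cooling into vacuum (ignores
# sunlight, view factors, and the ~3 K background you radiate against).

sigma = 5.670374419e-8    # Stefan-Boltzmann constant, W/(m^2 K^4)
emissivity = 0.9          # assumed high-emissivity radiator coating
T = 300.0                 # radiator temperature, K

flux = emissivity * sigma * T**4   # W per m^2 of radiator surface
print(f"{flux:.0f} W/m^2")         # ~413 W/m^2 at room temperature

# A 1 MW data center at this temperature would need on the order of
# 1e6 / 413 ~ 2400 m^2 of radiator area.
```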

~~~
sigstoat
i think there is a bit of a terminology issue.

to give brudgers as much credit as possible, space is a pretty good heat sink,
in that it can absorb all of the heat you can ever produce, without changing
temperature. (and since heat transfer is proportional to temperature
difference, your transfer rate will never drop because of it.)

but it is terrible for heat transfer.

the metal widgets we stick onto our CPUs aren't really heat sinks;
they're for reducing the thermal resistance between the CPU body and the
atmosphere, which is the "final" destination for the heat.

