Hacker News
Von Neumann-Landauer limit (wikipedia.org)
183 points by dedalus on Sept 30, 2017 | hide | past | web | favorite | 36 comments

The usual extropian retort to this limit has been "reversible computing," but as far as I can tell there has been little work on reversible computing hardware -- far less than quantum computing. It looks like the University of Florida made some hardware in the early 2000s. Can anyone offer insights about why hardware research is so scarce here? Are there reasons to think it's a dead end prior to fabricating anything?


EDIT: searching Google Scholar, it looks like only ~50 papers/books mention reversible computing in the past year, compared to 1000+ for quantum computing.

Just my guess, but the practical motivation is probably not there -- we are so far from the levels of efficiency at which the Landauer principle is noticeable.

Oh it's worth noting that quantum computing research is reversible computing research. Quantum operations are unitary operators which necessarily have inverses. Famous quantum computing results like the "no cloning" theorem are basically just consequences of the reversibility of QC.
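A tiny NumPy sketch of that point (my own illustration, not from the article): the CNOT gate is unitary, so its inverse is guaranteed to exist; in fact it is its own inverse.

```python
import numpy as np

# CNOT (controlled-NOT) on two qubits, written as a 4x4 matrix.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Unitarity: U† U = I, so the inverse U⁻¹ = U† always exists.
assert np.allclose(CNOT.conj().T @ CNOT, np.eye(4))

# CNOT happens to be its own inverse: applying it twice is the identity.
assert np.allclose(CNOT @ CNOT, np.eye(4))
```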

I suspect any non-trivial reversible computer will be a quantum computer. Quantum gates can perform almost any reversible operation and losing information (by not being reversible) might be physically equivalent to decoherence.

Reversible computation is inherently slow computation, and its main advantage is that it lets you get below this limit; but current computers are still many (dozens?) of orders of magnitude away from reaching it. Until then, you'd be better off just improving the efficiency of non-reversible computers.

That wiki page says "Although the Landauer limit was millions of times below the energy consumption of computers in the 2000s and thousands of times less in the 2010s,[3] proponents of reversible computing argue that this can be attributed largely to architectural overheads which effectively magnify the impact of Landauer's limit in practical circuit designs, so that it may prove difficult for practical technology to progress very far beyond current levels of energy efficiency if reversible computing principles are not used.[4]"

On a skim it looks like [4]'s argument is that people have been criticizing the overhead of fully reversible circuits, but only 'conditionally reversible' circuits are needed for the power benefits.

Based on this statement there is also another possible approach: remove the architectural overheads that magnify the impact of Landauer's limit. So he is probably right that there are other, easier improvements we can make before worrying about Landauer's limit.

Wow, have we really improved that much in efficiency? I learned about this from the Feynman Lectures on Computation, which are perhaps a little out of date.

Adding: I looked it up, and Feynman quotes 10^8 times as much energy dissipated per step as the Von Neumann-Landauer limit when using the transistors of his era (1984-86 is when he taught the course).
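For scale, a quick back-of-the-envelope calculation of Landauer's kT ln 2 bound (assuming room temperature, 300 K, which is my assumption; Feynman's 10^8 factor is then applied on top):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Landauer's bound: kT ln 2 joules to erase one bit (~2.87e-21 J at 300 K).
landauer = k_B * T * math.log(2)

# Feynman's figure for mid-1980s transistors: ~10^8 times the bound per step.
per_step_1985 = 1e8 * landauer  # ~2.9e-13 J per switching event
```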

I remember talking to someone who had worked on reversible computing about 13 or 14 years ago, and he said that (i) it does work (presumably that it is indeed more energetically efficient), but (ii) the state of computing was such that it might be a couple of decades before it could make a real impact (*). Then he added, "someone else can have that fun." Around that time he switched to working on other things.

(*) Unfortunately I can no longer recall exactly what he meant.

To make a circuit reversible generally requires more gates, because you need a place to store the information that you normally might have gotten rid of (say in an AND gate you simply store one of the variables). Practically this would make chips much slower if you assume everything else in fabrication stays the same (e.g., die size). Not a whole lot of benefits it seems, except for energy consumption, but that's the consumer's problem.
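One standard construction here is the Toffoli (CCNOT) gate, which computes AND into an ancilla bit while carrying both inputs through, so no information is destroyed. A classical-bit sketch (my own illustration, not from the comment):

```python
# Toffoli (CCNOT): (a, b, c) -> (a, b, c XOR (a AND b)).
def toffoli(a, b, c):
    return a, b, c ^ (a & b)

# With the ancilla c initialized to 0, the third output is a AND b,
# and the inputs are preserved, so the operation is reversible.
for a in (0, 1):
    for b in (0, 1):
        out = toffoli(a, b, 0)
        assert out == (a, b, a & b)
        # Toffoli is its own inverse: applying it again restores (a, b, 0).
        assert toffoli(*out) == (a, b, 0)
```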

Reversible computing still requires energy (or rather neg-entropy). It's used to perform error correction. Otherwise the computer won't last very long.

It’s also used to offload bits. Real computers don’t have infinite tape.

I picked up The Information by James Gleick when I was visiting the Living Computers Museum + Labs in Seattle[1]. Really interesting book... talks about the development of language, the birth of computing, the concept of randomness, the physics of information, etc. A fascinating and comprehensive overview of the topic.

[1] This place is awesome and you should check it out if you're in the area: http://www.livingcomputers.org

I remember reading Feynman's thought experiment and him deciding that it doesn't require any energy to flip a bit.[1]

How does that square with Landauer's limit?

[1] Apologies for the lack of reference. I'll try and find it.

Basically, it costs free energy to forget a bit. If you flip a bit, that doesn't mean you forget it, as long as you remember you'll have to flip it back (or reinterpret it).

("Basically" because there are whole books on the subtleties of physics and information. I haven't mastered them.)

This is the correct answer. If you don't overwrite a bit, but instead move the old value to an auxiliary bit known to be zero, that can be theoretically done for free.
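A classical sketch of that move using CNOT-style updates (my own illustration, assuming the auxiliary bit is known to start at 0):

```python
# CNOT on classical bits: (src, dst) -> (src, dst XOR src).
def cnot(src, dst):
    return src, dst ^ src

for x in (0, 1):
    # Copy the bit into an ancilla known to be 0...
    src, anc = cnot(x, 0)        # ancilla now holds x; src is unchanged
    # ...then clear the original with a second CNOT. Nothing is erased:
    anc, src = cnot(anc, src)    # src XOR src = 0
    assert (src, anc) == (0, x)  # the value has moved, reversibly
```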

From what I understand, flipping a bit isn't necessarily subject to this bound; however, setting it to either 1 or 0 is.

Of course, if you can flip it without expending energy, then reading it and flipping it if it's 1 (i.e., resetting it) necessarily requires the Landauer limit's worth of energy.

I’m not sure how it squares with physics. I’d recommend finding that reference.

Well, perhaps it's only a theoretical minimum.

It agrees with kT ln 2 when T=0 right?

T is never 0.

Yeah I know :-)

But OK, here's a reference of sorts. Not by Feynman but by Bennett and Landauer. https://www.scientificamerican.com/article/the-fundamental-p...

"there is no minimum amount of energy that must be expended in order to run a Brownian clockwork Turing machine."

Also interesting but I didn't find the exact thing I was looking for:

Simulating Physics with Computers Richard P. Feynman https://people.eecs.berkeley.edu/~christos/classics/Feynman....

Richard Feynman and Computation https://cds.cern.ch/record/411350/files/p101.pdf

Well for example, T cannot be less than the temperature of the cosmic microwave background. You can make regions with lower T, but only by pumping heat out in some way, which is more energy to do.

Being on mobile and not able to explore in depth, that quote sounds like a variant of Maxwell’s demon. It is correct to say that the Landauer limit is not due to a single physical law that must hold true, but rather a lack of knowledge about the state of the universe and the fact that acquiring that knowledge to do a “free” bitflip requires at least equivalent energy expenditure as that bitflip. TANSTAAFL.

I believe there is some breakthrough yet to be found in the intersection of entropic gravity and computational limits.

I’d be curious to hear more about that idea, and thank you for introducing me to this theory of entropic gravity.

Me too - any links or articles on the best material would be much appreciated.

Having read some more about this entropic gravity theory, it does have quite a few pretty strong criticisms, detailed on the wiki page.

As interesting as it is to think about the limit, it's so pointless from a practical point of view since the cost of computation is dwarfed by the cost of data movement in modern machines.

As inspiring as it is thought-provoking.

Yes. I imagine the last probe doing its last bit flip in the heat death of the universe. Then nothing can be computed anymore, and all that will happen is the decay of the probe's matter. So sad.

I get the feeling that something like that was the plot of a short story I read ages ago.

Possibly qntm.

Recently mentioned in Venkat Rao's Breaking Smart newsletter.

So you're telling me ignorance can combat global warming?
