
Are “reversible” computers more energy efficient, faster? (2004) - jonbaer
https://www.eetimes.com/are-reversible-computers-more-energy-efficient-faster/
======
ahelwer
If anyone is interested in an example of a reversible operation, the simplest
is logical negation of a single bit. The second-simplest is CNOT, or
controlled-not, where one bit is the "control" bit and another bit is the
"target" bit. If the control bit is 1, the target bit is flipped. If the
control bit is 0, the target bit is unchanged. The control bit is always left
unchanged.

A very important reversible gate is the CCNOT gate (also called a Toffoli
gate) which has two control bits that only flip the target bit if both are 1.
This is a universal gate: any reversible logical operation can be constructed
from a series of Toffoli gates.
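A quick way to see these gates concretely is to model them on plain bits. This is a minimal Python sketch (the function names are just illustrative); note that each gate returns as many bits as it takes in, so no information is discarded:

```python
def notg(a):
    """Logical negation: the simplest reversible operation."""
    return (1 - a,)

def cnot(control, target):
    """Flip target iff control is 1; the control passes through unchanged."""
    return (control, target ^ control)

def ccnot(c1, c2, target):
    """Toffoli (CCNOT): flip target iff both control bits are 1."""
    return (c1, c2, target ^ (c1 & c2))

# Universality example: with the target initialized to 0, the Toffoli
# gate computes AND of the two controls (result in the third bit).
assert ccnot(1, 1, 0) == (1, 1, 1)
assert ccnot(1, 0, 0) == (1, 0, 0)

# Each of these gates is its own inverse: applying it twice restores the input.
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)
```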

The article refers to reversible computers as "quantum computing's practical
cousin" because all operations on a quantum computer are reversible (except
for measurement). You may recall the recent pop science headline "scientists
reverse time using a quantum computer"[1], which was a hideous mangling of
this concept.

[1] [https://phys.org/news/2019-03-physicists-reverse-quantum.html](https://phys.org/news/2019-03-physicists-reverse-quantum.html)

~~~
JadeNB
> The second-simplest is CNOT, or controlled-not, where one bit is the
> "control" bit and another bit is the "target" bit. If the control bit is 1,
> the target bit is flipped. If the control bit is 0, the target bit is
> unchanged. The control bit is always left unchanged.

Isn't this just XOR?

~~~
im3w1l
No, because it takes two inputs and gives two outputs. One input is passed
through unchanged, and the other output is indeed the XOR of the two inputs.

~~~
saagarjha
And hence it is reversible, since you can get the inputs back from that
output!

~~~
sukilot
In normal human arithmetic, it's equivalent to saying something like "7 + 5 =
12" and remembering the "+5" so you can undo it later.
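That analogy can be written as a toy "reversible adder" (hypothetical helper names) that carries the addend along with the sum, so the original inputs are always recoverable:

```python
def rev_add(a, b):
    # Keep the second operand alongside the sum: (sum, remembered addend).
    return (a + b, b)

def rev_sub(s, b):
    # Undo the addition by subtracting the remembered addend.
    return (s - b, b)

assert rev_add(7, 5) == (12, 5)
assert rev_sub(*rev_add(7, 5)) == (7, 5)  # the "+5" lets us rewind
```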

------
sgdpk
The principle the article is implicitly talking about is Landauer's principle
[1]. It states that whenever information is erased in a system, a
corresponding minimum amount of energy must be dissipated as heat.

[1]
[https://en.wikipedia.org/wiki/Landauer%27s_principle](https://en.wikipedia.org/wiki/Landauer%27s_principle)
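For a sense of scale, the Landauer bound on the heat dissipated per erased bit is kT·ln 2. A quick back-of-the-envelope computation at room temperature:

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0         # room temperature, kelvin

# Minimum energy dissipated per bit of information erased.
e_per_bit = k * T * math.log(2)
print(f"{e_per_bit:.3e} J per erased bit")  # on the order of 3e-21 J
```

That is roughly twenty orders of magnitude below what today's CMOS logic dissipates per switching event, which is why the bound is of theoretical rather than immediate practical interest.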

~~~
peter_d_sherman
This is interesting!

The idea that in Physics, Heat = Information, that is, an identity or
potential identity.

Now, I'm currently 50/50 on that idea, that is, that Heat = Information (and
vice versa).

I'd love to see something which conclusively proves this, with say, the rigor
of a mathematical or logical proof.

Also, if Heat = Information, then we need some way of accounting for what
happens when a system cools, that is, how does the information (again, without
being lost) propagate into the environment, such that when an object cools, no
information is lost?

As I said, I'm 50/50 on this idea... I'd like to see a really good proof. The
best proof would be short, logical, and highly consistent... Prove to me that
no information is lost, or are we just accepting that because heat physically
diffuses and no heat is lost, that no information is lost either?

That would be both a simple and logical explanation, but then we'd have to
search the entire universe for no physical anomaly (to be sure that we're
absolutely correct), that is, there's no exceptional case where heat in a
system was lost or gained without exchange with an external system... which is
quite the search, let me tell you!

See, if we think that heat is never lost in the universe, we got that idea
from somewhere... it may have been one of the first axioms given to us in an
elementary Physics class... but where does it come from, and why do we never
question it?

Help me to understand...

~~~
peter_d_sherman
Addendum:

OK, here's my problem... Soil temperature. That is, when you dig more than a
few feet into the ground, what you'll find is a consistent ground temperature,
no matter what part of the world you're in, no matter how hot or how cold the
climate above is.

See, if there is conservation of heat, and the center of our planet is as hot
as the sun -- then why doesn't/hasn't this heat diffused out across the entire
interior of the planet?

If you heat up soil that's more than a few feet down, it's going to cool back
down, and if you cool it, it's going to heat back up.

Sure, you could argue that this heat is moved and diffused into the adjacent
ground/soil, but are we really sure that there isn't some other kind of
phenomena going on there?

Objects get very cold in outer space... near absolute zero, yet they get
extremely hot when the sun's rays, unshielded by the earth's atmosphere, touch
them.

How do objects cool down in space, if by definition, space is a vacuum, and
vacuum = thermal insulation?

In other words, I have a Thermos(tm) with a vacuum chamber inside it on one
hand, and the vacuum of outer space on the other.

The Thermos keeps my cold beverages cold, and my hot beverages hot. It does
this by preventing thermal dissipation. It does this by putting a vacuum
between the interior and exterior containers.

In outer space, on the other hand, if sunlight is blocked for whatever reason,
temperatures of objects get very, very cold. Where does the heat (aka,
information) go, if space is a vacuum?

You see, the paradoxes with respect to vacuums, and thus heat transfer in
vacuums, and thus Heat = Information abound...

Perhaps they would be solved if heat could travel through the vacuum (radiant
energy?) in an effect much like Quantum Tunneling -- just the outer space /
heat -- version of that?

I'm not an expert in any of these areas... Can anyone help me out?

~~~
dr_zoidberg
Heat is radiated as infrared light in a vacuum, and it's an extremely slow
process compared to convection and conduction. Read about black body
radiation[0].

[0] [https://en.wikipedia.org/wiki/Black-body_radiation](https://en.wikipedia.org/wiki/Black-body_radiation)

~~~
peter_d_sherman
Black body radiation is interesting and does make sense, but then it brings up
another question, which is:

"If there is black body radiation everywhere in space -- then why doesn't
space itself heat up until it's eventually the temperature of the Sun?"

Maybe objects in space, such as planets, act as "heat drains" for the rest of
space... that is, all of the heat (information) in space eventually winds up
at a physical planet, or other object in space...

If this is true, then this leads to an interesting identity, which is
basically that every element, that is, everything on the periodic table from
hydrogen on downward, is not just an element, not just matter, but also has an
identity in terms of heat/information...

In other words, every single element (and by extension substance) is what it
is, but could also be thought about simultaneously as heat, and simultaneously
as information.

Which brings up endothermic/exothermic chemical reactions.

That is, chemical reactions between two or more elements (for example,
hydrogen and oxygen combining to produce water) where heat is produced
(combining them via fire), or where heat is consumed (note that the separation
of water into hydrogen and oxygen via electrolysis, as far as I know, is
neither endothermic nor exothermic -- so where did the information go? Into
the electricity? It's possible, we can't rule out that possibility...)

And then the next question... Do chemical elements combined as a compound
(such as water) then store the information as heat from the reaction somehow,
or does it somehow dissipate, does that information somehow come back when
they are separated?

I don't know the answer to any of these questions; I'm just thinking aloud,
but my intuitive mind says that _there may be something to discover there_...

~~~
goldenkey
Yeah information <=> energy. Change of state is equivalent mathematically to
movement in a "dimension." Action in physics is energy times time. Really just
a way to measure the total number of absolute changes that have occurred. All
energy causes change, it is all linked to entropy and information, it is
likely zero in total... [https://en.wikipedia.org/wiki/Zero-energy_universe](https://en.wikipedia.org/wiki/Zero-energy_universe)

~~~
peter_d_sherman
I like everything that you've said, but of particular interest is this:

 _Change of state is equivalent mathematically to movement in a "dimension."_

I'd love to discover more information about that idea!

URLs and/or book recommendations about that topic would be most appreciated!

~~~
goldenkey
All the quantum numbers that define a particle can be looked at as values
within a dimension. It's possible that some are redundant in the standard
model, or not the greatest basis.

The simplest states that exist in the largest dimensions we know of are the
position/spacetime states. String theory posits that these tiny looped strings
of matter exist in dimensions that are on the order of a Planck length. They
just loop around. If you went from where you are to where I am, you'll have
looped around those tiny string dimensions a crap ton of times.

~~~
peter_d_sherman
Utterly fascinating!

Write a blog, please!

I'd read it...

~~~
goldenkey
[https://churchofthought.org](https://churchofthought.org) :-)

------
sandworm101
This isn't the "reversible computing" I covered at school. To me, a reversible
computer was one that could absorb energy to perform a calculation that could
be recorded. Then the machine would reverse, releasing the energy as it
returned to a ground state. This would effectively mean calculations would
happen without energy consumption. The whole thing bordered on perpetual
motion, calculations happening without energy loss.

Electronic computers would not be good at this. They churn up too much heat.
The only practical path would be something mechanical, powered by gravity...
or a quantum object bouncing through a tiny rat's maze to solve a problem
before returning to the ground state.

~~~
Whirl
My understanding is that it doesn't have to absorb energy to perform
calculations. The reversibility depends on keeping track of all of the
correlations in the system.

Michael Frank has a decent lecture on the subject:
[https://youtu.be/JFRx6cvzd3U](https://youtu.be/JFRx6cvzd3U)

------
corysama
Interestingly, programs for quantum computers must be reversible.

[https://physics.stackexchange.com/questions/392414/again-why-do-quantum-computations-need-to-be-reversible](https://physics.stackexchange.com/questions/392414/again-why-do-quantum-computations-need-to-be-reversible)

~~~
Strilanc
Sort of. Measurement is not reversible, and often the most efficient method
for uncomputing an intermediate value in a quantum computation is to measure
it in the frequency domain and then perform a cheap fixup depending on what
you measured [1]:

> _Measurement based uncomputation intrinsically generates entropy (due to the
> X basis measurements), but it uses significantly fewer operations. So,
> ironically, we will optimize the energy usage of quantum computations not by
> staying pure to our reversible Landauer-less roots but instead by using an
> irreversible form of uncomputation that generates entropy._

1: [https://algassert.com/post/1905](https://algassert.com/post/1905)

------
m463
I've always wondered about this since I read about Richard Feynman
investigating reversible computing.

I think it was too early for people to value or adopt the idea, since Moore's
law was on trajectory and nothing was slowing it down.

It's interesting (and good) to see that maybe it will be what helps take us
further up the curve.

------
jonbaer
I think Michael P. Frank is one of the more knowledgeable people working on
the subject; his Stanford talk is extremely informative ...
[https://www.youtube.com/watch?v=IQZ_bQbxSXk](https://www.youtube.com/watch?v=IQZ_bQbxSXk)

------
saurik
So what I never got about how this works: while it makes sense to me that
there would be less energy usage (due to reduced heat dissipation) from
avoiding any information loss through a calculation, at the end of the
calculation the "answer" you get back is going to be much larger, particularly
for increasingly large and compounded workloads (as it has to include all of
the information required to reverse all of the states)... clearly you don't
want all of that information, so _someone_ has to destroy all of that
information and incur the heat cost of eating all the electricity used to
store it, right?

If this isn't quite right, and at the end I'm allowed to just pool all those
circuits together to get some energy to run the next computation (which seems
to be implied by some of these descriptions), why isn't that the case for
other destructive computations? Like, every time I have an AND gate, I take
the waste electricity from 3 of the 4 cases and store it to power subsequent
NOT gates (which, half the time, need electricity). FWIW, I'd be willing to
believe this is equivalent to "reversible computing", as that waste
electricity is very similar to the information I need to do the reversal -- it
would require another wire as input as a pure source of power and a separate
answer wire to be directly analogous, but the principle is the same: I'm just
combining some of the steps, both figuratively and literally. It is just that
this is never how it is described in any of the articles I ever find on the
subject, which seem to want to maintain the reversibility for large sequences
of gates and even entire computations, as opposed to merely as a
hyper-optimization of individual gates in what is otherwise pretty much a
classic architecture.

It just seems like if I'm going to do some kind of energy storage (I've seen
analogies in these articles talking about springs and flywheels), I can just
do that to all my waste electricity and then tap it for all/any of my later
electricity needs. Then all I have to do is bound the maximum amount of power
I need for a particular circuit (such as a single instruction on a von Neumann
CPU) that is accidentally entirely deficit (due to the input being "just
right" to cause, like, maximal numbers of on circuits being passed through NOT
gates and internal buffers), so I know how much I need to pre-charge the
capacitor before I go through each cycle; and then I only have to truly
dissipate electricity if the input for the calculation is itself expressed as
a large amount of energy and my battery is already full. And I can do all of
this without needing to actually care about how to do "logical operations"
using "reversible operators".

(But again, to be clear: this is a question from me about what I'm not
understanding and what isn't being expressed in these articles I keep reading,
not me asserting that this is how it should work and that everyone is dumb:
this is my mental model, and I'd love it if anyone were willing to take the
time to explain to me why _I'm_ dumb ;P)

~~~
traverseda
> clearly you don't want all of that information, so someone has to destroy
> all of that information and incur the heat cost of eating all that
> electricity used to store it, right?

My understanding is that you _don't_ destroy the information, you _copy_ the
relevant parts and then reverse the original computation. More specifically
you only copy the parts of the computation that are relevant to you, you don't
copy the entire (much larger) intermediate state.

So you are only ever doing truly destructive operations on a small buffer at
the beginning and end of each "clock cycle", instead of for every logic
operation.

Alternatively a fully-reversible circuit could look more like a mechanical
clock, you can return it to any state just by moving the key
backwards/forwards far enough. It gets more complicated when you start
including any kind of IO, but in theory you'd still only be
copying/overwriting bits every so often, with a normal circuit being cyclical.
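A minimal sketch of that copy-then-uncompute pattern, with a toy reversible step standing in for a real circuit (the function names are hypothetical): run the computation forward, copy out only the answer bit, then run it backward so every intermediate bit returns to its starting state.

```python
def forward(state):
    # A toy reversible step: swap two "wires" and flip the third bit.
    a, b, c = state
    return (b, a, c ^ 1)

def backward(state):
    # Exact inverse of forward() (here it happens to be its own inverse).
    a, b, c = state
    return (b, a, c ^ 1)

start = (1, 0, 0)
end = forward(start)
answer = end[2]  # copy only the bit we care about: this copy is the one
                 # destructive act, and it costs one bit, not the whole trace
assert backward(end) == start  # all intermediate state fully restored
```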

~~~
api
I just got a mental image of a network of springs or something snapping back
into place after delivering something.

~~~
abecedarius
Pretty much:
[http://www.zyvex.com/nanotech/mechano.html](http://www.zyvex.com/nanotech/mechano.html)
(Yes, the same Merkle as in Merkle trees.)

------
peter_d_sherman
A "reversible" virtual machine (software) can be created without too much
difficulty -- at the expense of using a lot of memory, and only being
reversible for as many steps as there are state changes stored in that memory.

Simple example: an assembly language instruction changes a memory location to
a new value, mov [eax], $FFFFFFFF, where eax contains the address where the
$FFFFFFFF should go.

But now your virtual machine intercepts that instruction before it executes.

Before it stores the value to the address, it saves a copy of the value that
was at that address BEFORE the mov/store -- in OTHER MEMORY (as a stack data
structure, since this could occur multiple times for individual memory
locations...).

If/when eax changed, that would also have to be stored in OTHER MEMORY...

To reverse it then, we restore eax if it changed, then look at the memory
location referenced, and restore the previous value (from the stack in other
memory that we buffered it in).

With this approach a virtual machine can be created that can go back (rewind)
as many instructions as you have memory for.

A machine like this also must buffer the state of things that a program
interacts with, such as files and sockets, and be able to restore previous
states of those in steps.

It's a bit complex... but it can be done in software, and software alone...
but again, you couldn't reverse a program that had run infinitely long, you
could only reverse for as many steps as fit into the other memory area that
you buffered previous states in...

Also, drawback: It would be much slower than a regular virtual machine...

But it could be done...
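A minimal Python sketch of the scheme described above (the class and method names are hypothetical): before each store, push the destination's old contents onto an undo stack, so any number of steps can be rewound, bounded only by the stack's memory.

```python
class ReversibleVM:
    """Toy VM: a 'mov' saves the overwritten value so it can be rewound."""

    def __init__(self):
        self.mem = {}    # simulated memory: address -> value
        self.undo = []   # the "OTHER MEMORY" stack of prior states

    def mov(self, addr, value):
        # Save what the address held BEFORE the store (None = was empty).
        self.undo.append((addr, self.mem.get(addr)))
        self.mem[addr] = value

    def rewind(self):
        # Pop the most recent change and restore the previous value.
        addr, old = self.undo.pop()
        if old is None:
            del self.mem[addr]
        else:
            self.mem[addr] = old

vm = ReversibleVM()
vm.mov(0x1000, 0xFFFFFFFF)
vm.mov(0x1000, 0xDEADBEEF)  # overwrite the same location twice
vm.rewind()
assert vm.mem[0x1000] == 0xFFFFFFFF  # first store restored
vm.rewind()
assert 0x1000 not in vm.mem          # back to the initial state
```

The stack-per-change bookkeeping is exactly why this is memory-hungry: every destructive store costs an extra saved word, and register changes and I/O would need the same treatment.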

~~~
OkGoDoIt
But all of those steps seem like they would take additional computing power,
and therefore generate more heat. I fail to see how what you describe uses
less power. It seems somewhat akin to emulating a low-power processor on a
very high-powered system: the emulation itself ends up using more processor
cycles and power than the original hardware would have.

