
The Future of Computing Depends on Making It Reversible (2017) - Darmody
https://spectrum.ieee.org/computing/hardware/the-future-of-computing-depends-on-making-it-reversible
======
philipkglass
_There’s not much time left to develop reversible machines, because progress
in conventional semiconductor technology could grind to a halt soon. And if it
does, the industry could stagnate, making forward progress that much more
difficult. So the time is indeed ripe now to pursue this technology, as it
will probably take at least a decade for reversible computers to become
practical._

I believe the opposite: companies should wait until performance improvements
in mainstream processors have really ground to a halt before investing
substantial resources in reversible computing. Otherwise it will be a
financial sinkhole like many prior attempts at exotic computing hardware that
couldn't make a sustainable profit.

The only path suggested in the article that could be pursued right now
involves superconducting devices, requiring cryogenic cooling that will never
work for phones, tablets, or laptops. A company would be trying to catch up
with a _tremendous_ head start enjoyed by CMOS plus conceding that the new
hardware, even fully realized, will never address markets as broad as CMOS.

------
jayd16
Maybe it's just my distaste for the more whimsical topics in thermodynamics, but
is this a roundabout way of saying that any circuit without entropy loss is also
reversible?

As the article states, you can reverse intermediate computational values to
recover energy, so it seems that reversibility isn't really a necessary feature
of computing so much as a feature of waste-free circuits.

EDIT: Why the down votes? I'm asking a question here.

~~~
OneWordSoln
>> Why the down votes?

Let me offer a theory:

It seems that we need a 'Law of Anonymous Social Systems Entropy' (not to be
confused with the idea of 'Social Entropy'). It would state that every social
system (including social media platforms) tends, over time, to be infiltrated
and negatively affected by groups of negative human beings, who are less
constrained by civility and less tolerant of foreign ideas or groups, whenever
the system allows anonymity. This is likely because anonymous social capital
(e.g. this site's user reputation) is not real enough to activate normal human
self-censorship. As well, the disingenuous can game the system to gain the
social capital necessary to damage it. Note that the negative group need not be
in the majority to do damage.

It has taken longer for HN to reflect this truth but it appears that a
significant percentage of this site's commentariat has finally become more and
more insular to ideas -- and even questions -- more borderline than the ideas
of those who hold its social capital (i.e. those with the ability to downvote
a post).

In theory, anonymity is great for allowing free-thinking people to express
their controversial views without worry for reprisal from other groups, but in
reality that gives cover to the people with purely destructive, anti-social,
anti-progressive and intolerant tendencies. And those attitudes and their
accompanying behaviors have a much greater negative system effect than those
of well-intentioned users.

The fact that our entire world is now connected by effectively anonymous
social media means that the above concept has crept into every aspect of our
lives, and has actually cost lives in places such as Myanmar.

(Note that I don't have the karma to up or down vote.)

~~~
TeMPOraL
That reminds me of Eliezer's model of evaporative cooling of beliefs - and, by
extension, communities. TL;DR: as trolls, or random chance, slightly lowers
the standard of discussion, the first people to get fed up and leave the
community are the ones with above-average quality contributions. When they
leave, the average quality goes down. The process repeats until the community
is garbage.

[https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporativ...](https://www.lesswrong.com/posts/ZQG9cwKbct2LtmL3p/evaporative-cooling-of-group-beliefs)

------
Darmody
An interesting video about it:
[https://www.youtube.com/watch?v=rVmZTGeIwnc](https://www.youtube.com/watch?v=rVmZTGeIwnc)

~~~
teekert
Interesting but it doesn't really explain how a software trick can help us
break the Landauer limit...

I'd love for Isaac Arthur to tackle this, he uses the Landauer limit a lot to
express things like "How many human minds can run on the energy of a black
hole some 10^53 year into the future?"

~~~
LandR
His channel is really fantastic.

I love his civilizations at the end of time series.

~~~
teekert
I sent Isaac an email; he responded that he would look into it.

------
teekert
So, after some reading and watching... Is this like using balls in space to do
computation? It takes no energy to guide them down paths via elastic collisions,
but it does take energy to move them actively and to reset them to their base
positions. Then again, resetting could also be done by a fully elastic collision
that sends the ball back down its path.

But then, it must still take energy to move the switches that send the balls in
the desired directions, right? Perhaps that energy is much smaller, but it's not
zero.

I am not a physicist in case this post was not glaringly obvious.
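For what it's worth, the billiard-ball picture corresponds to reversible logic gates like the Toffoli (CCNOT) gate, which Fredkin and Toffoli's billiard-ball model can implement. A minimal sketch in Python (my own illustration, not from the article):

```python
def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips c only when both controls a and b are 1.
    It is universal for reversible Boolean logic."""
    return a, b, c ^ (a & b)

# Applying the gate twice returns the original bits: it is its own inverse.
for bits in [(0, 0, 1), (1, 1, 0), (1, 0, 1)]:
    assert toffoli(*toffoli(*bits)) == bits

# AND can be embedded reversibly: with c = 0 the third output is a AND b,
# and the inputs are carried along, so no information is erased.
_, _, and_result = toffoli(1, 1, 0)
print(and_result)  # 1
```

The point is that nothing is thrown away: every output triple maps back to exactly one input triple, so in principle no Landauer heat is required.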

~~~
MayeulC
One of the possibilities for adiabatic computing I was interested in some time
ago was to use an oscillating power supply. You still lose some energy due to
the interconnect resistance, but much less, since you're only sustaining the
oscillations (the charges don't get dissipated when returning to the ground,
they just move back).

Then, it's "only" a matter of deciding which part of the circuit will be
powered for the next oscillation, and the switching can be done when vds=0.

I haven't had the occasion to think about this as much as I wanted, though, so
the above explanation might be incorrect. Hopefully this gives you a better
idea of the concept (which there are many ways to approach).

So yeah, there are losses (mostly due to circuit resistance), but the
"switches" are capacitors, which are just "springs" in the electrical sense.
You can get back your energy if done in a clever manner, but we typically just
short them to discharge (switch) them :)

So, compare the energy required to compress and extend a spring if:

  * You oscillate with it

  * You compress it, fully release it, then compress it again.

This is a very apt analogy, I think, up to the mathematical level.
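To put rough numbers on the spring analogy: abruptly charging a node capacitance through a resistor always dissipates half the supplied energy, independent of the resistance, while ramping the supply slowly (the oscillating-supply idea above) dissipates only a fraction of order RC/T. A back-of-the-envelope sketch in Python, with all component values assumed purely for illustration:

```python
# Energy dissipated charging a node capacitance C to voltage V through
# resistance R. All values below are assumed, not taken from the article.
C = 1e-15   # 1 fF, a gate-scale capacitance
V = 1.0     # logic swing, volts
R = 1e3     # 1 kOhm channel/interconnect resistance

# Conventional (abrupt) switching: half the supplied energy is lost in the
# resistance, no matter how small R is.
E_abrupt = 0.5 * C * V**2

# Adiabatic charging (ramp over time T >> RC): dissipation scales as RC/T,
# so it shrinks without bound as you switch more slowly.
T = 1e-9    # 1 ns ramp
E_adiabatic = (R * C / T) * C * V**2

print(E_abrupt, E_adiabatic)  # the adiabatic loss is ~500x smaller here
```

That trade (slower switching for lower dissipation) is exactly the oscillate-with-the-spring case versus the compress-and-release case.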

------
3dfxiter
Wonder if progress on reversible computing tech is being stymied by the NSA's
own ambitions in this area? They have an annoying habit of plucking promising
talent out of non-classified R&D environments that they feel would be useful
for maintaining the computational competitive edge that they need in order to
keep ahead of the rest of the world. They have done this for decades in areas
like VLSI design and cryptography.

------
markbnj
> A conventional computer is, essentially, an expensive electric heater that
> happens to perform a small amount of computation as a side effect.

Is this really a fair analogy? Isn't it precisely the opposite? The computer
performs computation and generates heat as a side effect?

~~~
Sharlin
Given how far we are from the Landauer limit, the comparison is apt. The
amount of negentropy gained by computation even in the best case is utterly
dwarfed by the increase of entropy in the form of waste heat.

~~~
jazzyjackson
I was having this conversation recently, trying to think of what about
computation produces waste heat - how many watts go to computation vs. how many
are waste due to inefficiency.

But we couldn't put our finger on where this inefficiency comes from. Is it the
silicon heating up due to resistance? Or is there some analog of friction in
switching the state of a flip-flop, such that the state change itself generates
heat?

~~~
Sharlin
In our current hardware it is almost entirely electrical resistance. But even
in a theoretical 100% superconducting computer there’s a fundamental lower
bound to heat dissipated by every _bit-erasing_ operation; this is the
Landauer limit. Entropy increases when we forget information: we can’t reverse
the computation, giving time a direction (the so-called thermodynamical arrow
of time).

You can think of it as conservation of energy: if you have two electric
signals and put them through a gate with only one output, some of the total
energy must be dumped to balance the books. If a computation is completely
reversible, so that we can always recover the input from the output, the
Landauer limit does not apply, though of course other hardware inefficiencies
may still be present.
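The limit itself is easy to put a number on: erasing one bit at temperature T must dissipate at least kT ln 2. A quick Python calculation (the femtojoule CMOS figure is an assumed order of magnitude for comparison, not from the article):

```python
import math

# Landauer limit: minimum heat dissipated per erased bit, E = k_B * T * ln(2).
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K

E_bit = k_B * T * math.log(2)
print(f"{E_bit:.3e} J per erased bit")   # ~2.9e-21 J

# Assumed comparison point: switching energies in present-day CMOS are on
# the order of femtojoules, several orders of magnitude above the limit.
print(1e-15 / E_bit)
```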

~~~
posterboy
Actually, cmos input stages are high impedance, i.e. high resistance, and
therefore have close to no current flowing between state changes. The output
stage should be very low impedance if driving loads, or very high impedance if
turned of. Indeed, switching between states, saturating the gates with enough
electrons is the problem that generates heat, when there is voltage and
resistance visible for the following node. At that scale, the small resistance
of the connections plays a role, too, sure.

------
_bxg1
Very cool. Still, it seems like other cooling technologies (and more energy-
efficient chip design) will remain lower-hanging fruit than this approach for
some time. Shrinking transistors isn't the only way to increase speed.

------
abecedarius
To learn more, the _Feynman Lectures on Computation_ is one good older source.

------
jeromebaek
Visual Studio 2017 has a "reverse time" feature in its debugger. It works
surprisingly well.

------
traverseda
Can you even build a reversible logic gate using flip-flops?

------
jancsika
So like infinite undo on the CPU level?

------
corysama
A bonus feature of reversible computing is that a lot of garbage collection
becomes a lot easier. Do a bunch of computation, copy off the results you need
long term, unwind the computation. No need to track the unwound values’
lifetimes.

~~~
_bxg1
I don't think this is analogous to immutable data structures.

~~~
corysama
That's not what I'm saying.

Execute instructions. Some of those instructions effectively allocate memory.
Reverse execution. Running backwards undoes the allocations.
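This is essentially Bennett's compute/copy/uncompute trick. A toy Python sketch (my own, with an explicit history list standing in for reversible hardware):

```python
def forward(x, history):
    """Toy 'reversible' computation: each step records what is needed
    to undo it, so no information is erased along the way."""
    y = x + 5
    history.append(("add", 5))
    y = y * 3
    history.append(("mul", 3))
    return y

def backward(y, history):
    """Unwind the computation step by step, freeing all intermediates."""
    while history:
        op, v = history.pop()
        y = y // v if op == "mul" else y - v
    return y

history = []
result = forward(7, history)          # 36
saved = result                        # copy off what we need long-term
restored = backward(result, history)  # rewind: history is drained
assert restored == 7 and history == []
print(saved)  # 36
```

All the intermediate state (the history entries) disappears in the unwind, which is the "free garbage collection": nothing allocated during the forward pass needs its lifetime tracked, because running backwards deallocates it in reverse order.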

~~~
klyrs
... at the cost of erasing the result, and taking time proportional to the
original computation? Clever optimization, but the net result seems suboptimal

~~~
corysama
The idea is that eventually we'll have to go reversible anyway. Computation
will have to run twice in order to run at more than 2X the clock speed of
running once. Getting easy garbage collection is a side effect, not a goal of
the optimization.

And, the result will have to be copied off before rewinding.

------
noswald18
This is a great idea, but it is contingent on advances in storage. Perhaps
storing information in a more biological way (utilizing DNA, for example)
instead of in silicon is the way forward.

