
Limits of computation - diziet
https://en.wikipedia.org/wiki/Limits_of_computation
======
smeyer
>Landauer's principle defines a lower theoretical limit for energy
consumption: kT ln 2 joules consumed per irreversible state change, where k is
the Boltzmann constant and T is the operating temperature of the computer.[2]
Reversible computing is not subject to this lower bound. T cannot, even in
theory, be made lower than 3 kelvins, the approximate temperature of the
cosmic microwave background radiation, without spending more energy on cooling
than is saved in computation.

Well, technically, you just have to wait as the universe continues expanding
if you want the temperature of the CMB to drop.
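
For a sense of scale, the limit itself is a quick one-liner (just the textbook
constants, nothing beyond what the quote already states):

```python
# Landauer limit: E = k * T * ln(2) per irreversible bit erasure.
import math

k = 1.380649e-23  # Boltzmann constant, J/K

for T in (300.0, 3.0):  # room temperature vs. roughly the CMB temperature
    E = k * T * math.log(2)
    print(f"T = {T:5.1f} K  ->  {E:.2e} J per erased bit")

# ~2.9e-21 J at 300 K, ~2.9e-23 J at 3 K: cooling all the way down to the CMB
# only buys about two orders of magnitude per irreversible operation.
```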

~~~
jakeogh
Black Hole Farming:
[https://www.youtube.com/watch?v=Qam5BkXIEhQ](https://www.youtube.com/watch?v=Qam5BkXIEhQ)

------
dorgo
There are noncomputable functions. For example:
[https://en.wikipedia.org/wiki/Busy_beaver](https://en.wikipedia.org/wiki/Busy_beaver)

~~~
csl
If anyone is interested, I attempted to plot the first few Busy Beaver
iterations: [https://github.com/cslarsen/busy-
beaver](https://github.com/cslarsen/busy-beaver)

~~~
dorgo
>Busy beaver machines are, by definition, non-halting machines

Is this an error? I thought a busy beaver must halt?

~~~
csl
Thanks, I see now that the whole section is very clumsily written.

IIRC, a candidate machine may halt or not. You only care about those that _do_
halt, though, and the _champion_ is the one that leaves the greatest number of
1s on the tape. The whole problem is determining whether a given machine will
halt. By the undecidability of the halting problem, that can sometimes be
proven on a case-by-case basis, but not in general (i.e., you cannot write an
algorithm that decides it for every machine).

So you resort to heuristics. E.g., if a machine has no transition into the
halt state anywhere in its table, you know it can never halt. But as the
machines grow, you would need an ever-growing collection of such heuristics to
cover every case, which is another way of seeing why the function is not
computable.
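
To make the heuristic part concrete, here's a toy sketch (a hypothetical
encoding for illustration, not necessarily what the repo above uses): a tiny
simulator plus the trivial "no transition into the halt state" check, run on
the 2-state, 2-symbol champion.

```python
# A machine is a dict mapping (state, symbol) -> (write, move, next_state),
# with 'H' as the halt state.

def never_halts_heuristic(table):
    """Cheapest possible heuristic: if no transition ever enters the halt
    state, the machine certainly never halts."""
    return all(nxt != 'H' for (_, _, nxt) in table.values())

def run(table, max_steps=10_000):
    """Simulate on a blank tape; return the number of 1s if the machine halts
    within the step bound, else None (which by itself proves nothing)."""
    tape, pos, state = {}, 0, 'A'
    for _ in range(max_steps):
        if state == 'H':
            return sum(tape.values())
        write, move, state = table[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
    return None

# The 2-state, 2-symbol champion: halts after 6 steps with four 1s on the tape.
bb2 = {
    ('A', 0): (1, 'R', 'B'), ('A', 1): (1, 'L', 'B'),
    ('B', 0): (1, 'L', 'A'), ('B', 1): (1, 'R', 'H'),
}
print(never_halts_heuristic(bb2))  # False: it does have a halt transition
print(run(bb2))                    # 4
```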

------
pmontra
About reversible computation and the Landauer limit:
[http://www.nipslab.org/node/5097](http://www.nipslab.org/node/5097)

~~~
vilhelm_s
I think the key paragraph from the paper is this:

> We would like to emphasize that the output of our logic gate can be fed
> directly as an input to a second similar device (as shown below) and so on,
> to perform all the calculations desired. Clearly, our logic gate is a
> combinational device; thus, the removal of the inputs makes the system
> revert to the initial state S0 (regardless of whether the final state was S0
> or S1). If we want to remember the final state, we need to couple this
> device to a sequential device where a Landauer reset might be required and a
> minimum dissipation of k_B T log 2 is needed.

In other words, the Landauer limit applies to state machines, but not to
combinational logic blocks. I guess in retrospect this should not be
surprising: the derivation of the principle considers a computing machine
which is in thermal equilibrium with its environment, so implicitly we model
the machine as having some memory unit which has time to thermalize after each
operation.
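
A toy way to put numbers on that distinction (my own idealization, not the
model in the paper): treat the combinational steps as free, and charge
k_B T ln 2 only when the stored bit has to be irreversibly overwritten.

```python
# Toy Landauer bookkeeping: combinational evaluations cost ~0 because the gate
# reverts to S0 once its inputs are removed; only resetting the sequential
# latch to a *different* value is charged the minimum k*T*ln(2).
import math

k, T = 1.380649e-23, 300.0      # Boltzmann constant (J/K), temperature (K)
E_BIT = k * T * math.log(2)     # minimum dissipation per irreversible reset

def run_pipeline(inputs, stored=0):
    """Feed bits through a (free) combinational NOT gate and latch each
    result, charging only the irreversible overwrites of the latch."""
    dissipated = 0.0
    for x in inputs:
        y = 1 - x               # combinational step: no Landauer cost
        if y != stored:         # sequential step: a Landauer reset
            dissipated += E_BIT
            stored = y
    return dissipated

print(run_pipeline([0, 1, 1, 0]))  # only the latch's bit flips are charged
```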

------
rorosaurus
Fascinating analysis! The section "Building devices that approach physical
limits" reminds me of a rock Kurzweil mentions in "The Singularity is Near"
[1]. Although it's not at a cosmic scale, it's a similarly interesting
exploration of the limits of computation!

>“How Smart Is a Rock? To appreciate the feasibility of computing with no
energy and no heat, consider the computation that takes place in an ordinary
rock. Although it may appear that nothing much is going on inside a rock, the
approximately 10^25 (ten trillion trillion) atoms in a kilogram of matter are
actually extremely active. Despite the apparent solidity of the object, the
atoms are all in motion, sharing electrons back and forth, changing particle
spins, and generating rapidly moving electromagnetic fields. All of this
activity represents computation, even if not very meaningfully organized.
We’ve already shown that atoms can store information at a density of greater
than one bit per atom, such as in computing systems built from nuclear
magnetic-resonance devices. University of Oklahoma researchers stored 1,024
bits in the magnetic interactions of the protons of a single molecule
containing nineteen hydrogen atoms. Thus, the state of the rock at any one
moment represents at least 10^27 bits of memory.”

[1] [https://www.goodreads.com/quotes/1284270-how-smart-is-a-
rock...](https://www.goodreads.com/quotes/1284270-how-smart-is-a-rock-to-
appreciate-the-feasibility-of)
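
For what it's worth, the arithmetic in the quote roughly checks out if you
take his round figures at face value (just a back-of-envelope, not an
endorsement of the framing):

```python
# Kurzweil's figures: ~10^25 atoms per kilogram, and the cited NMR result of
# 1,024 bits stored across the protons of 19 hydrogen atoms.
atoms_per_kg = 1e25
bits_per_atom = 1024 / 19          # ~54 bits per atom
print(f"{atoms_per_kg * bits_per_atom:.1e} bits")
# ~5.4e+26, within a factor of two of his "at least 10^27 bits"
```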

~~~
philipkglass
That seems to deeply confuse "a thing that can be modeled by computer" with "a
computer." It's trying to commensurate the incommensurable. How much more
efficient is a rock than your phone when it comes to computing the sum of two
64 bit integers? How much more efficient when it comes to storing and
retrieving a megabit of JPEG data? I'd accept answers in the form of joules
per operation.

The very questions are faulty. A factory-stock rock is not a programmable
logic _or_ memory device. If it computes at all, it only computes being-a-rock.

This rock analogy is the sort of sloppy thinking that takes someone from "it
takes at least 38 petaflops to simulate a conscious human brain" (currently
unknown, but not _obviously_ incorrect) to "the human brain is a petaflop-
class supercomputer" (WTF, no, category error).

~~~
the8472
It's basically saying that the universe is a special kind of computer that
computes state transitions according to the rules of physics, presumably at
the Planck scale, which is far beyond what our simplified models of said rules
can manage.

This view of the universe seems to be fairly important and is taken seriously,
since black hole thermodynamics, gravity as an entropic force, and similar
concepts are derived from it.

~~~
philipkglass
The digital physics everything-is-computation view of the universe is also not
obviously wrong; it is taken seriously, as you say. But even if the universe
is computational, there's presently no example of running a human-defined
instruction sequence at that fundamental level.

The quote I was reacting to emphasizes feasibility:

"To appreciate the feasibility of computing with no energy and no heat,
consider the computation that takes place in an ordinary rock."

There's precious little to suggest _feasible_ computing with a rock. (Or to
suggest other _feasible_ ways to run computations of interest to humans with
no input energy and no waste.) It's like reading "to appreciate the
feasibility of supplying civilization's electricity sustainably, consider the
energy released when two neutron stars collide." Both are strong contenders
for "among the least feasible engineering programs not yet proven outright
_impossible_."

~~~
rorosaurus
This thread might not be particularly relevant for you. The parent link is
discussing theoretical computational capacity at a cosmic level - which
"presently has no example of running a human-defined instruction sequence at
that level".

------
kangxitenant
[https://arxiv.org/abs/1301.4504](https://arxiv.org/abs/1301.4504) quantum
computational complexity prevents Hawking radiation from being decoded before
the black hole evaporates!

------
prashnts
> A cold degenerate star could conceivably be used as a giant data storage
> device, [...] purposes.

The link cited for this [1] is a fascinating email thread/essay on what they
call "Femtotech". I don't even understand most of the stuff, but it seems we're
far, far from it.

[1]
[https://web.archive.org/web/20041025030505/http://www.cs.usu...](https://web.archive.org/web/20041025030505/http://www.cs.usu.edu/~degaris/essays/femtotech.html)

~~~
projectorlochsa
I remember watching some documentaries and reading some articles, and I
remember Hugo de Garis talking about femtotech. I thought he was a pioneer in
the technology, but it seems he's just a bystander.

He and Ben Goertzel present themselves as pioneers but never seem to produce
anything tangible.

