
Physical Limits to Computation (2000) [pdf] - ignoramous
http://arxiv.org/pdf/quant-ph/9908043v3.pdf
======
Sniffnoy
A note -- if you're linking to arXiv, it's better to link to the abstract
([http://arxiv.org/abs/quant-ph/9908043](http://arxiv.org/abs/quant-ph/9908043), or
[http://arxiv.org/abs/quant-ph/9908043v3](http://arxiv.org/abs/quant-ph/9908043v3) for this particular
version) rather than directly to the PDF. From the abstract, one can easily
click through to the PDF; not so the reverse. And the abstract allows one to
do things like see different versions of the paper, search for other things by
the same authors, etc. Thank you!

~~~
ignoramous
I thought about it at first, then decided against it. I wouldn't mind a
moderator changing the link to point to the abstract, if that's the norm.

If anyone wants another overview besides the abstract, please refer to:
[http://en.wikipedia.org/wiki/Limits_to_computation](http://en.wikipedia.org/wiki/Limits_to_computation)

~~~
p1esk
Why did you decide against it?

~~~
ignoramous
Primarily because I come across a lot of non-html content like presentations
that don't link to the abstract, but to the slides themselves.

For instance,

hn-link:
[https://news.ycombinator.com/item?id=9631580](https://news.ycombinator.com/item?id=9631580)

abstract: [http://velocityconf.com/devops-web-performance-2015/public/s...](http://velocityconf.com/devops-web-performance-2015/public/schedule/detail/42385)

------
lisper
This is a terrific paper, not just because of the titular subject matter, but
because it's a really great overview of physics in general, and how various
core theories (quantum mechanics, relativity, information theory,
thermodynamics, and even cosmology) relate to each other.

BTW, there's a simpler way to calculate a physical upper bound on computation
(though this method yields a bound that is not as tight): model every
elementary particle in the universe as a state machine that advances at the
Planck frequency (i.e. the inverse of the Planck time). The number of
elementary particles times the number of Planck times between the big bang and
the heat death of the universe puts an upper bound on the number of states
that such a computer could enumerate. The number turns out to be surprisingly
small, about 2^500 or so. Among other surprising consequences, this shows
that, if you had the ultimate data compression algorithm, any state computable
by this universe could be compressed down to ~500 bits. Note, however, that if
you could do that, you could compute Chaitin's omega and solve the halting
problem, so optimal compression is impossible. Look up Chaitin's theorem if
you want to dive into that rabbit hole.
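A back-of-envelope sketch of that estimate (the particle count, universe age, and Planck time below are rough assumed round numbers, and this uses the current age of the universe rather than the time to heat death, so the exponent lands a bit under 500):

```python
import math

# Rough order-of-magnitude inputs (assumed values, not from the paper):
PARTICLES = 1e80        # estimated elementary particles in the observable universe
AGE_SECONDS = 4.35e17   # ~13.8 billion years, current age of the universe
PLANCK_TIME = 5.39e-44  # Planck time in seconds

# Each particle modeled as a state machine ticking once per Planck time.
ticks = AGE_SECONDS / PLANCK_TIME              # Planck-frequency steps so far
bits = math.log2(PARTICLES) + math.log2(ticks) # log2 of total particle-ticks

print(f"upper bound: ~2^{bits:.0f} enumerable states")
```

Running this gives an exponent in the high 400s; stretching the time horizon toward heat death only pushes it modestly higher, since the count grows logarithmically in time.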

Fun stuff.

~~~
VLM
"The number turns out to be surprisingly small"

Forgot to add in communication networks between the isolated I/O bound
"supercomputers". Given one set of massively parallel computers with ID
numbers 0 to something big, reversing the wiring of the peristaltic array
would result in a dramatically different output.

It's kind of like a hashing function: feeding in different data (by changing
the "wiring") results in different outputs.

That also leads to some interesting caching-delay-type problems if processor
42 halts until it gets data from processor 7, but due to cosmic expansion
they're out of each other's light cones, forever. So Chaitin's constant would
vary based on how fast the universe expands and what fraction lives outside
any given part's light cone.

Fun stuff to do with the constant, assuming it actually exists (as opposed to
a mere concept) and can be calculated:

Given an NSA level of network spying as a resource: if you know that 5% of raw
(presumably non-executable) data should halt when executed, and that 100% or 0%
of virus payloads halt (uh, yeah, whatever), then if periodic random sampling of
supposedly non-executable, highly compressed video data halts at a 5% rate, it's
probably legit; if it halts at a 100% rate, you can set off the alarm bell that
something funky is happening right now online, like a worm trying to spread
using buffer overflows in video codec data or something like that. Virus
writers could defend against this, if they knew the constant, by making sure
their virus payload halted exactly 5% of the time to mimic real raw compressed
data. Cool as this Hollywood movie plot sounds, it would probably get
simplified into CSI-style "let's just enhance the pixels, ok?"

Also, the SETI people could probably do something interesting with it, although
I'm not clear exactly what. If it were calculated to be exactly 7/22 (LOL),
then an alien civilization repeatedly squirting 7/22 out to the world would
imply that civilization has aged beyond the "must be this tall to enter" bar
on the first-contact ride, or they're just really big fans of the reciprocal
of pi. The analogy is: if you really want to impress a pre-computer
civilization, give them the result of some ridiculous factoring problem that
would take Andorian-centuries or whatever to calculate by hand but can be
verified in minutes if you know what you're doing.

~~~
lisper
> Forgot to add in communication networks between the isolated I/O bound
> "supercomputers".

Nope. This model makes no assumptions about the actual computations being
performed at each particle. The particles could be communicating or not; it
doesn't matter. The total number of potentially physically computable states
is still the same.

------
harperlee
I would also recommend Richard Feynman's Lectures on Computation, which covers
some of these topics and is quite interesting!

------
ikeboy
See also NP-complete Problems and Physical Reality

[http://arxiv.org/abs/quant-ph/0502072](http://arxiv.org/abs/quant-ph/0502072)

------
hackerway
The limit to computers is also the limit to human brains.

------
Rainymood
>The question is, When?

Typo?

