
What is the theoretical limit of information density? - lpage
http://physics.stackexchange.com/questions/2281/maximum-theoretical-data-density
======
Steuard
The remarkable thing that this explanation doesn't really explore in depth is
that the maximum information (~entropy) contained in a region scales according
to its boundary _area_ rather than to its volume. That means that "maximum
information per unit volume" drops precipitously with size.[1]

Just for example, this allows you to calculate an _ultimate_ limit on Moore's
law of only ~800 years, for any possible computer functioning within the
bounds of the observable universe. As sketched below[1], the observable
universe can hold only about 10^123 bits of information. Processors currently
contain about 10^9 transistors (each of which has to be doing computations
with an independent bit of information to be useful), a factor of 10^114 less.
If Moore's law claims that this number doubles every 2 years, that means it
grows by a factor of 10^3 every 20. And 20x(114/3) is about 760 years. (A more
detailed calculation carried out in a paper by Krauss and Starkman at
[http://arxiv.org/abs/astro-ph/0404510](http://arxiv.org/abs/astro-ph/0404510)
came up with a limit of about 600 years.) That's almost frighteningly soon.
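The arithmetic above can be sketched directly. This is just the back-of-the-envelope version (~10^123 bits in the observable universe, ~10^9 transistors today, doubling every 2 years), not the more careful Krauss/Starkman calculation:

```python
import math

# Assumptions from the comment above:
universe_bits = 1e123         # holographic bound on the observable universe
current_bits = 1e9            # ~transistors per processor today
doubling_period_years = 2     # Moore's law doubling time

# Number of doublings needed to close the gap, and the years that takes.
doublings_needed = math.log2(universe_bits / current_bits)
years = doublings_needed * doubling_period_years

print(f"~{doublings_needed:.0f} doublings, ~{years:.0f} years")
```

This lands at roughly 760 years, matching the 20x(114/3) estimate in the comment.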

[1] As the link says, for a cubic centimeter (cc) of volume, the maximum
entropy is about 10^66 bits. But if you consider a cubic meter instead, you
find it can hold at most 10^70 bits, which comes out to 10^64 bits per cc! For
a cubic kilometer you get 10^76 bits, which is only 10^61 bits per cc. If the
solar system has radius ~10^13 m, it could hold at most 10^96 bits, or 10^51
bits/cc. And the whole observable universe (with radius ~5x10^26 m) could hold
at most 2.5x10^123 bits, or 10^38 bits/cc. That's remarkably less than the
direct one cc calculation! This behavior is exceedingly non-intuitive, at
least to me.
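A rough sketch of where those numbers come from, assuming the holographic bound S <= A/(4 l_P^2) (in nats) applied to spheres of the given radii. The exact numerical prefactor differs between treatments, so only the orders of magnitude should be trusted:

```python
import math

L_P = 1.616e-35  # Planck length in meters

def holographic_bound_bits(radius_m):
    """Max bits in a sphere: surface area over 4 l_P^2, converted nats -> bits."""
    area = 4 * math.pi * radius_m**2
    return area / (4 * L_P**2 * math.log(2))

for label, r in [("1 cc sphere", 6.2e-3),   # sphere with volume 1 cm^3
                 ("1 m^3 sphere", 0.62),
                 ("solar system", 1e13),
                 ("observable universe", 5e26)]:
    bits = holographic_bound_bits(r)
    print(f"{label:>20}: ~10^{math.log10(bits):.1f} bits")
```

The area scaling is visible directly: multiplying the radius by 100 multiplies the bound by 10^4, not 10^6.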

~~~
andrewflnr
So I take my cubic meter and divide into 100^3=10^6 1cm^3 segments, and store
10^66 bits in each one. This brings me to 10^72 bits in the 1m^3 volume. Does
something stop me from doing this? Or take it the other direction, store 10^66
bits in 10^6 little cubes and just stack them up. What gives?

~~~
Steuard
See, that's the fun part. If you try to pack in more information than given by
these limits, you inevitably form a black hole (and then you lose access to
those individual cm^3 regions, and your plan falls apart).

~~~
powertower
How so? Those 1cc cubes all have exactly the same density, and placing them
together does not affect/change the density of the object they form (the
1m cube).

For what you're saying to happen, they would have to be so massive and dense
(in the first place) that they would pull in any matter placed near them (to
then form a black hole).

But does that even hold true when you run the numbers to see whether that
"max-info" 1cc cube is anywhere near the mass required (given its volume of
1 cc) to form a black hole?

~~~
Steuard
One of the confusing issues here is that there is no single, constant
density necessary for the formation of a black hole. Instead, the
Schwarzschild radius of a (potential) black hole is proportional to its total
mass. That means that as you increase the mass, the actual radius of a sphere
of constant density grows only as m^(1/3) while its Schwarzschild radius grows
much faster, as m^1.

Thus, as you pile up more and more of your identical 1cc cubes, the size of
the pile will grow more slowly than the size of its Schwarzschild radius. As
soon as you add enough cubes for the Schwarzschild radius to exceed the actual
radius, the system must inevitably form a black hole.

I _think_ that this behavior is directly related to the information density
limits that we've been talking about, but certainly the end result is the
same: piling up lots of similar stuff in one place will eventually lead to
gravitational collapse.
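The crossover can be computed for an everyday choice of cube. As a hypothetical illustration, take identical 1 cc, 1 g cubes (water density): setting the Schwarzschild radius 2GNm/c^2 equal to the pile's physical radius (3Nm/(4*pi*rho))^(1/3) and solving for N gives

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
m = 1e-3        # mass of one cube, kg (1 g)
rho = 1000.0    # density, kg/m^3 (1 g/cc, water)

# N at which Schwarzschild radius = physical radius of the pile:
# (2*G*N*m/c^2)^3 = 3*N*m/(4*pi*rho)  =>  N^2 = 3*c^6/(32*pi*rho*G^3*m^2)
n_critical = math.sqrt(3 * c**6 / (32 * math.pi * rho * G**3 * m**2))
total_mass = n_critical * m
solar_masses = total_mass / 1.989e30

print(f"~{n_critical:.1e} cubes, ~{solar_masses:.1e} solar masses")
```

This comes out to roughly 10^8 solar masses, consistent with the familiar fact that a black hole whose mean density matches water is supermassive-black-hole sized.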

------
drakaal
Take a molecule, something that will "stay put", probably written on a "2d"
surface like graphene.

Graphene is composed of lots of little hexagons. Each side of the hexagon can
be broken and have an atom attached to it in "3d".

You have 6 sides and the angle break can go "up" or "down".

However, you can only use 3 of the sides, so that each hexagon carries its own
data and you can still tell the hexagons apart.

This gives you 3 positions in 3 states, Up, down, or Flat. 0.142 nanometers
per bond...

That's 27 states per hex, and 190 hexes per nanometer... 36,100 hexes per
square nanometer...

I'm sure I screwed up a calculation in there somewhere, but based on current
tech this is my answer to what it is possible to write. Now, reading it back
at any speed might be a bit harder... but hey, this is all theory, right?
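Running those numbers (the poster suspects an error, and the hex count does look off): with a C-C bond length of 0.142 nm, the area per hexagon in graphene is (3*sqrt(3)/2)*a^2, which gives about 19 hexagons per square nanometer rather than 36,100. A sketch, keeping the poster's assumed encoding of 3 usable bonds per hexagon with 3 states each:

```python
import math

a = 0.142  # C-C bond length in graphene, nm

# Area of one hexagonal cell in the graphene lattice (regular hexagon
# of side a): (3*sqrt(3)/2) * a^2.
area_per_hex = (3 * math.sqrt(3) / 2) * a**2   # nm^2
hexes_per_nm2 = 1 / area_per_hex

bits_per_hex = math.log2(27)                   # 3 bonds x 3 states = 27 states
bits_per_nm2 = hexes_per_nm2 * bits_per_hex

print(f"{hexes_per_nm2:.1f} hexes/nm^2, {bits_per_nm2:.0f} bits/nm^2")
```

So the scheme works out to roughly 90 bits per square nanometer, not tens of thousands of hexes.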

~~~
vilhelm_s
One cool idea (I saw it on Charles Stross' blog) is "diamond memory": use a
diamond crystal, with two different isotopes of carbon for 1 and 0 bits. This
also doesn't feel _too_ unimaginable, in theory. According to Wolfram Alpha[1]
this gives 1.75*10^23 bits (about 20 zettabytes) per cc.

[1]
[http://www.wolframalpha.com/input/?i=number%20of%20atoms%20i...](http://www.wolframalpha.com/input/?i=number%20of%20atoms%20in%20one%20cc%20of%20diamond)
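The Wolfram Alpha figure is easy to reproduce: one bit per carbon atom (12C vs 13C), so just count atoms in a cc of diamond:

```python
AVOGADRO = 6.022e23      # atoms per mole
DENSITY = 3.51           # g/cc, diamond
MOLAR_MASS = 12.01       # g/mol, carbon (natural isotope mix)

# One bit per atom: 12C = 0, 13C = 1.
atoms_per_cc = DENSITY / MOLAR_MASS * AVOGADRO
zettabytes_per_cc = atoms_per_cc / 8 / 1e21

print(f"~{atoms_per_cc:.2e} bits/cc, ~{zettabytes_per_cc:.0f} ZB/cc")
```

This matches the ~1.75*10^23 bits quoted above.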

------
vinceguidry
This assumes that all information is stored physically.

Let's say you get a telegram. The telegram itself can contain maybe a couple
of paragraphs of text, so you might think the bandwidth of the channel is at
most a kilobyte or so. But there are plenty of other factors that can raise
the amount of information conveyed. Say you get one saying "Short Dow." Just
going by the content, you'd have no idea what it was saying. Is there a guy
named Dow somewhere who's short? But if you're a stockbroker, all of a sudden
there's a whole lot more info there. When you allow for context, information
density can approach infinity.

~~~
itcmcgrath
No, let me paraphrase what you just said:

'If you can store infinite information outside of a system, then you can
achieve infinite information density inside the system by using everything
outside the system as context.'

I hope that paraphrasing makes the problem with this line of thinking clear
enough. You have to include all the information when calculating absolute
density. In your case, the context has to be stored somewhere too.

~~~
vinceguidry
If all information has to be accounted for and stored somewhere, and context
is part of the information, then you can't store any information without
storing all information, everywhere. Because every bit of information exists
inside the context of the entire universe.

~~~
bradleyland
You're getting very metaphysical here, but reality remains the same even if
you expand these principles to the universe. The rules of physics still apply.

~~~
gibwell
I think he has a point, although it's more about the semantics of the term
'information density'.

Shannon information is always measured relative to a receiving context in
which the symbols are understood, and information content is related to the
inverse of the probability of observing a particular signal as assessed by the
receiver. So from that perspective vinceguidry has a reasonable point.

However, the question really being asked is something more like 'data
density', and that is generally what people mean when they invoke the term
'information density'.

Edit: I see that the original article does indeed refer to data density, and
that the HN title is just wrong.

------
dizzystar
Can anyone put this in terms like 100^100 petabytes or something? Or is the
number so large that it really isn't conceivable at this point?

I'll admit that I don't understand the problem completely, but isn't this
assuming the absolute maximum with little consideration of actual technology
limits? How much does it change when we consider the limits of technology and
our ability to store it on said technology?

~~~
sesqu
The number is far too large to have an SI name. Some people have put forward
suggestions for new prefixes, but they wouldn't help you understand the
number.

 _Edit:_ I suppose you could say 1 exayottayottabit, or 100 pebiyobiyobibytes,
per cc.

And yes, this relies on using qubits for storage, at greater densities than
are currently achievable.
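Those stacked-prefix spellings of ~10^66 bits per cc check out, assuming decimal SI prefixes for the first and binary IEC prefixes for the second:

```python
# Decimal SI prefixes (powers of 10) and binary IEC prefixes (powers of 2).
exa, yotta = 1e18, 1e24
pebi, yobi = 2.0**50, 2.0**80

exayottayotta_bits = 1 * exa * yotta * yotta       # 1 exayottayottabit
pebiyobiyobi_bits = 100 * pebi * yobi * yobi * 8   # 100 pebiyobiyobibytes

print(f"{exayottayotta_bits:.1e} bits, {pebiyobiyobi_bits:.1e} bits")
```

Both land within a factor of ~1.3 of 10^66 bits.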

------
galaktor
Might be nitpicking a bit here, but "information" != "data". It is hard to
tell how little or how much information I can extract from a given amount of
data. data + context = information

------
MildlySerious
So the theories here assume qubits are the smallest unit of storage. I'm not
into physics, so what's the evidence that they are really the smallest
possible units in which information can be stored? After all, atoms were said
to be the smallest thing at some point.

~~~
maaku
Sure, if you postulate new physics anything is possible.

