
The Universe Is Not a Simulation, but We Can Now Simulate It - he0001
https://www.quantamagazine.org/coder-physicists-are-simulating-the-universe-to-unlock-its-secrets-20180612/
======
comboy
I'm so disappointed by Quanta lately. Clickbait and worsening quality.

> The Universe Is Not a Simulation

Based on what? The article author's opinion? That's fine, but then it should
be "In my opinion..." and "here's my reasoning".

~~~
nothis
Is the "Universe is a Simulation" more than a dreamy internet theory, though?
Like, are astronomers and physicists taking it seriously? I don't know either
way, I just assumed the headline was basically saying "no, this isn't some
sensational new post about how we're all in a simulation, this is hard data
leading to realistic results". Kinda defensive more so than click-bait-y.

~~~
jerf
Yes, it is a serious theory, taken seriously by serious scientists, and
seriously investigated by them.

It can be hard to tell through the popular science stories, where the author
is typically unable to relay anything about this research without adding "woo
woo" sounds and wiggling their fingers, but it's a real theory.

One thing that may be worth remembering is that in the real theory, the
"simulation" is not necessarily running on "a supercomputer just like what we
have, only way bigger", which is actually a silly idea for various reasons,
and we also do not know that the simulation is literally simulating every
particle and every cubic inch of space in the most accurate possible way
(i.e., for all we know, far away galaxies could just be a few megabytes in
size, rather than fully simulated to the nth degree). The "Great Simulator" is
not required to run anything remotely resembling our physics. What seems to us
an absurdly large computation may be smaller than it appears, and may not be
absurdly large at all to a universe running on fundamentally different
physics.

(Occasionally when I'm bored, I noodle around with a theory of cellular
automata in which the universe has infinite computation power by virtue of
being able to split a cell into halves or quarters, which then run at double
speed relative to the outside. If this is done recursively, the universe has
arbitrary computation ability relative to the "top level". I've never worked
it into anything usable, but I think it gives a flavor of why we should not
be sure that our simulation host is necessarily as staggered by the
computation as we would be. We are not even capable of saying that they aren't
running an exponentially-expensive quantum approximation algorithm with the
entire universe actually represented as arbitrary-precision integers under the
hood. The staggering amount of computation that represents may not bother them
at all.)
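
A minimal sketch of that flavor (a toy only, nothing like worked-out physics;
the update rule is just a counter standing in for a real automaton): a split
cell runs its children at double speed, so a cell at depth d performs 2^d
local updates per top-level tick.

    class Cell:
        def __init__(self, depth=0):
            self.depth = depth
            self.state = 0        # stand-in for a real automaton state
            self.children = None  # leaf until split

        def split(self):
            # Subdivide into quarters that run at double speed.
            self.children = [Cell(self.depth + 1) for _ in range(4)]

        def tick(self):
            # One tick as seen from this cell's parent.
            if self.children is None:
                self.state += 1   # stand-in for a real update rule
            else:
                for child in self.children:
                    child.tick()
                    child.tick()  # twice per parent tick: double speed

    root = Cell()
    root.split()
    root.children[0].split()       # one quarter subdivides again
    for _ in range(10):            # ten top-level ticks
        root.tick()
    print(root.children[1].state)              # 20: depth 1, 2^1 per tick
    print(root.children[0].children[0].state)  # 40: depth 2, 2^2 per tick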

It is not a theory that is, strictly speaking, falsifiable, thanks to Church-
Turing equivalence, but there are ways in which it could potentially (to any
arbitrary degree of confidence you may want) be confirmed if it is true. To
take one degenerate case, if the Great Simulator broke silence and started
wildly breaking the laws of physics to get our attention, that would certainly
put a lot of probability mass on the "we're a simulation" theory. There are
less extreme cases we could conceivably pick up on; that one just fits into an
HN comment nicely.

~~~
olavk
> It is not a theory that is, strictly speaking, falsifiable,

So in what ways is it a "serious theory"?

~~~
jerf
Well, along with the religion case that whatshisface points out, there's also
the fact that _any_ theory of the nature of the universe has this problem. The
problem here is not with the theory per se, but with the fundamental limits of
science itself.

Despite what seems to be a popular opinion, we are not actually obligated to
curl up into a ball on the floor and cry about our inability to know anything
just because science can't be definitive on some point. We just need to be
aware of the limits of our knowledge. We have no choice; even in just our day-
to-day lives, we are required to make all kinds of decisions for which science
is either quiet, insufficiently informative, or (and this one may _really_
hurt to think about too much) simply _wrong_.

~~~
olavk
> Despite what seems to be a popular opinion, we are not actually obligated to
> curl up into a ball on the floor and cry about our inability to know
> anything

I have no idea what you are referring to here. Can you be a bit more specific
about what viewpoint or philosophy you are satirizing?

------
rbanffy
Actually, when you found out it was a simulation, we had to scrap the current
data, restore to the last valid snapshot, and adjust local simulation
parameters so that you wouldn't find out again. It's all reported in the
incident post-mortem. Service levels are now fully restored.

~~~
stabbles
There is another theory which states that this has already happened

~~~
rbanffy
Yes. We have some guard code in place for when this happens; it's getting
really good at catching it before or right after it happens and doing all the
rollbacks automatically, but it can't catch every possible situation.

It's now down to one minor outage every couple of years. We're aiming for five
nines in the next major release.

------
decebalus1
Skimmed through the article and couldn't find any mention of the first part of
the title, which was my initial reason for reading it, as it's a subject of
great interest to me. I wasn't expecting this from Quanta. Very disappointed.

~~~
psetq
It's still an extremely interesting article, imo.

I sometimes read HN because the discussions here can add to articles on topics
I find interesting but have limited knowledge of, but the 3+ top comment
threads (so far) discussing the title rather than the content of this article
are somewhat disappointing.

~~~
decebalus1
I'm sorry you feel disappointed. But do you also feel like you've been
bamboozled? Because that's how I feel after reading the whole article thinking
that somebody had managed to refute the 'simulation hypothesis'. Interesting
article, of course, but I would not have read it at all if the title were
accurate. So I feel bamboozled.

------
gattr
In Charles Stross's "Accelerando" (strongly recommended!), astronomers suspect
that the inhabitants of the Andromeda Galaxy have converted all its baryonic
matter to computronium (here: a configuration of matter capable of the most
efficient computation permitted by the laws of physics) and are using it to
run a massive side-channel attack, trying to verify whether the Universe runs
on a virtual machine (and, I guess, maybe break out of it).

On a different note, here's a sobering thought for the Universe-as-simulation
optimists: while it's possible the simulation's Creators wish us well, e.g.
will store your soul/mind and let you live carefree in a Paradise... it might
rather work like our current massive simulations (e.g. the Millennium Run
[0]). Such a run generates enormous amounts of data, but only a small fraction
(like a state snapshot every 100M years) is actually captured and analyzed.
I.e. the "dumb" computing power at the Creators' disposal is far greater than
their mental capacity. The whole existence of humanity might go unnoticed,
just a side-effect of the simulation's fidelity.

[0]
[https://en.wikipedia.org/wiki/Millennium_Run](https://en.wikipedia.org/wiki/Millennium_Run)

------
ur-whale
Is it just me, or does all that talk of "codes" combined with all-uppercase
names exude a potent whiff of Fortran :D ?

And if so, I'm wondering if any of that stuff runs on GPUs, or if we're still
talking about "traditional" supercomputers?

Anyone knowledgeable in the field care to share?

~~~
freshhawk
Not knowledgeable in the field really, but I have a bunch of old school
friends who do this for a living.

Your ability to detect Fortran is apparently excellent.

------
gojomo
There is nothing in the article to refute the conjecture that our universe is
a simulation – and in fact the rapid progress it reports in simulating
simplified universes could be seen as further support for the idea that
universes like ours may be simulated.

------
NVRM
Being able to chroot doesn't mean that you are not in a virtualized container.
Actually, this tends to suggest the opposite...
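
To stretch the analogy: the usual in-guest heuristics look something like this
sketch (these markers are common conventions, not guarantees; a careful host
hides all of them, which is rather the point):

    import os

    def looks_containerized():
        # Docker drops a marker file into the guest filesystem.
        if os.path.exists("/.dockerenv"):
            return True
        # Container runtimes often leave traces in PID 1's cgroup.
        try:
            with open("/proc/1/cgroup") as f:
                return any(m in f.read()
                           for m in ("docker", "lxc", "kubepods"))
        except OSError:
            return False  # no /proc (or not Linux): can't tell

    print(looks_containerized())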

------
____a
Is there a good book that covers primarily or at least secondarily the ideas
behind the universe being a simulation? I'd like to learn more.

~~~
psetq
There are several links in the Wikipedia article which can make for good
jumping-off points:

[https://en.wikipedia.org/wiki/Simulation_hypothesis](https://en.wikipedia.org/wiki/Simulation_hypothesis)

------
rococode
Could someone elaborate on how accurate the results of these simulations are
generally considered to be by the broader scientific community? It seems to me
that the universe is so insanely complex that any simulation we can currently
build would be fairly inaccurate because of our lack of knowledge. Couldn't
that create the problem of building a whole body of theory on unproven
guesses? I guess the researchers all know that, but I'm just curious what the
actual attitude towards this kind of work is.

------
bpicolo
> The Universe Is Not a Simulation

Proof left as an exercise for the reader?

------
harshalizee
Is any of the code for these simulations, or similar, publicly available?
Would love to be able to take a look.

~~~
psetq
Googling the names given in the article turns up some interesting links:

[http://gasoline-code.com/](http://gasoline-code.com/)

[http://bluetides-project.org/](http://bluetides-project.org/)

------
stillbourne
Not down to the Planck level. Simulating the universe at the Planck level
would require at least 10^157 bytes of storage.
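
For scale, a crude back-of-envelope (a sketch only; the count depends entirely
on what you assume is stored per unit of space, so 10^157 presumably rests on
different assumptions than these):

    import math

    l_p = 1.616e-35   # Planck length, m
    r   = 4.4e26      # comoving radius of the observable universe, m

    # One bit per Planck volume of the observable universe:
    volume        = (4 / 3) * math.pi * r**3   # ~3.6e80 m^3
    planck_volume = l_p**3                     # ~4.2e-105 m^3
    print(f"volumetric: {volume / planck_volume / 8:.1e} bytes")   # ~1e184

    # The holographic bound instead counts one bit per Planck area
    # on the boundary surface, giving a far smaller number:
    area        = 4 * math.pi * r**2           # ~2.4e54 m^2
    planck_area = l_p**2                       # ~2.6e-70 m^2
    print(f"holographic: {area / planck_area / 8:.1e} bytes")      # ~1e123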

~~~
gattr
Right, but adaptive refinement is the bread and butter of today's physics
simulations. I imagine that in order to simulate our Universe, you'd go down
to the elementary-particle level only when some of the simulated folks run
their accelerators, etc. But for an in-simulation billiards game you'd use
plain impulse-based rigid-body dynamics.
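
To make the economics concrete, here's a toy sketch (not how real cosmology
codes are structured; the region counts and relative costs are invented):
regions get expensive fine-grained updates only while "observed", and we just
count the work saved.

    import random

    N_REGIONS = 1000
    FINE_COST, COARSE_COST = 10_000, 1   # relative per-tick update costs

    def tick_cost(observed):
        # Fine-grained physics only where someone is looking.
        return sum(FINE_COST if r in observed else COARSE_COST
                   for r in range(N_REGIONS))

    total = 0
    for _ in range(100):                          # 100 ticks
        observed = {random.randrange(N_REGIONS)}  # one region under scrutiny
        total += tick_cost(observed)

    always_fine = 100 * N_REGIONS * FINE_COST
    print(f"adaptive: {total:,} units vs always-fine: {always_fine:,}")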

~~~
stillbourne
The problem isn't particles, it's particle interactions and the orders of
magnitude in time at which these events occur. There is a reason why the
planck length is the shortest theoretical measurable time interval. I'm sure
you are aware of the equation E=mc^2, at the planck length the value of c=1.
Therefore a planck length, cubed, of the universe can be considered to be 1
bit, for the purposes of storage for a universe, or at least this one.

~~~
raattgift
I think you should at least check out the relevant wikipedia page here

[https://en.wikipedia.org/wiki/Planck_length](https://en.wikipedia.org/wiki/Planck_length)

to see how strongly it conflicts with what you have written about it.

Among other things catalogued there, the Planck length has dimension of
[Length], rather than [Time], and per Bekenstein [1973]
(doi:10.1103/PhysRevD.7.2333), "1 bit"[0] relates to the minimal increase in
the area of the event horizon of a hairless, monotonically growing,
stationary, spherically symmetric (or, with some further assumptions,
axisymmetric) black hole into which matter is being thrown. This remains
contentious because black hole mass is continuous, not discrete (one can throw
in photons of almost arbitrary wavelength, for instance).

By contrast with what you wrote, if you look at it (e.g. via sci-hub),
Bekenstein's PRD paper sure doesn't assert that a horizon area on the order of
the Planck area would contain "one bit", especially as he was aware of the
content of the about-to-be-published Nature paper, Hawking [1974]
(doi:10.1038/248030a0), which details the "explosion" of black holes with
small horizon areas (and was additionally the first Hawking radiation paper).

Finally, if you don't like Wikipedia and sci-hub, your favourite search engine
will surely supply numerous discussions among working physicists (including
peer-reviewed publications) of the Planck length and whether it is physically
significant in any context other than Bekenstein's or close relatives (e.g.
Loop Quantum Gravity requires that all surface areas are quantized, although
not to integer multiples of the fundamental quantum, which in turn is roughly
on the order of the Planck length squared).

--

[0] A modern semiclassical statement of this is: to leading order, the entropy
of a black hole is proportional to its event horizon area at one nat per four
Planck areas.
[https://en.wikipedia.org/wiki/Nat_(unit)](https://en.wikipedia.org/wiki/Nat_\(unit\))
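
In symbols, the standard semiclassical statement is

    S_{\mathrm{BH}} = \frac{k_B A}{4\,\ell_p^2},
    \qquad
    \ell_p^2 = \frac{\hbar G}{c^3}

i.e. the entropy, counted in nats, is the horizon area divided by four Planck
areas.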

~~~
stillbourne
[https://arxiv.org/pdf/hep-th/9409089.pdf](https://arxiv.org/pdf/hep-th/9409089.pdf)
(DOI:10.1063/1.531249) cited 3221 times:

Let us begin with a maximally dense object, a black hole. It will be assumed
that the entropy is found on the horizon and that no more than one bit per
Planck area can be stored there.

The only thing I stated incorrectly was planck cubed instead of square.

[https://journals.aps.org/rmp/abstract/10.1103/RevModPhys.74.825](https://journals.aps.org/rmp/abstract/10.1103/RevModPhys.74.825)

The metaphorical name of the principle (’t Hooft, 1993) originates here. In
many situations, the covariant entropy bound dictates that all physics in a
region of space is described by data that fit on its boundary surface, at one
bit per Planck area.

[https://lib.ugent.be/fulltxt/RUG01/001/787/483/RUG01-001787483_2012_0001_AC.pdf](https://lib.ugent.be/fulltxt/RUG01/001/787/483/RUG01-001787483_2012_0001_AC.pdf)

3.5 Holographic screens

An important implication of the holographic principle is that all the
information in a given region can be encoded on a surface B, at a density of
one bit per Planck area. We can now ask ourselves if the information contained
in an entire spacetime can be encoded on a certain hypersurface, which we will
call a screen.

What's that you were saying? I can't hear you over the sound of how right I
am.

And why is a planck area one bit?

Because at the planck length, h=G=c=k=1, where h is the Planck constant, G is
Newton's constant, c is the speed of light, and k is Boltzmann's constant. Or,
if you like some Einstein: E=mc^2 reduces to E=m.
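
For reference, a quick numerical sketch of the Planck scales those
normalizations define (constants are rounded CODATA values):

    import math

    hbar = 1.054571817e-34  # J*s, reduced Planck constant
    G    = 6.67430e-11      # m^3 kg^-1 s^-2, Newton's constant
    c    = 2.99792458e8     # m/s, speed of light

    l_p = math.sqrt(hbar * G / c**3)  # Planck length, ~1.616e-35 m
    t_p = l_p / c                     # Planck time,   ~5.39e-44 s
    m_p = math.sqrt(hbar * c / G)     # Planck mass,   ~2.18e-8 kg

    print(f"Planck length: {l_p:.3e} m")
    print(f"Planck time:   {t_p:.3e} s")
    print(f"Planck mass:   {m_p:.3e} kg")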

~~~
raattgift
I'm not sure what you're trying to convince anyone of, other than that you
have read at least some of several documents dealing with gravitational
physics. That's promising, so I'll react. I'm not a string theory fan, but
don't let that discourage you from digging deeper into the field.

> "no more than one bit per Planck area can be stored there"

Sure, that's a postulate to help clarify Susskind's string theory argument
about the information content of his 2d holographic screen at infinity from
the black hole, with all of the above in a 3d spacelike hypersurface at
t=const. Being a postulate, the sentence fragment you quote is not proven in
the paper; it's just assumed.

Changing the number of states at the horizon in that spacelike hypersurface to
an arbitrary finite number does not really frustrate his gravitational
argument (but see below about his matter GUT argument), while arguments about
the upper and lower limit of states at the horizon are available in Bekenstein
[1973] op. cit. and many subsequent papers.

It is perfectly normal for gravitational arguments to set c = G = 1, and
possibly normalize some other terms to unity too; the choice of what to set to
unity depends on the trade-off between ease of writing down formulae and the
difficulty of checking their dimensionality. The reason Susskind uses Planck
units, and relevantly to our discussion the Planck length, is that it's
convenient in analogies as he develops his argument further down the paper
using units where the string length is set to unity.

(The majority of the Susskind paper is heavy (pardon the pun relating to
scaling of interaction with momentum) with non-gravitational string physics,
and I have no expertise on that, but his gravitational arguments are low
energy ones, and there I am comfortable).

> The only thing I stated incorrectly was planck cubed instead of square.

You wrote, "shortest theoretical time interval". The word "time" is incorrect
since it is a unit of length (this matters in a Lorentzian spacetime), but
changing it to "spatial" does not let you claim that the Susskind paper
supports your statement at all, for the reasons above.

You can certainly make arguments about states on a stretched horizon (Susskind
does in the paper you found) -- complementarity is wildly popular with string
theorists. However, their entropy:information argument, as also repeated in
the Bousso paper and the master's thesis you found, does not set a minimum
length scale; rather, it fixes a limit on the number of microstates you can
squash into an area before the macrostate resembles a hairless black hole, the
idea being that any sparser region's macrostate grows gently, and moreover
that you can with some care use the macrostate to describe all the internal
microstates (which is the core content of SUGRA theories, essentially, where
the care one takes is in the choice of a conformal field theory to represent
the evolution of the macrostate).

> "a planck area [is] one bit [b]ecause at the planck length, h=G=c=k=1"

No. More on that in the paragraph after the following one.

You also wrote "the equation E=mc^2, at the planck length the value of c=1",
which is a bit confused. All Planck units set five constants, c = G = hbar =
k_e = k_B = 1. Any quantity measured in Planck units will have c = 1 (and also
Boltzmann's constant = 1 and the Coulomb constant = 1, etc.). The partial
dispersion relation you provide has nothing to do with it, other than that you
can solve it for an appropriate system in Planck units just as in S.I. or cgs
units, and you can simplify by dropping the term normalized to unity, just as
your version has already simplified from the fuller special relativistic
relation by normalizing momentum to zero.

In the stringy arguments, the reason there is supposedly four nats or one bit
or a small finite number of microphysical degrees of freedom or whatever per
unit area on the horizon of a black hole is complementarity, described in
another Susskind paper:
[https://arxiv.org/abs/hep-th/9306069](https://arxiv.org/abs/hep-th/9306069)
(amusingly it should be "coarse graining" in the Abstract, of course, as it is
on e.g. p. 3 just above Postulate 3 and several more times throughout the
paper; the published PRD version has the same error in the Abstract).

I do not find the BH complementarity model especially convincing as it was
contrived to save information from being lost in traditional black holes but
does not appear to do the job, appearing to require replacement of black holes
with fuzzballs or some other complementarity-preserving resolution of the AMPS
problem. Their cosmological model is even shakier, since it is mostly tested
on a non-infinite (but large) distance boundary on anti-de Sitter space, with
some arguments about how a series of slices of AdS space can resemble a series
of slices of a space much more like our expanding universe. The master's
thesis you found is interesting in that it tries to tackle the physics in
(among others) de Sitter space, which is a fair approximation of our universe
at late epochs. YMMV.

~~~
stillbourne
[https://en.m.wikipedia.org/wiki/Planck_time](https://en.m.wikipedia.org/wiki/Planck_time)

[https://en.m.wikipedia.org/wiki/Orders_of_magnitude_(time)](https://en.m.wikipedia.org/wiki/Orders_of_magnitude_\(time\))

~~~
raattgift
Even though your two-link reply seems like an obvious "I don't want to talk
any more", I'll leave you with three things.

Firstly, there is nothing special about natural units compared to any other
system of units, except that some popular formulae can take on especially
simple forms in that system of units, provided one takes care not to lose
dimensionality:

[https://en.wikipedia.org/wiki/Dimensional_analysis](https://en.wikipedia.org/wiki/Dimensional_analysis)

You (still) are confusing two different dimensions, length and time. Again,
this matters in a spacetime like ours, where the difference in dimensionality
gives us a system of causality[1] and a reasonable setting in which to do
time-series physics[2]. It also matters when switching from natural units to a
different system of units in which c != 1, such that we cannot omit the
conversion constant (or change of sign when calculating spacetime intervals
[3]).

Changing a physical system like a black hole from one system of units to
another does not change the physics of the system, just how we describe it.
It is enlightening to do so in general, because it is easy to fall into the
trap of treating as physical a condition that vanishes upon switching to a
different set of units. A quantity that is a ratio of two dimensionful
quantities (area and information) surviving across changes of units on one or
both dimensions may be interesting. In the case of black hole complementarity,
which you found links for earlier, it's a (near-horizon) density that
(according to some string theorists) corresponds directly with an (interior)
density (states / volume), taking into account the Ricci tensor's effect on
interior volume but removing the physical singularity through string
interactions in a "fuzzball" or something similar[4].

Secondly, your second link says of Planck time, "Presumed to be the shortest
theoretically measurable time interval (but not necessarily the shortest
increment of time - see quantum gravity)", which does not really support your
position. Rather than reinvent the wheel for you in qualifying that, which I
think would be wasteful given your previous few replies, I'll direct you to
physics SE and in particular to Lubos Motl's 85-point comment at:
[https://physics.stackexchange.com/questions/9720/does-the-planck-scale-imply-that-spacetime-is-discrete](https://physics.stackexchange.com/questions/9720/does-the-planck-scale-imply-that-spacetime-is-discrete)

Finally, I don't know what you're trying to accomplish, although it seems
you're set on convincing yourself, and perhaps some people who know even less
physics than you do, that you know what you're talking about. Hopefully it's
more optimistic than that, and instead you're trying to learn more than you
already know and are just going about it inefficiently.

--

[1]
[https://en.wikipedia.org/wiki/Causal_structure](https://en.wikipedia.org/wiki/Causal_structure)

[2] [https://arxiv.org/abs/1505.01403](https://arxiv.org/abs/1505.01403)
(section 4) - sadly Wikipedia has only scattershot coverage of it in e.g. the
short and/or not very accessible pages on the ADM and BSSN formalisms, and on
Canonical Quantum Gravity (which is a quantization of the Hamiltonian
formulation that is "only" an effective field theory good to one loop, and
provokes the question of "the problem of time"). Unfortunately most
introductory material about 3+1 formalisms in general is already highly
technical (e.g., textbooks aimed at graduate students).

[3]
[https://en.wikipedia.org/wiki/Metric_signature](https://en.wikipedia.org/wiki/Metric_signature)

[4]
[https://www.wikiwand.com/en/Fuzzball_(string_theory)](https://www.wikiwand.com/en/Fuzzball_\(string_theory\))

------
krrishd
this is highly naive of me to ask, but "in theory" if we could simulate the
universe, would familiar stuff like living organisms/'societies'/'humans'
potentially just emerge within such simulations?

just super cool to think about even if not the case/unrealistic.

~~~
AnIdiotOnTheNet
Depends on how you define living, really. It's a philosophical problem, what
makes one collection of matter "alive" and another "not alive"?

Since at least Thomas S. Ray's work on Tierra in the 90s we've been creating
universes in which things that could be called "life" have developed.

~~~
rotexo
My understanding of Tierra and its successors like Avida is that the "digital
organisms" are written beforehand and then systematically (pseudorandomly)
mutated. Because the initial "organisms" are written, it would be a stretch to
say that "life" _developed_ in these systems. A statement like "the process of
evolution by natural selection was simulated in an artificial system" would
seem to be more accurate.

~~~
AnIdiotOnTheNet
Yes, this is true; however, I think the distinction is irrelevant. The initial
VM was created with the intention of encouraging this evolution; had the
original intent included spontaneous emergence of a-life as a goal, I'm pretty
confident that would have happened.

~~~
rotexo
Seems like that's an assertion begging for an experiment. Take the software
that manages experiments in Avida or Tierra, but don't supply pre-written
organisms. Bit-flip randomly and see if something develops. Unfortunately I
don't know the implementation details of those systems at all, so I can't say
whether or not that would be feasible.
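
For the shape of that experiment, a toy sketch (nothing like Avida's or
Tierra's actual instruction sets; the alphabet, the "copy loop" motif, and all
parameters here are invented): seed a soup with random programs, mutate at
random, and watch whether self-replicators take over.

    import random

    INSTRUCTIONS = "abcdefgh"   # hypothetical 8-instruction alphabet
    COPY_LOOP = "abc"           # hypothetical minimal self-copy motif
    SOUP_SIZE = 200
    GENOME_LEN = 8
    MUTATION_RATE = 0.02

    def random_genome():
        return "".join(random.choice(INSTRUCTIONS)
                       for _ in range(GENOME_LEN))

    def mutate(genome):
        return "".join(random.choice(INSTRUCTIONS)
                       if random.random() < MUTATION_RATE else g
                       for g in genome)

    def replicates(genome):
        # Stand-in for actually running the program: it self-copies
        # iff it contains the copy-loop motif.
        return COPY_LOOP in genome

    soup = [random_genome() for _ in range(SOUP_SIZE)]
    for step in range(100_000):
        i = random.randrange(SOUP_SIZE)
        if replicates(soup[i]):
            # A replicator overwrites a random slot (with mutation).
            soup[random.randrange(SOUP_SIZE)] = mutate(soup[i])
        else:
            soup[i] = mutate(soup[i])
        if step % 20_000 == 0:
            frac = sum(map(replicates, soup)) / SOUP_SIZE
            print(f"step {step}: {frac:.0%} replicators")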

