Bremermann's limit (wikipedia.org)
57 points by monort on April 22, 2016 | 21 comments



This is an entertaining read:

Ultimate Limits to Computation http://arxiv.org/abs/quant-ph/9908043


It is always worth keeping Bremermann's limit in mind when anyone talks about general AI being impossible. We are so far from it (as humans) that to even have an opinion about what is possible seems silly.
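(For a rough sense of scale, here's a back-of-the-envelope sketch; the figure for present-day hardware is my own ballpark, not from the article. Bremermann's limit for one kilogram of matter is mc^2/h, about 1.36 x 10^50 bit-operations per second.)

    # Sketch: Bremermann's limit for 1 kg of matter vs. a present-day supercomputer.
    # The 1e17 ops/s figure for current hardware is an illustrative assumption.
    c = 2.998e8        # speed of light, m/s
    h = 6.626e-34      # Planck constant, J*s
    m = 1.0            # mass of the computer, kg
    bremermann = m * c**2 / h       # ~1.36e50 bit-operations per second
    supercomputer = 1e17            # ~100-petaFLOPS-class machine (assumed)
    print(bremermann)                    # ~1.4e50
    print(bremermann / supercomputer)    # ~1e33: dozens of orders of magnitude short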


I think having the opinion that AGI is possible is quite reasonable given the existence of humans. I happen to believe its advent is inevitable. I don't see how humans won't come to invent an AGI at some point in the future. Consciousness is a physical information process taking place in a physical universe. Any other attempt at explanation is woo woo.


Intelligence and Bremermann's limit are pretty unrelated. Most people have what they consider to be intelligence and self-awareness in a system that uses far fewer atoms than are contained in this universe. The worry with non-human intelligences, which might try to bootstrap themselves to take advantage of this universe's computational capacity, is what a person once apocryphally said: "The machine does not love or hate you, but you are made of atoms it can use for other things."


It's not apocryphal; you can find the original on page 27 of this: https://intelligence.org/files/AIPosNegFactor.pdf


Well sure, but there is a much more direct refutation of the claim that no physical system can achieve human level intelligence: humans exist.


Bremermann's limit is rooted in the discrete bits of reality that we understand. What is silly is to assume we understand reality.


This sort of thing fascinates me.

If the universe is a simulation, is this not to be understood as the processor clock speed?

(I've not read much of Nick Bostrom's work nor do I claim to know much about reasoning that the universe is in fact a computer simulation.)


Not really. If we are inside a simulation running on some CPU in the "real world" and they move the simulation to a CPU that's half as fast, from our point of view nothing changes. From the point of view of whoever is running the simulation, it now takes twice as long to simulate a day, but the people inside it cannot tell.


I don't think so; the passage of 'time' measured in the simulation could be quite a bit faster/slower than on the hardware underlying the simulation (I would assume quite a bit 'slower').

So if we were in a simulation, I'm skeptical that we could guess much about the properties of the 'host' system.


But we can determine the maximum speed of causality, can't we?

I was watching a (Bostrom, I think?) presentation where the vocabulary used to explain Einstein's "c" (the speed of light) included phrases like "maximum speed of causality". That blew my mind. Even though light moves as fast as it theoretically can (many Newtonians believed there was no upper bound to the speed of light), light still cannot breach the absolute limit of 299 792 458 m/s. That speed limit is embedded directly into the action-reaction architecture of our universe.

(Anyway, didn't mean to hijack)


That's a super interesting point actually. I'd never thought of the "maximum speed of causality". Thanks for sharing!

We could determine the maximum speed within our terms, but again that doesn't necessarily translate to how things happen in the underlying substrate.

Thought experiment: let's imagine we create our own simulated environment and put a sentient AI into it (implemented in the physics of that environment). The AI gets curious, does some experiments, and figures out the maximum speed at which causality can propagate in its environment. Suppose the speed of light in the simulation is as fast as the speed of light in our environment, so something that takes 1 second in the simulation takes 1 second in ours. Cool. Now, suppose you and I as the developers put a 'sleep' call into the event loop or whatever so that the simulation runs at 1/10 the speed of real time. Light in the simulation is now 10 times slower than it was by our clock time. The code operates considerably slower - but any entity in the code wouldn't perceive the slowdown, because it has slowed down too in the same way.

Same thing - any slowdown in the simulation would not be noticeable to us because it's also a slowdown in us.
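A toy sketch of the setup above (hypothetical code, just to make the point concrete): the host-side sleep stretches the wall-clock time per step, but anything inside the simulation only ever sees the step counter, so the slowdown is unobservable from within.

    import time

    SLOWDOWN = 10        # hypothetical knob the "developers" control
    sim_time = 0         # time as measured inside the simulation, in steps

    def step(world):
        # advance the simulated physics by one tick; the contents of `world`
        # can only ever reference sim_time, never the host's wall clock
        pass

    world = {}
    for _ in range(100):
        step(world)
        sim_time += 1
        time.sleep(0.001 * SLOWDOWN)   # host-side delay, invisible from inside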


I've always found the idea of the universe being a simulation rather silly. There are estimated to be 10^80 particles in the observable universe.

Assuming our universe is being simulated on an Earth-sized computer (this is mind-bogglingly ridiculous to me) and that the simulation runs at Bremermann's limit (also ridiculous, since we're 40 orders of magnitude away according to this http://arxiv.org/pdf/quant-ph/9908043v3.pdf), it would take on the order of a whole Earth day to simulate a single time step.
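Roughly the arithmetic behind that (a sketch; it assumes one bit-operation per particle per simulated time step, which is surely a huge underestimate of what a time step actually costs):

    c = 2.998e8          # speed of light, m/s
    h = 6.626e-34        # Planck constant, J*s
    m_earth = 5.97e24    # mass of the Earth, kg
    particles = 1e80     # particles in the observable universe

    ops_per_sec = m_earth * c**2 / h    # Bremermann's limit for an Earth-mass computer
    seconds_per_step = particles / ops_per_sec
    print(ops_per_sec)        # ~8e74 bit-operations per second
    print(seconds_per_step)   # ~1.2e5 s, i.e. on the order of an Earth day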

Of course, perhaps the host universe doesn't have the same limitations. Such speculation seems pointless. Maybe solipsism is true. Maybe the simulation runs on magical fairy dust.



Actually, the simulation is running on a quantum computer.

https://www.youtube.com/watch?v=dEaecUuEqfc


Does Bremermann's limit account for quantum computers?


No, it doesn't. A quantum computer would make very short work of a 512-bit key.


That is kind of a misleading statement. What quantum computers change is not how many operations per second you could do, but how many operations you need to factor large numbers.

I believe that Bremermann's limit should also apply to quantum computers, but you are right that the example given for breaking cryptography is way off if you allow quantum computers.
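To put rough numbers on it (a sketch; it assumes the 512-bit key is attacked by exhaustive search, as in the Wikipedia example, and grants the quantum computer only Grover's quadratic speedup rather than Shor's factoring):

    c, h, m_earth = 2.998e8, 6.626e-34, 5.97e24
    ops_per_sec = m_earth * c**2 / h    # Bremermann's limit, Earth-mass computer

    classical_ops = 2.0 ** 512          # brute-force key search
    grover_ops = 2.0 ** 256             # Grover needs only ~sqrt(N) queries

    year = 3.15e7                       # seconds per year
    print(classical_ops / ops_per_sec / year)   # ~5e71 years classically
    print(grover_ops / ops_per_sec)             # ~140 seconds with Grover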


We're veering off topic, but quantum computing algorithms extend beyond just Shor's algorithm.


It does, actually. The article and its references both state that quantum theory and relativity are the foundation for deriving the limit.


Sure, but the computation is classical, isn't it?



