“Who can name the bigger number? Whoever has the deeper paradigm.”
In 2001, David Moews ran a BIGNUM BAKEOFF competition on the comp.lang.c newsgroup, where each entry had to be a C program (512 non-whitespace characters or fewer) returning a number from main(). The winner is the program that returns the largest number, assuming an arbitrarily-large int type.
After reading Scott Aaronson's wonderful essay, you won't be surprised to hear that the contest was won by a theoretical computer scientist, Ralph Loader. (Second place was taken by Heiner Marxen, who is mentioned in Scott's article in connection with the Busy Beaver problem.)
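To get a feel for the contest, here is a toy sketch of the kind of thing a naive (and nowhere near winning) entry could do, relying on the arbitrarily-large int the rules assume. This is my own illustration, not one of the actual submissions:

    /* Toy non-entry: return an Ackermann number from main().
       Assumes an arbitrarily-large int, per the contest rules;
       on a real machine both the recursion and the value explode. */
    int a(int m, int n)
    {
        return m ? a(m - 1, n ? a(m, n - 1) : 1) : n + 1;
    }
    int main(void)
    {
        return a(9, 9);
    }

Fitting under 512 non-whitespace characters is easy at this level; the hard part is that the serious entries, like Loader's, produce numbers that leave anything Ackermann-sized unimaginably far behind.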
A coworker suggests a Whitespace[1] interpreter which applies the recursion theorem and then interprets a Whitespace program embedded in the source code; since Whitespace programs consist entirely of whitespace characters, the embedded program doesn't count towards the size limit. We're not sure yet whether that can be done in 512 characters.
I find it hilarious that three people thought to return the highest possible number by underflowing unsigned ints (that are being treated as arbitrarily large ints). It's a funny hack even if it doesn't teach you anything like Loader's does.
I don't see how that is relevant when the problem assumes an arbitrarily large int type. It's trivial to return the largest possible value (INT_MAX) for any fixed-size int type, but no such maximum exists here.
Basically, many posters are missing the point that it's a mathematics challenge, not a C-challenge, which is excusable since it's posted in a C-group.
What I'd want to say here is "the largest number nameable using 1000 symbols in second-order logic" (not original, I think someone else won a contest that way). But the fact that someone can ask "Which formulation of second-order logic?" probably defeats this unless you can also, in those fifteen seconds, cite a standard formulation of second-order logic and a standard axiomatization of number in that formulation; otherwise it's an ambiguous specification. HOL is a standard and formal formulation of higher-order logic that I can think of in fifteen seconds, but I wouldn't be able to think of a standard formulation of Peano Arithmetic that was already in HOL.
Actually, now that I think of it, "the largest number nameable using 1000 symbols in HOL plus a successor symbol" might work, since you can define addition, multiplication, etc. in higher-order logic using only a successor symbol.
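For concreteness, the standard trick behind this (my own sketch, not from the thread): second-order quantification over functions lets you pin addition down by its recursion equations using only 0 and a successor symbol S,

    f(x, 0) = x        f(x, S(y)) = S(f(x, y))

so "x + y = z" can be read as "f(x, y) = z for every f satisfying those two equations". Multiplication and exponentiation bootstrap the same way from addition, which is why a successor symbol is all the arithmetic vocabulary you need.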
Interesting fact: This is a fundamental jump up from first-order logic, which is a fundamental jump up from Busy Beaver, and then once you're there, it's operating at the highest level of abstraction that I know about or have ever heard of. There are things you can do to make that number bigger - picking a different ordinal for the order of logic, recursing over how you get the number of symbols - but they all amount to saying "plus one", and not jumping to a qualitatively different level of abstraction. Once you abstract over second-order logic you are, so far as I know, done.
I don't know what qualifies as a "qualitatively different level of abstraction", but for all practical purposes, you're never done. Given a formal system for building numbers, you have a sequence B(n) of largest numbers created with n symbols in that formal system. You can always step outside that formal system and make a construction that leaves someone stuck in the formal system in the dust. For example, take B(B(n)). You could then formalize this process of stepping outside of formal systems, only to find yourself in a new formal system. The only limits on how fast you can "go meta" are limits of your mind.
You could always add one to the number, too. Turning B(n) into B(B(n)) is pretty much the same thing, it just takes B(n) and makes it the base function of ordinal recursions you already know how to construct. It doesn't jump outside the system by very much.
Going meta is really hard work. When people think they're "jumping out of the system" over and over again, they're usually actually playing in a pretty small sandbox. If you take someone who's naive about going meta and ask them to construct a large number, they often don't get anywhere close to Ackermann numbers. They think they're being brilliantly creative and recursing at the speed of light, while actually spending huge amounts of complexity to create much smaller numbers than a mathematician could describe in a few lines of code. Similarly, someone could play around all day with grand hierarchies of ordinal halting oracles, or "jump out of the system" for a billion years on end by applying the functions to themselves, and never get anywhere close to the size of the numbers that you could talk about by making the jump to first-order logic.
And after you make the jump to second-order logic, I don't know of any further jumps like that of comparable size.
My point is that if two mathematicians play this game, there is still lots of room to be clever. Even though both will know the idea of looking at numbers nameable in some formal logic, the more clever of the two will beat the one with the fastest handwriting (I'm assuming hours or days - not seconds - to compete).
(In case it happens to be you and not I who missed something: the said number is formally specified though uncomputable, just as much as saying BusyBeaver(100); it is not paradoxical like using English to talk about English; and it is also vastly larger than anything you can express compactly by talking about Busy Beavers and oracle machines, which was the top level of the original article.)
What surprises me is that Hacker News still lets these dupes get posted. I was under the impression that a URL once submitted cannot be resubmitted.
I've submitted duplicate articles before (unknowingly) and it simply acts as an upvote for an article that has already been submitted.
I was thinking perhaps it lets you re-post after a certain predefined amount of time, but between two of the postings, there were only 18 days.
If you add URL parameters, you can bypass the uniqueness requirements. Theoretically, you could submit the same URL an infinite number of times, with different combinations of parameters and never trigger the uniqueness check.
Surely not every day, but frequently enough, I read a submission and related comments on HN and say to myself, "Ahhh, this is why I continue to come to HN!".
Aaronson is talking about Graham's number when he writes:
“Recondite as it seems, the Ackermann sequence does have some applications. A problem in an area called Ramsey theory asks for the minimum dimension of a hypercube satisfying a certain property. The true dimension is thought to be 6, but the lowest dimension anyone’s been able to prove is so huge that it can only be expressed using the same ‘weird arithmetic’ that underlies the Ackermann sequence. Indeed, the Guinness Book of World Records once listed this dimension as the biggest number ever used in a mathematical proof.”
That's about a third of the way through the article. The numbers get a lot bigger after that.
As a footnote, the true dimension is no longer thought to be six — Aaronson’s essay was written in 1999 — and is now known to be at least 11. The Wikipedia article linked above has a reference for this improved lower bound.
Let g = Graham's number. Then use Knuth's up arrow notation. Then (and this is perhaps cheating) use a non-computable function like the busy beaver function, which grows faster than ANY computable function. That last step may in fact not be cheating, because it might be possible to prove a lower bound and therefore prove that it's higher than the opponent's number.
There's no end to that. You could define a function BZ(n) that equals BB(BB(BB...(g↑↑↑...g)...)) with n nested BB's and n up arrows. Then you could define a function RBB(n) that equals BZ(BZ(BZ...(g↑↑↑...g)...)) with n nested BZ's and n up arrows.
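A sketch in C of that nesting pattern (BB is uncomputable, so this plugs in a tame computable stand-in just to show the shape of the construction; my own illustration, nothing from the thread):

    #include <stdio.h>

    typedef unsigned long long u64;

    /* nest(f, n, x): apply f to x, n times, i.e. f(f(...f(x)...)).
       The BZ/RBB definitions above are exactly this pattern with BB
       (which no program can compute) plugged in as f. */
    static u64 nest(u64 (*f)(u64), int n, u64 x)
    {
        while (n-- > 0)
            x = f(x);
        return x;
    }

    /* Computable stand-in: a power tower 2^2^...^2 with n twos,
       saturating instead of overflowing once the tower gets tall. */
    static u64 tower(u64 n)
    {
        u64 r = 1;
        while (n-- > 0)
            r = (r < 63) ? (1ULL << r) : ~0ULL;
        return r;
    }

    int main(void)
    {
        printf("%llu\n", nest(tower, 2, 2));  /* tower(tower(2)) = 65536 */
        return 0;
    }

However scary the f you plug in, the nesting itself is still just plain iteration.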
A busy beaver is a Turing machine that runs for more steps than any other machine with the same number of states but eventually halts (rather than looping forever). The busy beaver function records, for each number of states, the maximum number of steps such a machine can take. Radó proved that this function eventually grows faster than any computable function.
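You can't compute the busy beaver function in general, but you can run any single candidate machine. A minimal sketch (my own) that simulates the known 2-state, 2-symbol champion; it halts after 6 steps with 4 ones on the tape, matching S(2) = 6 and Σ(2) = 4:

    #include <stdio.h>
    #include <string.h>

    /* The 2-state busy beaver champion (states A=0, B=1, halt = -1):
       A: on 0 write 1, move R, go to B;  on 1 write 1, move L, go to B
       B: on 0 write 1, move L, go to A;  on 1 write 1, move R, HALT  */
    static const int WRITE[2][2] = { {1, 1}, {1, 1} };
    static const int MOVE [2][2] = { {+1, -1}, {-1, +1} };
    static const int NEXT [2][2] = { {1, 1}, {0, -1} };

    int main(void)
    {
        int tape[64];
        int pos = 32, state = 0, steps = 0, ones = 0, i;
        memset(tape, 0, sizeof tape);

        while (state != -1) {
            int sym = tape[pos];
            tape[pos] = WRITE[state][sym];
            pos      += MOVE [state][sym];
            state     = NEXT [state][sym];
            steps++;
        }
        for (i = 0; i < 64; i++)
            ones += tape[i];
        printf("halted after %d steps with %d ones\n", steps, ones);
        return 0;
    }

The hard part, hopeless beyond a handful of states, is proving that no other machine of the same size does better; running one machine is the easy half.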
The busy beaver function doesn't run on a computer. We only know Σ(n) for n < 5. I think you are missing out on something extremely interesting—that there are limits on what computers can compute, even given unbounded resources and time.
One of the fundamental points of Kolmogorov's algorithmic information theory is that the function giving the largest number algorithmically representable in n bits of information is incomputable.
I have no idea how they compare. The former is incomprehensible and the latter is both incomprehensible and non-computable. My intuition tells me the latter is much larger, because a computer could "easily" calculate the former.
> Be precise enough for any reasonable modern mathematician to determine exactly what number you’ve named, by consulting only your card and, if necessary, the published literature.
but there isn't enough entropy left before cosmic heat death to evaluate 9^(9^9).
Just for fun, here's my shot: The number of steps required to compute the solution to the travelling salesman problem for each quark of every subset of every permutation of quarks in the universe in space-time.
Think of it this way. Imagine your scenario, like you just laid out. Now write a program to actually find the number in question, including all input (in this case, some representation of the number you gave, which itself can be encoded as something smaller than the raw base-2 representation). Now, convert that program to the most efficient possible representation in the Turing Machine encoding in question, which for these sorts of problems is often surprisingly small.
Your problem fits into a small handful of states, a few tens tops, and relatively small "input" too (as encoded by the states in some manner). Therefore BB(a few tens) is at least that large. In fact, that's an unbelievable underestimation of BB(a few tens).
While we humans are thinking we're all clever stacking exponentials on exponentials, the Busy Beavers are off doing things we can't even conceive. Arrow notation and hyper arrow notation and every other such bit of notation are all very, very small functions, and therefore well covered by BB(very small); what concept or concepts does BB(50) exploit? Certainly nothing that would fit into English very comfortably.
I think this helps a human to grasp BB() at least a little; the cleverest, biggest numbers we've ever managed to express without the BB notation are usually in the form of functions quite easy to express with Turing Machines of not that many states. BB grows fast.
I'm not sure if quarks have "positions" in the exact sense of the word, so I don't think there's a metric you could apply here to get distances for which to solve TSP.
Suppose there was a metric between quarks. To be clear, are you saying: take all quarks in the universe. Take their powerset P. For each set S in P, create the complete graph K_{|S|} where edge lengths are the pairwise distances. Let TSP(K_{|S|}) be the number of steps required to solve TSP on that graph. Sum TSP(K_{|S|}) over all S in P.
Let n be the number of quarks in the universe. The size of the powerset is 2^n. For each S in P, since TSP is solvable in exponential time, we'll upper bound the number of steps by 2^n (ignoring polynomial factors). In other words, for each subset, your cost for solving TSP is at most 2^n. Therefore, the number of steps is at most 2^n * 2^n = 2^(2n), which is far smaller than, say, Ackermann(6, 6).
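To put rough numbers on it (my own back-of-envelope): with n on the order of 10^80 quarks, 2^(2n) is roughly 10^(6 * 10^79), whereas Ackermann(4, 4) = 2^2^2^65536 - 3 already towers incomparably beyond that, never mind Ackermann(6, 6).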
"The traveling salesman problem asks for the shortest route connecting a set of cities, given the distances between each pair of cities. The rub is that the number of possible routes grows exponentially with the number of cities."
Is it just me, or does the traveling salesman problem have (cities-1)! possible routes? Thus it is not merely exponential, and is in fact far more annoying?
HA! 2^3!^4!^5!^6!^7!^8!^9!^10! ;-) (2! = 2) and (2^x) > (x^2) where x > 4.
Then again 2^3^4^5^6^7^8^9^(10!!!!!!!!!) is larger. Also, if it's simply a question of symbols, 9!! (meaning (9!)!) is larger than 9^9, so 9 followed by n !'s is probably your best bet without going into higher-level math like BB(BB(...(9))...).
Then the best I can come up with on no notice is 2 ↑^(4 ↑^(6 ↑^(8↑^(9↑10)) 7) 5) 3.
That would be trivial to increase substantially by using some other notation for the hyperoperation, but I'm going with the presumably most familiar one. I'm sure there are bigger numbers - specifically, Wikipedia mentions Conway chains, where a→b→c = a ↑^c b, and so 2→3→4→5→6→7→8→9→10 should be quite large indeed.
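For anyone unfamiliar with the notation, here's a toy C evaluator for a ↑^k b (my own sketch; only meaningful for tiny arguments, since anything nontrivial overflows immediately, which is rather the point of the notation):

    #include <stdio.h>

    /* arrow(a, k, b) computes a ↑^k b: k = 0 is multiplication,
       k = 1 is exponentiation, k = 2 is tetration (a^a^...^a with b
       copies of a), and so on.  Conway's a->b->c is a ↑^c b. */
    static unsigned long long arrow(unsigned long long a, int k,
                                    unsigned long long b)
    {
        if (k == 0) return a * b;
        if (b == 0) return 1;
        return arrow(a, k - 1, arrow(a, k, b - 1));
    }

    int main(void)
    {
        printf("%llu\n", arrow(2, 2, 4));  /* 2^2^2^2 = 65536 */
        printf("%llu\n", arrow(3, 2, 3));  /* 3^3^3 = 7625597484987 */
        return 0;
    }

Even arrow(3, 3, 3), i.e. 3↑↑↑3, is already a power tower of 3s more than seven trillion levels tall, far outside any fixed-width integer.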
Excellent! I hadn't thought about it, but at a glance I'm pretty sure your ordering and grouping does generate the maximal result. However, the parent comment said as much punctuation as you like, which allows for unbounded expressions with unary operators like the factorial!
It looks like many commentators did not bother to read the full article.
Neither did I, but I remember not to use "infinity" and "number of atoms in universe".
My daughter at bedtime says she's giving me infinity times infinity times infinity kisses. That's the biggest number, I don't care what any of you math whizzes have to say about the matter. ... OK, let's call it the best not the biggest.
Links for the BIGNUM BAKEOFF mentioned above. The original comp.lang.c thread:
http://groups.google.com/group/comp.lang.c/browse_thread/thr...
Ralph Loader's remarkable program is at http://homepages.ihug.co.nz/~suckfish/busy/reduced.c, and the documented and unmangled version is in http://homepages.ihug.co.nz/~suckfish/busy/busy.tar.gz
David Moews' discussion of the entries is at http://djm.cc/bignum-results.txt