What I'd want to say here is "the largest number nameable using 1000 symbols in second-order logic" (not original; I think someone else won a contest that way). But the fact that someone can ask "Which formulation of second-order logic?" probably defeats this unless you can also, in those fifteen seconds, cite a standard formulation of second-order logic and a standard axiomatization of number in that formulation; otherwise it's an ambiguous specification. HOL is a standard, formal formulation of higher-order logic that I can think of in fifteen seconds, but I wouldn't be able to think of a standard formulation of Peano Arithmetic that was already expressed in HOL.
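Spelled out (my own rendering, not anyone's canonical statement), the intended definition is roughly:

N := the largest n such that some formula φ(x) of the chosen language, with at most 1000 symbols and x as its only free variable, is true of n and of no other number

where "true" means true in the standard model of the naturals, judged in the metatheory. All the ambiguity lives in "the chosen language", which is why you'd need to cite a specific formulation to turn this into a real specification.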
Actually, now that I think of it, "the largest number nameable using 1000 symbols in HOL plus a successor symbol" might work, since you can define addition, multiplication, etc. in higher-order logic using only a successor symbol.
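For concreteness, here is the usual trick in my own notation, writing 0 for the unique non-successor (itself definable from S):

add(a, b, c) := ∀R [ ( R(a, 0) ∧ ∀u ∀v ( R(u, v) → R(S(u), S(v)) ) ) → R(c, b) ]

In the standard model the smallest relation R satisfying the hypothesis is exactly { (a+k, k) : k ∈ ℕ }, so add(a, b, c) holds just when c = a + b. Multiplication is defined the same way with a step clause that uses add, and from there you get the rest of arithmetic.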
Interesting fact: This is a fundamental jump up from first-order logic, which is a fundamental jump up from Busy Beaver, and then once you're there, it's operating at the highest level of abstraction that I know about or have ever heard of. There are things you can do to make that number bigger - picking a different ordinal for the order of logic, recursing over how you get the number of symbols - but they all amount to saying "plus one", and not jumping to a qualitatively different level of abstraction. Once you abstract over second-order logic you are, so far as I know, done.
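A one-line reason for each of those jumps, from standard facts (stated from memory, so check me). "BB(n) = m" is expressible by a single fixed formula of first-order arithmetic, and so is the Busy Beaver of a machine with any fixed finite tower of halting oracles; so a first-order namer with a 1000-symbol budget already swallows compact expressions built out of Busy Beavers and oracles. And truth for first-order arithmetic is expressible by a single fixed second-order formula, so the second-order namer can define "the largest number nameable in 1000 symbols of first-order arithmetic" outright and then go past it.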
I don't know what qualifies as a "qualitatively different level of abstraction", but for all practical purposes, you're never done. Given a formal system for building numbers, you have a sequence B(n), the largest number you can build with n symbols in that system. You can always step outside that formal system and make a construction that leaves someone stuck inside it in the dust - for example, B(B(n)). You could then formalize this process of stepping outside of formal systems, only to find yourself in a new formal system. The only limits on how fast you can "go meta" are the limits of your mind.
You could always add one to the number, too. Turning B(n) into B(B(n)) is pretty much the same thing: it just takes B(n) and makes it the base function of ordinal recursions you already know how to construct. It doesn't jump outside the system by very much.
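One way to make that concrete, in standard fast-growing-hierarchy style (my notation, assuming B is increasing and B(n) ≥ n, which any sensible naming scheme gives you):

f_0(n) = B(n)
f_{α+1}(n) = f_α(f_α(...f_α(n)...)), with n applications of f_α
f_λ(n) = f_{λ[n]}(n) for limit λ, given a fundamental sequence λ[n]

Then B(B(n)) is at most f_1(n) for n ≥ 2: it sits on the very first rung of a ladder you can climb through ε_0 and far beyond without ever leaving "recursions over B".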
Going meta is really hard work. When people think they're "jumping out of the system" over and over again, they're usually actually playing in a pretty small sandbox. If you take someone who's naive about going meta and ask them to construct a large number, they often don't get anywhere close to Ackermann numbers. They think they're being brilliantly creative and recursing at the speed of light, while actually spending huge amounts of complexity to create much smaller numbers than a mathematician could describe in a few lines of code. Similarly, someone could play around all day with grand hierarchies of ordinal halting oracles, or "jump out of the system" for a billion years on end by applying the functions to themselves, and never get anywhere close to the size of the numbers that you could talk about by making the jump to first-order logic.
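For reference, the benchmark they are missing fits in three lines, the usual Ackermann recurrence:

A(0, n) = n + 1
A(m + 1, 0) = A(m, 1)
A(m + 1, n + 1) = A(m, A(m + 1, n))

The diagonal A(n, n) already outgrows any fixed tower of exponentials, which is the point about complexity spent versus size gained.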
And after you make the jump to second-order logic, I don't know of any further jumps of comparable size.
My point is that if two mathematicians play this game, there is still lots of room to be clever. Even though both will know the idea of looking at numbers nameable in some formal logic, the cleverer of the two will beat the one with the fastest handwriting (I'm assuming hours or days - not seconds - to compete).
(In case it happens to be you and not I who missed something: the number in question is formally specified though uncomputable, just as BusyBeaver(100) is; it is not paradoxical like using English to talk about English; and it is vastly larger than anything you can express compactly by talking about Busy Beavers and oracle machines, which was the top level of the original article.)