Bellard's 2009 record was much more impressive, because he used a clever formula to break the existing record with a mere (albeit beefy) desktop computer: https://bellard.org/pi/pi2700e9/
The record he broke with his desktop PC was made using a supercomputer cluster.
For anyone not familiar with him, he is also the original author of QEMU, FFmpeg, and the Tiny C Compiler.
Have there been other improvements in single-core clock speeds, multi-core performance, or any other CPU hardware components to make running this noticeably faster on high-end consumer hardware?
They basically pulled in more machines to compute more digits. Nothing really impressive; pretty much any dev with that computing power could have done it.
Keeping a single server online for 111 days straight at full CPU and RAM usage across 96 cores and 1.4TB of RAM is a good start. The fact that such a machine exists is already mind-blowing. Then add 25 more nodes running iSCSI, all going full tilt 24/7 for 111 days. Hell, just mounting 240TB on a single system is a good stunt; go ahead and try it and let me know it's not "complex".
And your last point kind of IS the point of their marketing: any dev could do it if they have the skill, and they'll rent you the hardware.
Compare this with the Chudnovsky brothers, who built their pi calculating "supercomputer" from commodity parts in their NY apartment, back in 1992: https://www.newyorker.com/magazine/1992/03/02/the-mountains-...
(and being mathematicians, they also discovered novel formulas for speeding up their calculation)
There are three kinds of mathematicians, those that can count and those that can not.
This paper claims you need 2/pi accurate to 1144 bits, which is about 345 decimal digits:
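A quick sanity check on that bits-to-digits conversion (my own arithmetic, not from the paper):

    import math
    print(1144 * math.log10(2))  # ~344.4, i.e. about 345 decimal digits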
It is still not clear whether Pi is a normal number or not. Such calculations could in principle give some insight here.
From a cryptography standpoint, there is always a need to find bigger prime numbers. I wouldn't compare this with Pi.
The large prime numbers needed for cryptography are a few hundred digits long. They are generated by picking a random number and checking its neighbors for primality.
The largest prime numbers that have been discovered have millions of digits. Finding a larger prime would have no effect whatsoever on our ability to quickly generate primes with a few hundred digits.
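As an illustration of that generate-and-test process, here is a minimal sketch (mine, not taken from any particular crypto library): pick a random odd number of the right size, then walk forward until a probable-prime test passes.

    import random

    def is_probable_prime(n, rounds=40):
        # Miller-Rabin probabilistic primality test.
        if n < 2:
            return False
        for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
            if n % p == 0:
                return n == p
        d, s = n - 1, 0
        while d % 2 == 0:
            d //= 2
            s += 1
        for _ in range(rounds):
            a = random.randrange(2, n - 1)
            x = pow(a, d, n)
            if x in (1, n - 1):
                continue
            for _ in range(s - 1):
                x = pow(x, 2, n)
                if x == n - 1:
                    break
            else:
                return False
        return True

    def random_prime(bits=1024):
        # Random odd number with the top bit set, then step to a nearby prime.
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        while not is_probable_prime(n):
            n += 2
        return n

    print(random_prime(512))

Because primes of that size are dense enough (roughly one in every few hundred odd numbers), this loop terminates quickly in practice.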
Contradicts the Google blog a little, especially where he points out that they hit performance issues with live migration (Google said it worked fine, without impact on the application).
"The numbers and rests in the formula translate to 16th notes on the kick drum, and 16th note rests. There is no kick drum beats where there are snare drums.
With the decimal point BEFORE the number, and starting with the first number, move that many decimal points to the right and insert that many 16th note rests. Use one 16th note rest to divide the numbers you passed (when applicable). Continue on throughout the rest of the figure. No repeats."
The description of the video has the full explanation.
I have the "dots" one. https://fineartamerica.com/featured/768-digits-of-pi-up-to-f...
Or is my understanding of what constitutes an irrational number outdated? Is there another definition that precludes even looking for repetition in hopes of finding a denominator?
But if you take a pattern of digits of any length, it would repeat an infinite number of times (assuming pi is normal, which is widely believed but unproven).
Let's take a one digit pattern, say '5'. Since the digits of pi continue forever, there would be an infinite number of '5's.
Now consider a longer pattern, '53'. Since the digits of pi continue forever, there would be an infinite number of '53's. In fact, each '53' starts at one of the infinitely many '5's from the previous step.
Now consider a longer pattern '537' . . .
. . . to continue . . .
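To make that progression concrete, here's a toy search over the first 100,000 digits (my own sketch, using Python's mpmath library; the "infinitely many occurrences" part still rests on the normality assumption above):

    from mpmath import mp

    mp.dps = 100_000              # compute 100,000 decimal digits of pi
    digits = str(mp.pi)[2:]       # drop the leading "3."

    for pattern in ('5', '53', '537', '5370'):
        pos = digits.find(pattern)
        if pos >= 0:
            print(f"'{pattern}' first appears at decimal place {pos + 1}")
        else:
            print(f"'{pattern}' not found in the first {mp.dps} digits")

Each longer pattern tends to show up roughly ten times deeper in than the previous one, which is exactly the '5' -> '53' -> '537' intuition.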
It was a long time ago that I read Contact (the book), so I hope I don't misremember this too badly. At the end of the book the main character was given a budget, a lab, resources, etc. They were working on looking for a message in the digits of Pi. Eventually her beeper beeped: they had found one! It must be woven into the fabric of the universe.
I think you are eventually going to find any sequence of digits carrying any kind of message in Pi. Just like, if you look long enough, you'll find a 5. If you keep looking, you'll eventually find a 53. Keep looking and you'll eventually find a 537. Etc.
[On the other hand, almost none of the numbers someone on the street might name are normal, so ...]
DM -> 3392/5550 ~= 61.1%
MD -> 2158/5550 ~= 38.9%
Note: I ignored both green and red regions that have both "DM" and "MD" in their format.
So it is definitely not the majority of people. Using the word "some" instead would have been better, but "Many" is not totally wrong… I guess.
The rest of the world doesn't. The most popular format is Day-Month-Year, followed by Year-Month-Day.
For example. Amazing.
(1) eliminates any ambiguity regarding interpretation. You can have an instance of a server of unknown provenance and/or regional settings and still count on yyyy-MMM-dd being correctly recognized.
(2) on the other hand has a nice feature of being sortable and can also be stored as an integer value if needed.
The US standard of MM/dd/yyyy is ridiculous. But it's just one of many and I'm not going to fix the world ;)
On the main topic: if anyone needs to calculate the circumference of the Universe (radius: 50bn lightyears ish) to the accuracy of the Planck length (approximately 1.6 x 10^-35 meters), they need fewer than 65 digits of Pi to do so. So, as already stated, it's just a PR stunt, nothing more.
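The arithmetic checks out (a quick sketch; the radius is the commenter's round figure, the Planck length the standard ~1.6e-35 m):

    import math

    radius_m = 50e9 * 9.461e15        # ~50 billion light-years in meters
    planck_m = 1.6e-35                # Planck length in meters
    circumference_m = 2 * math.pi * radius_m
    print(math.ceil(math.log10(circumference_m / planck_m)))  # 63 -- under 65 digits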
yyyy-MMM-dd is not sortable, though (2019-MAR-14).
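A quick illustration of why (my own sketch): yyyy-MM-dd strings sort chronologically as plain text, while month abbreviations sort alphabetically.

    iso = ['2019-12-31', '2019-01-05', '2019-03-14']
    mmm = ['2019-DEC-31', '2019-JAN-05', '2019-MAR-14']
    print(sorted(iso))  # ['2019-01-05', '2019-03-14', '2019-12-31'] -- chronological
    print(sorted(mmm))  # ['2019-DEC-31', '2019-JAN-05', '2019-MAR-14'] -- alphabetical, wrong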
> Yee independently verified the calculation using Bellard's formula and BBP formula
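For context, that kind of independent spot-check is cheap because BBP-type formulas can extract hexadecimal digits of pi at an arbitrary position without computing any of the digits before it. A rough sketch of the classic BBP digit extraction (mine, not Yee's code):

    def pi_hex_digit(n):
        # Hex digit of pi at (0-based) position n after the hexadecimal point,
        # via the BBP formula: pi = sum 16^-k (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6)).
        def frac_series(j):
            # fractional part of sum_k 16^(n-k) / (8k + j)
            s = 0.0
            for k in range(n + 1):
                s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
            k, term = n + 1, 1.0
            while term > 1e-17:
                term = 16.0 ** (n - k) / (8 * k + j)
                s += term
                k += 1
            return s % 1.0

        x = (4 * frac_series(1) - 2 * frac_series(4)
             - frac_series(5) - frac_series(6)) % 1.0
        return '%x' % int(16 * x)

    print(''.join(pi_hex_digit(i) for i in range(8)))  # 243f6a88

Floating-point round-off limits this simple version to positions around 10^7 or so; the record verifications use the same idea with more careful arithmetic.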
The hard part would be finding what you want to send though...
I will point out you're thinking of it incorrectly, though. The challenge with compression isn't to "reach far into pi". The challenge is to reach correctly into pi. A double can identify 2^64 places to reach into pi. It doesn't really matter how you specify those 2^64 locations, that's all it can do, by the simple argument that one double can't specify more than one location.
The location of the US Constitution in pi may be "far", but it's not the "far" we have in our fuzzy human brains where things sort of logarithmically just pile together until there's no meaningful difference to us humans between the 1,839,837,237,938,739,837,954th position of pi and the 1,839,827,237,938,739,837,954th position. But a compression algorithm off by that much is useless; indeed, even being off by one digit (in your choice of base) is going to produce garbage. So it's not just about being able to reach "far", it's about being able to reach far and precisely. It doesn't matter how you arrange the possible 2^64 locations a machine word can point to; specify it as 1,739,837,237,938,739,837,954 + the binary as a 64-bit int for all it matters. That reaches "far" into pi. But you won't find anything useful enough to pay off the 64bits you spent getting there.
(I mean, you want to reach far into Pi? I give you "Go BusyBeaver(64-bit int) digits into Pi". That's reaching in pretty darned far. But it's still useless as a compression algorithm, even with the mighty power of BusyBeaver there.)
(Amusingly, at BB(42) digits into Pi, you get "42" as the next digits, proving that 42 really is the answer. Prove me wrong!)
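Joking aside, here's a back-of-the-envelope version of the pointer-cost argument (my own sketch): in a digit stream with no exploitable structure, an n-digit pattern typically first appears around position 10^n, and writing that position down itself costs about n digits, so the "pointer into pi" is as big as the payload it points at.

    import math

    for n in (5, 10, 20):
        expected_pos = 10 ** n                        # rough expected first occurrence
        pointer_digits = math.ceil(math.log10(expected_pos))
        print(f"{n}-digit payload needs a ~{pointer_digits}-digit pointer")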
An IEEE 754 double type has a range from about 10^−324 to about 10^308. You start losing integer precision once you exceed 2^53, but beyond that it should be possible to add an extra byte or a few as an offset.
So a handful of bytes would get us addresses far in excess of the current storage capacity of the internet (estimated around 10^24 bytes a few years ago) ... if only pi were certain to contain every possible number sequence up to some arbitrary length (which last I heard isn't known yet) and if we had some reasonable way to search and index all of that content (which we don't).
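To see exactly where a double's integer addressing gives out (a quick check of my own):

    # 2**53 is the first point where consecutive integers stop being
    # distinguishable as IEEE 754 doubles; Python ints have no such limit.
    big = 2.0 ** 53
    print(big == big + 1)          # True  -- the double can't tell them apart
    print(2 ** 53 == 2 ** 53 + 1)  # False -- exact integer arithmetic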
People complain about that.
But . . . why do we have athletic competitions? It's not like it is useful for anything at all.
Or Reality TV for that matter.
The decompression just takes a while.
However, there are compact formulas which can generate pi to arbitrary precision, trading off compute time against space. So it's effectively compressible; I guess this relates to Kolmogorov complexity in some way.
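As a concrete example of such a compact formula, Gibbons' unbounded spigot algorithm streams decimal digits of pi from a few lines of exact integer arithmetic (my own transcription, not from the article):

    from itertools import islice

    def pi_digits():
        # Gibbons' unbounded spigot: yields decimal digits of pi one at a time.
        q, r, t, j = 1, 180, 60, 2
        while True:
            u = 3 * (3 * j + 1) * (3 * j + 2)
            y = (q * (27 * j - 12) + 5 * r) // (5 * t)
            yield y
            q, r, t, j = (10 * q * j * (2 * j - 1),
                          10 * u * (q * (5 * j - 2) + r - y * t),
                          t * u, j + 1)

    print(''.join(str(d) for d in islice(pi_digits(), 30)))
    # -> 314159265358979323846264338327

The program is a few hundred bytes yet emits arbitrarily many digits, which is exactly the Kolmogorov-complexity point: pi's digits carry almost no information beyond "compute pi".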
Uhm no, you mean, approximately π * 10^13.
They perform the same computation on every node; there's no way to split up the work between nodes.