This reminds me of my unary directory hierarchy: a segment in my test suite consisting entirely of unary newline files and subdirectories, i.e. with filenames '\n\n...\n' for some number of newlines. I use it to test if shell script authors understand how to treat filenames correctly.
Funny, yes, but I'd be happy for any shell script I write to fail this test (gracefully). Unless you can convince me there's a real requirement for something as convoluted as this?
When you write shell scripts to be deployed to systems you do not control, you have to handle filenames you do not control, i.e. all of them. I'm not interested in convincing you; I did not write the tests for you in particular.
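To make the point concrete, here is a minimal sketch (not the commenter's actual test suite) of what such a "unary" tree looks like and how a script can walk it safely. The trick is NUL-delimited streaming, since NUL is the only byte a Unix filename cannot contain:

```shell
#!/usr/bin/env bash
# Hypothetical reconstruction: files whose names are one, two,
# and three newline characters, in a throwaway directory.
dir=$(mktemp -d)
nl=$'\n'
touch "$dir/$nl" "$dir/$nl$nl" "$dir/$nl$nl$nl"

# Wrong: `for f in $(ls "$dir")` word-splits on the newlines and
# sees zero usable names.
# Right: NUL-delimited output from find, NUL-delimited reads.
count=0
while IFS= read -r -d '' f; do
    count=$((count + 1))
done < <(find "$dir" -mindepth 1 -print0)
echo "$count"   # 3

rm -rf "$dir"
```

Any loop built on `ls` output, unquoted expansion, or newline-delimited `read` fails this tree; the `-print0`/`read -d ''` pairing passes it.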
Some basic detective work led me to the source code for the URL generation [1].
We can all stop reverse engineering the algorithm now: it appears that it uses a random generator to come up with the I's and l's.
So, I think this is hilarious, but link shorteners have to be stable, right? What happens when link shorteners go offline? Is there a way of decoding shortened links without the server?
"So, I think this is hilarious, but link shorteners have to be stable, right?"
Yes, a link shortener does need to be stable, but luckily one is extremely simple to operate.
You should be able to run a very popular link shortener on a leftover 1U with very low associated bandwidth and power costs. It's not unrealistic for a person, or group, to commit to this indefinitely.
Thinking about it, that is absolutely possible, but only for you personally. A site could store your destinations in local storage and translate them through a service worker (which runs even when the site is offline). Half a page of JS would do it. Not sure how useful that would be, though.
All link shorteners, even those that use a readable URL, keep the target URL in a database. None of them can be decoded without the service being available.
The novelty here is that the service deliberately makes itself unusable for comedic effect.