This is great, but it's stuck checking if I'm a bot... I get this in the console:
```
auto/:1 The resource https://challenges.cloudflare.com/cdn-cgi/challenge-platform... was preloaded using link preload but not used within a few seconds from the window's load event. Please make sure it has an appropriate `as` value and it is preloaded intentionally.
```
If you're an American abroad you can't even use taxact.com or payusatax.com to file your taxes (which you have to, because we're the special ones with citizenship-based taxation) because they block every non-US IP, even from Western Europe. Intuit, surprisingly, comes out as the less braindead one here.
Sorry, the CAPTCHA caused issues. My other project got hit by a bot attack, so I added CAPTCHA to all my projects. I didn't realize the CAPTCHA was causing problems for users.
I like the concept. A similar idea that is no longer on the web was "ShadyURL". It would make links that looked like http://shadyurl.com/nader-for-president.exe
I had one that would turn URLs into wild news stories. I had news-sounding domains like nyeveningpost.com and auto-generated bogus stubs such as nyeveningpost.com/breaking/mit-demonstrates-time-travel
Then it would either redirect humans like a normal service or serve the crawlers a page with meta tags, so the link card on social media would show a "breaking news" thumbnail and a Markov-generated caption such as "Earlier today researchers successfully demonstrated time travel at MIT", with the stub matching the title just to increase the chaos.
Ran it for a couple of years. Not only did nobody use it, but the response was universally discouraging and negative.
Lesson: people enjoy facsimiles of things they find repulsive only until they become too real. All things have an uncanny valley. It's why people, for example, don't go to butcher shops and pick up animal organs for Halloween decorations.
I find the uncanny valley to be a wonderful artistic experience, like a psychological rollercoaster where there's always something new. But that's a very niche response.
They found it confusing, pointless, stupid, annoying, irritating - you know, all the great productive feedback that awesome people give over the web.
One of my problems is that I'm quick to abandon my efforts and discount any faith I have in a project, and I do so subconsciously. I'm the opposite of stubborn: flexible to a self-sabotaging degree.
Part of behavior change, I think, is identifying when emotional decisions happen without any intellectual agency.
Figuring out when I'm pivoting without realizing it is a key way to improve myself.
I don't understand how you're supposed to use this. How do you pass the bot check? There's just a button that says "checking if you're a bot..." and clicking it does nothing.
On my phone it takes about 10-15 seconds before the button enables, and I didn't click anything; I assume their bot prevention is literally just making you wait it out.
A few days after the above exchange, the Wayback Machine was back online and I ran the site’s URL through it. The only archived version was from October 9, 2024.
It just so happens that I am working on a tool that ends up expanding Hacker News URLs.
The problem I was solving was that of managing lots of replies to a successful post.
It lets you replace the "ycombinator" in a Hacker News item URL with "gipety", so this current thread would point to:
Most of all, I love the fact that, unlike URL shorteners, there's no need to maintain a database of redirects.
I do wonder what the actual encoding scheme is, and how robust it is to lopping off chunks of the URL, since there's presumably lots of room for redundancy...
After "checking my browser" I try to put in the URL I want to shorten, I click the button, and nothing happens. The URL is hardfault.life, for what it's worth.
Yeah, not working for me. The button says "Checking if you're a bot..." and nothing happens. The console shows a warning about the resource at https://challenges.cloudflare.com/...
The challenge with scaling a URL _shortener_ is that multiple URLs might end up with the same short URL. Avoiding those collisions means deliberately designing coordination across your set of machines, which introduces a coordination framework, a DB, key prefixes, and all your favorite answers to the interview question du jour of the late 2010s.
With a URL _lengthener_, though, you don't need any of that. The sheer number of possible outputs means the odds of ever producing the same one twice are infinitesimally tiny.
Yeah, you could just have a lookup table mapping ASCII characters to "long phrases" and encode the URL that way. Have a bunch of nonsense phrases per character and select them randomly; as long as the lengthened URL fully encodes the destination URL, there are really no scaling problems to be had. You could even do the whole thing in client-side JavaScript if you went that route, as a purely static site.
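A minimal sketch of that idea, with a hypothetical phrase table and a made-up lengthener.example domain (this is just the reversible-lookup concept, not how any real service works):

```
// A reversible char -> phrase table for a toy URL lengthener (TypeScript).
// The phrase lists and the lengthener.example domain are made up; the only
// requirement is that every phrase maps back to exactly one character.
const PHRASES: Record<string, string[]> = {
  h: ["haunted-hard-drive", "hyperbolic-headline"],
  t: ["totally-true-story", "tax-deductible-toaster"],
  // ...one entry per character you expect to see in a URL
};

// Build the reverse lookup once: phrase -> character.
const REVERSE = new Map<string, string>();
for (const [ch, list] of Object.entries(PHRASES)) {
  for (const phrase of list) REVERSE.set(phrase, ch);
}

// Lengthen: pick a random phrase for each character of the destination URL.
function lengthen(destination: string): string {
  const parts = Array.from(destination).map((ch) => {
    const options = PHRASES[ch];
    if (!options) throw new Error(`no phrases defined for ${JSON.stringify(ch)}`);
    return options[Math.floor(Math.random() * options.length)];
  });
  return `https://lengthener.example/${parts.join("/")}`;
}

// Decode: the redirect page only needs the reverse map, no database.
function decode(lengthened: string): string {
  return new URL(lengthened).pathname
    .split("/")
    .filter((p) => p.length > 0)
    .map((p) => {
      const ch = REVERSE.get(p);
      if (ch === undefined) throw new Error(`unknown phrase: ${p}`);
      return ch;
    })
    .join("");
}
```

The random pick per character is what makes every lengthened URL look different, while the reverse map is what keeps it fully decodable without storing anything.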
You'd need "a website" to redirect things, but if you've fully encoded the URL in the lengthened version, that website need not have any server side components. It could serve things that are handled purely client side.
As a trivial example, consider a URL base64-ifier. You enter the URL and it spits out base64urlifier.example/?base64=[encoded URL] - this is trivially done in JavaScript based on the inputs. To redirect, all you need to do is go to that URL, and the JavaScript reads the query parameter, de-base64s it, and redirects you there. No need for anything server-side.
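A rough sketch of that base64 version, just to make it concrete - it reuses the invented base64urlifier.example domain and base64 parameter from above, and assumes plain-ASCII destination URLs since btoa only handles Latin-1:

```
// Client-side only: the "form" page builds the link, the landing page redirects.
// base64urlifier.example and the "base64" query parameter are hypothetical.

// Turn a destination URL into a lengthened, URL-safe base64 link.
function makeLengthenedUrl(destination: string): string {
  const encoded = btoa(destination)           // base64-encode (assumes ASCII URL)
    .replace(/\+/g, "-")
    .replace(/\//g, "_")                      // make it safe to put in a URL
    .replace(/=+$/, "");                      // drop padding
  return `https://base64urlifier.example/?base64=${encoded}`;
}

// On page load at base64urlifier.example: decode the parameter and redirect.
function redirectFromQuery(): void {
  const param = new URLSearchParams(window.location.search).get("base64");
  if (!param) return;                         // no parameter, nothing to do
  const b64 = param.replace(/-/g, "+").replace(/_/g, "/");
  const padded = b64 + "=".repeat((4 - (b64.length % 4)) % 4);
  window.location.replace(atob(padded));      // send the browser on its way
}
```

Everything here runs in the browser, so the "service" is just a static page plus this script.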
If you designed the service like this (and you have more than enough headroom for it on the lengthening side - encoding 200 bytes of URL in 5 bytes of shortened link is hard, encoding 200 bytes of URL in 4 kB of URL is easy), you wouldn't have any server-side components beyond "serving some HTML and JavaScript." Put it on a static file host, use the free tier of Cloudflare, and you can scale basically infinitely without any actual server load (if your service is barely used, hits to the static host backend are cheap, and if it's being used heavily, it's always in cache and never hits the backend).
There's no reason every web service needs a web server, a database, and anti-bot services.
Yeah, because in theory you could just append the same gibberish string to every URL and, technically, the URL would have been lengthened. You're already lengthening the URL just by adding it as a path on top of your domain. Maybe base64 it to lengthen it a bit more and hide the obvious fact that you just added it as a path on top of your domain.
That's only a challenge for a URL shortener if it's going to need to scale to an enormous number of users. I think that's a good example of one of those "leave it until you can't just make your one machine more powerful or optimise your code any more" situations.
(I've let the page sit for a minute or so and it hasn't concluded that I am not a bot yet, but then, I'm aware I look weird - Firefox with the JavaScript JIT disabled and no GPU acceleration.)
I don't know what it's using on the backend, but it doesn't seem to pass for me, and it doesn't give me the usual option to tell a baby chicken from a baby duck to prove I'm human.
I don't have WebGL support, so I can't use a URL lengthener, because the bot checker appears to crash shortly after loading. Someone stop this timeline, I want to get off.
I also disable WebGL. This alone breaks Turnstile. It also helps me avoid websites that are user-hostile. If a website considers curl or Firefox suspicious, then it's not worth proving my humanity to it. I will let them continue calling me a robot in machine-translated Japanese...
The worst is probably hCaptcha. It asks up to 10 machine-translated questions involving machine-generated images to prove the user is not a machine. Something about this is funny to me.
Yeah, mine just says "Checking if you're a bot..." indefinitely. OpenBSD, Firefox, Chromium. Cloudflare sometimes blocks me from sites because, well, reasons - it doesn't actually say why, but I suspect it's the whole "weird OS + strict privacy settings in Firefox" thing, because the same sites load just fine on my other machines. /shrug
And I have friends who would appreciate such things. Just, ideally, with something that absurd going to as short a site as possible. My gripe is that my browser is apparently too bot-like for something serving a tiny number of requests.