I'd just chalk that up to Downdetector having a false positive. They aren't monitoring those services directly; they rely on signals such as users visiting Downdetector and reporting problems, sentiment analysis of social media, etc.
10 years ago we bought every Minitel in existence from eBay and connected them into a giant X.25 network spanning the globe. Unfortunately, one EPROM didn't have a cover over its window and bright sunlight in Madrid caused firmware corruption.
We were seeing some pages with cookies incorrectly getting cached, causing users to get logged in as other users. We quickly disabled caching on the dashboard. Is this related to that?
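For context, "disabling cache" on an authenticated page usually boils down to something like this minimal sketch (assuming a Flask app; the route and names are illustrative, not anything dashboard-specific): the per-user response is marked so shared caches and CDNs are told not to store it.

    from flask import Flask, Response

    app = Flask(__name__)

    @app.get("/dashboard")  # illustrative route
    def dashboard() -> Response:
        resp = Response("per-user content here")
        # "private" keeps shared caches (CDNs, proxies) from storing the page;
        # "no-store" forbids storing the response at all.
        resp.headers["Cache-Control"] = "private, no-store"
        return resp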
On the webdev side: what can be done as an extra defense-in-depth step to guard against this kind of issue? Unrelated to Cloudflare, it feels like a common issue that crops up even on massive sites quite often. Is there some sort of secondary check or content decryption that could be required on the client side to contain session-cookie crossover?
Outside of HTTPS, typically it would be tying the session cookie to the client's IP address or netblock, but because the addresses you see are Cloudflare IPs, for regular browser navigation requests I don't think there's anything that can be done?
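For completeness, here's roughly what that IP/netblock binding looks like without a CDN in front (a minimal sketch assuming Flask's signed session cookie; routes and names are illustrative). Behind Cloudflare, request.remote_addr would be a Cloudflare address rather than the user's, which is exactly the complication above; you'd have to take the client address from a forwarded header you trust instead.

    import ipaddress
    from flask import Flask, abort, request, session

    app = Flask(__name__)
    app.secret_key = "replace-me"  # placeholder for illustration

    def client_netblock() -> str:
        # request.remote_addr is the direct TCP peer; behind a CDN/proxy this
        # will be the proxy's IP unless you trust a forwarded header instead.
        ip = ipaddress.ip_address(request.remote_addr)
        prefix = 24 if ip.version == 4 else 64
        return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False))

    @app.post("/login")
    def login():
        # ...authenticate the user first, then pin the session to the netblock...
        session["netblock"] = client_netblock()
        return "ok"

    @app.get("/dashboard")
    def dashboard():
        # If the same session cookie shows up from a different netblock,
        # treat it as suspect and force a re-login.
        if session.get("netblock") != client_netblock():
            session.clear()
            abort(401)
        return "per-user content"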
It is the expected result, yes. Not disagreeing with you on that. I am simply saying that CF is not "hosting" the site; it's the middleman. The host is hidden behind it.
I would have liked to read that. Instead, here's what I see:

> Checking if the site connection is secure
> Enable JavaScript and cookies to continue
> blog.cloudflare.com needs to review the security of your connection before proceeding.

What does it mean to "review the security" of my connection? Wouldn't that be my business? (Feel free to review the security of your connection, by all means.) :)

Why would that "need" running JavaScript here in my browser (which I don't enable, for fairly obvious security reasons)? Other websites seem to have no problem delivering basic content without it.

Also, no thank-you to cookies. We're not entering into a "session" relationship here; I merely wanted to read the document you advertised at the URL.
This is not specifically about Cloudflare's "challenges" etc., but:
The reality of operating a big site/service on the internet in 2022 is that it’s sometimes necessary to use methods that annoy a few people (with very non-standard browser settings) in order to protect the service as a whole from a million bots trying to attack it at any given time.
This sounds like a very plausible argument. I've heard many of the arguments and don't dispute the threat model for something like Cloudflare.

And yet something about it still doesn't add up. It turns power into a weakness.

How is it that much smaller sites, still able to serve something as simple as a plain-text blog to millions of users from a modest rack in a shack, operate perfectly well without any such impediment? Wouldn't an operation with all the power, might, and money of Cloudflare be able to do a better job and still maintain the same QoS (accessibility, interoperability, etc.) as Basement Bob with his Raspberry Pi?

Remember, all I want to do here is read a static web page of (I guess) less than 1000 words.

I'll take a punt: if "defending against millions of bots" is Cloudflare's business offering, then not being able to serve a static site as smoothly as a Raspberry Pi could doesn't look good :)
I think the parent comment's claim is that serving a CAPTCHA page to potential attackers may actually be more resource-intensive than serving a lightweight page that has the actual content on it.
Cool, that is certainly your choice. It is also the choice of the operators of the websites you try to visit to block your traffic, since you won't opt in to their security precautions.
Your house isn't accessible to anyone in the world, anywhere, at any time, either. People who complain about this stuff conveniently ignore that a massive amount of traffic to public websites is malicious and automated in nature. It's not crazy that website operators want to block that stuff. They would rather block the 0.0001% of people who choose to block JavaScript than expose their site to the junk. You can rage at Cloudflare all you want, but if it wasn't a service people wanted, they wouldn't be offering it.
The measured number of browsers blocking JavaScript varies between about 0.7% and 4%, with an estimated global average hovering around 1.5%. Here are some sources [1,2,3].
That's how Pantheon uses the code, not how Cloudflare uses it. The Wikipedia article you linked has a whole section dedicated to Cloudflare's use of unofficial error codes.
Specifically, for Cloudflare it doesn't mean much beyond "check the other error code we returned elsewhere in the response to see what the actual issue is".
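From a client's point of view that roughly looks like the sketch below. The status set and the body-parsing regex are assumptions based on Cloudflare's public error pages, not an official API: treat the unofficial 52x/530 statuses as edge errors and, for 530 in particular, dig the more specific 1xxx code out of the body.

    import re
    import requests

    # Unofficial statuses Cloudflare is known to return from its edge.
    CLOUDFLARE_EDGE_STATUSES = {520, 521, 522, 523, 524, 525, 526, 527, 530}

    def fetch(url: str) -> requests.Response:
        resp = requests.get(url, timeout=10)
        if resp.status_code in CLOUDFLARE_EDGE_STATUSES:
            # For 530 in particular, the HTML error page carries a more
            # specific "Error 1xxx" code; pull it out if it's there.
            match = re.search(r"Error\s+(1\d{3})", resp.text)
            detail = match.group(1) if match else "unknown"
            raise RuntimeError(
                f"Cloudflare edge error {resp.status_code} (detail code {detail})"
            )
        return resp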
There's no way it's DNS
It was DNS