
Cloudflare is turning off the internet for me - dijit
https://blog.dijit.sh/cloudflare-is-turning-off-the-internet-for-me
======
RyanK24
It appears that you may have made some modifications to your user agent
string. If you revert your user agent to your browser vendor's default,
everything will be fine.

~~~
tomc1985
Why is it that something as malleable as a user-agent string trips these kinds
of sensors?

If I were to write a bot, copying current browsers' user-agents is literally
the _first_ thing I'd do
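To illustrate how cheap that is, here's a sketch in Python; the UA string is a made-up Chrome-style value, not any particular real browser release:

```python
# A trivial bot blending in: present a mainstream Chrome-style user agent.
# CHROME_UA is an illustrative string, not tied to a real Chrome release.
CHROME_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/79.0.3945.88 Safari/537.36"
)

def spoofed_headers(extra=None):
    """Request headers a lazy bot would send to look like a real browser."""
    headers = {"User-Agent": CHROME_UA, "Accept-Language": "en-US,en;q=0.9"}
    if extra:
        headers.update(extra)
    return headers
```

Any HTTP client can then send those headers, and a server that trusts the UA string alone is none the wiser.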

~~~
leokennis
It really drives home how shitty the web has become. Ad companies and malware
distributors come up with ever worse ways to interfere with my browsing, and
the “good guys” have to respond with increasingly invasive and fragile
countermeasures.

Sort of like having to take off your shoes when you board a plane. If that’s
what it takes, isn’t it just better to stay home?

~~~
oarsinsync
> Sort of like having to take off your shoes when you board a plane. If that’s
> what it takes, isn’t it just better to stay home?

Removal of shoes, 'naked' full body scanners, these are all terrible, and I
tell myself every time it isn't worth the hassle.

The reality is that as much as I hate it, I'm still flying every other week.

I'm also on the Internet daily. I don't see that changing.

~~~
tomc1985
Honestly what is the point of user-agent at all if it needs to be set to some
changing, magical incantation in order for a browser (or any other agent) to
be functional?

I hate the direction the internet and tech is going, and I hate even more that
I'm seemingly powerless to do anything about it

------
Liquix
The problem presented by services like ReCaptcha and Cloudflare is a tough nut
to crack.

They're silently embedded in a huge portion of modern websites, and the
average user will never even know about them.

But it seems to be way too easy for them to blanket-ban or serve an absurd
amount of captchas to powerusers, linux gurus, privacy geeks, or anyone with
the wrong combination of browser+addons. And the failures (as in this case)
are often silent, cryptic, un-fixable from the user end, and can prevent us
from accessing massive swaths of the internet. Any thoughts surrounding this
conundrum?

Solutions:

1. Everyone stops using ReCaptcha/Cloudflare.

- Never going to happen. They dominate the market because they are useful,
well-made services.

2. Launch a competing product that accomplishes the same thing.

- Good luck competing with these giants. Also, how would your implementation
differ to solve this issue?

3. Powerusers and tech nerds must conform to 'normal' browser configurations
and disable privacy addons in order to enjoy the internet with 'normal' users.

- Two steps backwards in every conceivable way. The giants gain more
invisible power and powerusers suffer decreased productivity/privacy. Not
going to happen.

~~~
hombre_fatal
Yeah, you can't really talk about downsides of Recaptcha/Cloudflare without
also acknowledging the extreme amount of malicious actors and abuse on the
internet.

We're in a "this is why we can't have nice things" predicament and you have
malicious actors to thank for that, yet most people on HN only seem capable of
attacking the few affordable solutions to that problem.

I'm even down with the theory that Cloudflare is a US government outfit;
that's the only way I can wrap my head around such a generous free tier. But
at what point does it worry you that the internet has so many fundamental
issues that people willingly centralize behind such a large behemoth? How many
options do I have when a kid is holding my forum hostage with a $5 booter
service?

It's easy to shit on everything. Let's hear some real solutions.

~~~
danShumway
> Let's hear some real solutions.

It's by no means a full solution (there likely is no single full solution),
and it may even be a bad solution -- but lately I've been trying to think
about what the Internet would look like if we didn't have a massive arbitrage
potential around server requests.

Part of the reason why everyone is trying to detect bots is because bots will
very, very rapidly eat up your bandwidth and CPU time. We're used to offering
our bandwidth/CPU for free to humans and either swallowing the cost if we're
running a free service, or making up the cost in an adjacent way (ads,
subscriptions, etc...). It's not bots that are the problem. It's that when
someone asks our servers to do something, we do it for free. Bots are just a
big category we can ban to make that problem smaller.

In many (but not all) cases, we shouldn't care about bots, and the only reason
we do is because our systems aren't scalable to that level.

So I've been wondering lately what a server-defined per-pageload, or even per-
request fee would look like on the Internet, maybe one that scaled as traffic
got heavier or lighter and that was backed by a payment system that wasn't a
complete rubbish privacy-disrespecting dumpster fire.

My immediate thought is, "well, everything would be expensive and
inaccessible." But, the costs don't change. You still have to pay server costs
today. Businesses today still need to make that money somehow. There are
almost certainly downsides (all our current payment systems are horrible), but
I wonder if it's more or less efficient overall to just be upfront about
costs.

Imagine if I could put up a blog on a cloud service anywhere with scalable
infrastructure, and then a post goes temporarily viral. Imagine if my server
could detect it was under heavy load, detect that it was getting hit by bad
actors, automatically increase the price of requests by a fraction of a cent
to compensate, and then automatically ask my provider to scale up my
resources, all without costing me any extra money.

For a static site, suddenly I don't need to care if people or bots are
hammering it; I don't need to care about anything except whether each
visitor/bot is paying for the tiny amount of hosting costs they're foisting on
me. If bad actors start pushing traffic my way, I don't need to ban them. I
just force them to pay for themselves.
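To make the idea concrete, here's a toy sketch of such a server-side pricing rule. All the numbers and names are hypothetical; this is not any real payment protocol:

```python
# Toy server-side request pricing: the per-request fee is flat up to the
# instance's comfortable capacity, then scales linearly with overload.
BASE_FEE_CENTS = 0.001    # fraction of a cent per request (hypothetical)
CAPACITY_RPS = 1000       # requests/second one instance handles comfortably

def request_fee(current_rps):
    """Fee (in cents) charged per request at the given traffic level."""
    overload = current_rps / CAPACITY_RPS
    return BASE_FEE_CENTS * max(1.0, overload)
```

At half capacity every request costs the base fee; at double capacity the fee doubles, so a flood of bot traffic pays for the extra resources it forces you to provision.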

~~~
scottmotte
> automatically increase the prices of requests by a fraction of a cent to
> compensate

Great concept.

CPU, bandwidth, electricity, it's all just energy. And to a significant
degree, money is just energy stored. I generate energy with my own work, store
it in the form of money, and then transfer that energy to someone else, maybe
to heat my home or cook me a meal.

Before money, I had to barter for those things. Maybe conceptually the
internet is in a similar state at the moment. It doesn't have 'money'. Why
can't I put CPUs in my wallet and then spend them? And why can't I charge
visitors to my site by the CPUs they are costing me?

Instead, I have to, in a way, barter. For example, maybe I use ad revenue to
earn my income, so I generate all this content, I barter that to the search
engines, which barter with the advertisers, which barter with me, and I barter
back to security guards to protect me from 'bad' actor bots. I'd really just
like to receive CPU and bandwidth payments from them.

~~~
perl4ever
Isn't the reason we are freed from barter in daily life that the government is
intimately involved in the financial/banking system, and regulates it and
issues money and so on? Maybe we continue to struggle with the internet
because it started out unregulated and has never really transcended that,
because people insist on thinking freedom is best for commerce without
appreciating the nuances.

~~~
rashkov
There are alternatives to that. For all of the hype and vaporware of the
cryptocurrency movement, the idea of digital-native programmable internet
money is a powerful one. I’m personally excited by the idea of involving
currency at the protocol level and having it interact naturally over tcp/ip
and http. There is an alternative to ads if we can make it work.

------
psz
Because the author is using a niche browser (Qt wrapped in an app), this might
be the Browser Integrity Check. Its purpose is to block non-browser-like
behaviour common to spammers/malware.

That would also explain why switching to Chrome fixes the issue.

[https://support.cloudflare.com/hc/en-us/articles/200170086-U...](https://support.cloudflare.com/hc/en-us/articles/200170086-Understanding-the-Cloudflare-Browser-Integrity-Check)

~~~
szczepano
So if your user agent is not Firefox, Chrome, or Safari, the internet can stop
working. What a great side feature.

~~~
JakeTheAndroid
Yeah, because most custom browsers are malicious. They have the data to prove
it. This isn't a side feature, it's a direct feature that is 100% intentional.
They maintain a backend whitelist of known "good" user-agents. Curl is on that
list, and there are a few others outside of the big players.

Most people building custom browsers are doing it to do something Chrome would
disallow. One instance would be supporting only one weak-ish cipher, forcing
TLS to use a predictable cipher instead of choosing the best available
encryption for transit. While I agree some people have cool browser projects
that would be nice to use, it's a side effect of bad actors abusing the
system. Most of the annoying parts of Cloudflare exist because bad actors have
abused the system.

~~~
vageli
Any sufficiently bad actor will already modify their user agent. Who is this
really stopping?

~~~
shadowgovt
Bad actors who are bad at being bad actors, which is actually the bulk of bad
actors.

It's maddening, but it's true. I've heard tales of people having to modify
resource auto-generators that created URLs with hexadecimal identifiers in
them, because the sequence "ad" in a URL would trip ad-blocking browser
plugins. You might ask yourself "how many ad companies worth their salt have
'ad' in the URL path?" and the answer is "The ones who are worth their salt
might not, but the ones who are terrible do, and they're probably terrible at
other things too, like letting malware on their network."
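The workaround those auto-generator authors ended up with can be sketched like this (a hypothetical reconstruction, not their actual code): keep regenerating the random hex ID until it contains none of the substrings naive filter lists match on.

```python
import secrets

# "ad" shows up by chance in random hex strings surprisingly often, and
# naive filter lists will block any URL containing it.
BLOCKED_SUBSTRINGS = ("ad",)

def safe_hex_id(nbytes=8):
    """Random hex identifier guaranteed not to contain a blocked substring."""
    while True:
        candidate = secrets.token_hex(nbytes)
        if not any(s in candidate for s in BLOCKED_SUBSTRINGS):
            return candidate
```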

~~~
incompatible
There's somebody who can build a custom browser but can't figure out how to
change the user agent string?

~~~
shadowgovt
They're called "script kiddies" and the trick is: they don't build the
browser, they download a kit someone else built that has a user agent in it
and use it for whatever purpose they intend to.

I went to school at a place that had a policy of soft-blocking network access
for any machine that a portscan found with TCP or UDP port 12345 open, because
Back Orifice defaults to that port and people who deployed trojan horses to
allow remote access didn't change the default. It caught a reasonable number
of owned machines every year.

Don't overestimate criminals; if most were good at being criminals, they could
be successful in society without having to break the law. ;)
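That kind of policy needs nothing more than a TCP connect test per host. Roughly (a simplified sketch, not the school's actual scanner):

```python
import socket

# Flag any host that accepts TCP connections on Back Orifice's default port.
SUSPECT_PORT = 12345

def port_open(host, port=SUSPECT_PORT, timeout=0.5):
    """True if a TCP connect to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False
```

Sweep this over the campus address range and soft-block anything that returns True.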

------
Dunedan
There is another side effect of visiting sites served by CloudFlare with
JavaScript disabled: email addresses (or anything else containing an @) appear
as "[email protected]" instead. Take this blog post from CloudFlare themselves
for example: [https://blog.cloudflare.com/serverless-performance-compariso...](https://blog.cloudflare.com/serverless-performance-comparison-workers-lambda/)
That's just ridiculous.

~~~
partiallypro
How is it ridiculous? It's called email obfuscation (it can be disabled within
Cloudflare); it's there to stop spam bots from ripping email addresses from
websites and adding them to mailing lists.

~~~
tcd
Because their regex is crap, and if you have @twitterHandle or something else
with a legitimate "@", you just see the obfuscated version.

It's laughably adorable to think it's actually solving a problem or helping in
any way; the 'bad actors' it's trying to prevent probably have a workaround
anyway.
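A toy version of that kind of naive matcher shows the false positive: anything @-shaped gets rewritten, Twitter handles included. (This is an illustrative regex, not Cloudflare's actual implementation.)

```python
import re

# Naive obfuscator: any "@"-shaped token is treated as an email address,
# so a bare Twitter handle is a false positive.
NAIVE_EMAIL_RE = re.compile(r"\S*@\S+")

def obfuscate(text, placeholder="[email protected]"):
    """Replace every @-shaped token with the placeholder."""
    return NAIVE_EMAIL_RE.sub(placeholder, text)
```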

~~~
vageli
> It's laughably adorable to think it's actually solving a problem or helping
> in any way, the 'bad actors' it's trying to prevent probably have a work
> around anyway.

They do—changing your user agent is trivial.

~~~
sp332
Why would I want to send an incorrect user agent? There's no point having one
if it's just going to lie anyway.

~~~
JohnFen
I agree. I wish browsers would just stop sending a user agent string entirely.
I don't think that I've sent an honest one in over a decade.

~~~
sp332
[https://news.ycombinator.com/item?id=22044057](https://news.ycombinator.com/item?id=22044057)

------
StavrosK
I don't see anyone here mentioning this, so I will. CloudFlare supports
PrivacyPass, a third-party, privacy-preserving way to do proof-of-work for
websites. Basically, you do some small amount of CPU work and get 30 tickets
your browser can transparently redeem for access to a site.

That's a pretty clever way to both deter bad actors and ensure legitimate
users get uninterrupted access to websites.

~~~
Ajedi32
AFAIK the Privacy Pass protocol has nothing to do with proof-of-work. It's
just a way to allow users to solve a CAPTCHA once and re-use that single
solution across multiple sites without jeopardizing privacy. Reduces the
number of CAPTCHAs you see, but doesn't eliminate them entirely.

What you're talking about is more like what Hashcash does, where it
essentially replaces CAPTCHAs with a cryptocurrency miner, such that bots
become more expensive to run due to the amount of energy they consume. The
downside is it's not great for battery life for regular users either.

~~~
Fice
Captcha is also a kind of proof-of-work, and a very nasty one, where users are
made to do the work instead of computers.

------
cronix
I've been noticing a larger and larger portion of sites in Google results have
been taken over by _something_. They all act the same way, and act similar to
Cloudflare's checks. You get a big green dot on the screen first and then a
bunch of redirects. I've not read anything about it, but probably 1/15 of my
Google result clicks end up there. How does Google not know these sites have
been taken over and keep them on the first page of search results? It's
getting tiring, because they also hijack your history, so you can't just hit
"back" and you lose everything you've been searching through. You can't just
click "back" to the results page and go to the next result; you have to start
the search over and remember the exact terms you used in that last search.

~~~
BenjiWiebe
I have a little tip for you: right-click your back arrow, and choose your
search page from the list.

~~~
htfu
At least on iOS Safari there's this bug (feels like it's always been there,
maybe content blocker related?) where sometimes a search simply gets eaten.
The browser somehow thinks you went from nowhere straight to the page you're
on, even though you went past a Google results page.

I assume it's somehow redirect-related and that's why these sites tend to
trigger it.

~~~
why-oh-why
I’m thinking of two situations:

- You tapped the “Siri suggestion” result, which completely skips the SERP. I
hate that “back” doesn’t bring you back to what you typed in the search/URL
bar.

- I regularly visit The Verge, open a story, and then the back button doesn’t
take me to the homepage but to the page before it. I blame their crappy
JavaScript, but maybe we’re experiencing the same thing.

~~~
htfu
Nope. It's as if that happens, except I went past a regular Google search
results page, which got dropped from history (along with the search itself).

~~~
why-oh-why
Nope what? I described that same thing on The Verge.

------
gok
I really wish you could just directly pay for services. A hundredth of a cent
per visit would make most abuse un-profitable while still allowing very
affordable procrastination.

~~~
sneak
This would make search engines impossible, sadly.

~~~
Dylan16807
Why do you think that?

------
KingMachiavelli
FYI, the new/current version of Qutebrowser just added a built-in list of
websites that require a "Chrome/Chromium" UA; it also added logic so that
parts of the UA can be auto-updated.

Shameless shill: Qutebrowser is by far the best browser I've ever used. The
half measure of using addons (even powerful ones like Pentadactyl) can't be
compared to having a browser that is power-user friendly in every aspect, from
config to UI. If a site doesn't work well with it then I'm probably not going
to use that site. If I can move away from Google then I can find your
article/post somewhere else.

~~~
dijit
This mirrors my experience and opinions. However I miss certain powerful
adblock features and I would prefer more/easy control over javascript that
gets rendered since JS use is by far the largest consumer of energy on my
laptop.

------
tmoravec
Cloudflare is also making TOR quite useless.

~~~
asdf21
I don't know the specifics on this, could you elaborate?

~~~
tmoravec
Half the websites I visit give me the CloudFlare screen like

    One more step
    Please complete the security check to access <whatever>.

Followed by either an endless stream of ReCaptchas or one completely
impossible one.

~~~
JakeTheAndroid
If captcha loops are still an issue, you should write in. JGC takes that stuff
seriously. You're more likely to be seeing an IP block, because most site
owners don't whitelist Tor exit nodes, or they specifically block them because
the traffic is mostly abuse.

~~~
lucb1e
I'm guessing JGC is some dude that works at CF, but where should one write in?
You're literally looking at a Google CAPTCHA; you can't go to the contact page
because that's behind this proxy that won't let you pass.

~~~
JakeTheAndroid
JGC is fairly popular on here and is CTO at Cloudflare. You can simply write
in to support@cloudflare.com; those tickets are forwarded to engineering
teams.

~~~
lucb1e
Support options would be good info to put on that CAPTCHA page instead of
having to find that somewhere deep down in an HN thread.

Wasn't HN also behind CloudFlare? Looks like that changed, but maybe it will
be again in the future.

~~~
JakeTheAndroid
I do believe that there was a point where HN was using CF, but that hasn't
been true for a while if memory serves.

As for the support@ address not being on those error pages: decent feature
request. I imagine the reason they want to avoid that is that many of these
errors are delivered at the request of the site owner, or are related to the
site not working (404s, 503s, IP firewall blocks, etc.), so they do not want
to funnel people into Cloudflare support for issues that are not specific to
Cloudflare.

Determining which errors are the site owner's responsibility and which are
Cloudflare's can be quite tough.

~~~
lucb1e
It's not rocket science, they just don't _want_ to solve the problem.

"many of these errors are delivered at request of the site owner" For those,
put the site owner's contact method there. Even a physical mailing address,
fine by me, I'll send a letter (something a spammer would not do) if it's
important enough to me to do so.

"or related to the site not working (404s, 503s" those pages don't deliver a
Google CAPTCHA or don't say "You have been blocked". If they can determine
whether a page should have a captcha and/or that text, then that if statement
can also include showing contact info.

~~~
JakeTheAndroid
I think the truth is somewhere in the middle here. Yes, Cloudflare could do a
bit more to predict this, but I don't think it's as trivial as you make it.
The routing between you and a site through Cloudflare includes a lot of
complex interactions.

The captcha page, sure, maybe. I can't think off the top of my head what would
happen on that page that _wouldn't_ be related to Cloudflare/reCaptcha; I
conceded that it's a decent feature request. But plenty of actual interstitial
pages served by Cloudflare aren't necessarily caused by Cloudflare. The fact
that you get a captcha at all isn't Cloudflare's choice most of the time, it's
the site owner's. And having support@cloudflare.com on that page would 100%
cause people to write in saying they don't want to see captchas; Cloudflare
isn't the appropriate party to ask to stop seeing captchas for a specific
site. Now, SOMETIMES it's an automated incident because of your IP, so then
you DO want to reach out to Cloudflare.

Same with 500-series errors. Sometimes it's the website not responding, but
sometimes it's Cloudflare not interacting properly.

So yeah, I think the truth of the matter is in the middle here. In terms of
priorities, I have no doubt this is pretty low on their list. Why would it be
any higher when these pages serve the technical purpose they were created for?
The rest is QoL with minimal impact on customers, compared to the many other
things that go wrong with the network and have considerable impact on
customers and visitors.

------
KaoruAoiShiho
Cloudflare blocks an enormous number of bad actors for me, so I quite like it.

~~~
lucb1e
I'm curious about the specifics because through my and my colleagues'
simulation of "bad actors" (ethical hacking) I've never once gotten stopped by
CF.

I thought the point was anti-DDoS by just proxying your traffic through
someone with bigger pipes. That they do TLS offloading to filter n-days like
Heartbleed helps as well of course, but those are super rare events and it
sounds like what you mean is ongoing.

What kind of bad actors do you mean, and what kind of sites? Don't have to
mention the domain or anything, just that it makes a difference whether it's a
web shop (financial risk I guess), more like a forum (spammer / hater risk),
or something else.

~~~
judge2020
Generally, the DDoS protection even at layers 3 and 4 is a blessing. For a
layer 3/4 attack, all a malicious actor really needs is a bunch of average
internet connections anywhere; and depending on your server hardware, it might
be inexpensive to keep your site down, since those IPs are cheap. CF just
drops all packets from these "literally 99% of traffic is malware" IPs at the
network layer.

But a layer 7 DDoS attack, when going through Cloudflare, means the malicious
actor needs IP addresses that are at least not complete trash in terms of IP
reputation. Getting access to a botnet with those IP addresses isn't exactly
prohibitively expensive, but it's a much larger barrier to entry.

It's even harder to get taken down by a layer 7 DDoS attack on Cloudflare if
you use "I'm Under Attack" mode, assuming your attacker isn't paying even more
for the botnet to run something like Chromium or Node to hit your website.

Finally, while Cloudflare doesn't actively intervene for small-scale DDoS
attacks (since they might just be a spike of users), they do have Gatebot for
the larger-scale ones: [https://blog.cloudflare.com/meet-gatebot-a-bot-that-allows-u...](https://blog.cloudflare.com/meet-gatebot-a-bot-that-allows-us-to-sleep/).

------
edf13
There is a stage where "good" companies get so big that they can't really be
"good" (for all)...

Cloudflare passed that point a while back. They have so many policies,
departments, shareholders and government departments to please that it is now
impossible for them to be a truly good force for the open internet.

------
therealmarv
We need a website which tracks Cloudflare's captcha and blocking behaviour! I
live in Cyprus (the EU part!) and I've witnessed more and more sites now
needing these stupid captchas, and I actually installed "Privacy Pass", which
is also not working 100%. Why is this (EVIL!) company destroying my internet
experience? This is a relatively new and recent experience for me...
something has changed in Cloudflare's captcha requirements. Before that I had
only witnessed this behaviour in countries like e.g. the Philippines.

------
crypt1d
I suspect Cloudflare is doing some form of 'fingerprinting' to flag potential
attacks, probably based on things like IP, user agent, JS being enabled, etc.
In this case it seems that Cloudflare only banned a specific user agent with
js_enabled=no.

Obviously this is all just an educated guess, based on my experience building
scrapers for Cloudflare-protected websites.
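Conceptually, that kind of flagging could be as simple as summing weighted signals against a threshold. A toy sketch (the signals and weights here are invented, not Cloudflare's actual model):

```python
# Toy fingerprint-based risk score: each weak signal carries a weight, and
# crossing a threshold triggers a challenge. Signals/weights are invented.
WEIGHTS = {"unknown_ua": 2, "js_disabled": 2, "bad_ip_reputation": 3}

def risk_score(signals):
    """Sum the weights of all recognized signals present for a request."""
    return sum(WEIGHTS[s] for s in signals if s in WEIGHTS)

def should_challenge(signals, threshold=4):
    """Serve a captcha/block page once the combined score crosses the bar."""
    return risk_score(signals) >= threshold
```

Under a scheme like this, an unusual UA alone passes, but an unusual UA *plus* JS disabled tips the scale, which matches the behaviour observed in the article.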

~~~
annoyingnoob
Came to say the same thing.... Try changing the user-agent the browser sends
to match the user agent that Chrome sends.

------
open-source-ux
Also related: depending on where in the world you reside, you may be more
likely to encounter Cloudflare blocking sites or simply slowing the retrieval
of websites (as it scrutinises traffic to those sites). This may be more
likely to occur to you if you access the web outside of North America and
Western Europe.

 _CloudFlare is ruining the internet for me (2016)_

[https://www.slashgeek.net/2016/05/17/cloudflare-is-ruining-t...](https://www.slashgeek.net/2016/05/17/cloudflare-is-ruining-the-internet-for-me/)

And the ensuing Hacker News discussion a few months ago:

[https://news.ycombinator.com/item?id=21169798](https://news.ycombinator.com/item?id=21169798)

------
phillipseamore
To be fair, site owners are in control of those settings.

~~~
falcolas
What are the defaults of those settings? What percentage of site owners change
the defaults?

It's kind of like saying "it's your own fault you didn't de-select the "track
everything I do" checkbox on our privacy page".

------
Animats
Somewhat to my surprise, my SiteTruth system is allowed to read
"bakadesuyo.com", the site with which the article author had a problem.
SiteTruth uses a user agent like "Sitetruth rating system", and makes no
attempt to pretend to be a browser. Cloudflare let it through. No captcha. No
error messages.

Can't conclude much from this; SiteTruth has been reading sites openly for
years, in a well-defined way, from a well-known IP address, examining sites
for ownership info about once a month. Although it looks at millions of sites,
it never hits any one site very often. From Cloudflare's perspective, that's
harmless.

------
stopads
I also refuse to complete captchas because I'm not interested in giving a
trillion dollar company my free labor to help train their models.

I'm really close to just defaulting JavaScript off entirely, the web is
becoming so much worse by the day.

~~~
pferde
I've had js turned off by default for many years now, and don't feel like I'm
missing out on anything important. Your mileage, of course, may vary.

~~~
dijit
You're a stronger person than me.

I keep trying to live without JS but so little of the internet works.

* Gitlab/Github (obviously)

* Google maps (obviously)

* Linkedin... (uh... less obviously)

* Outlook

* Infoq.com

* Rust docs

* Google Cloud Docs

etc.

A lot of the internet is butchered without JS.

I really want a way of just blocking third-party JS (i.e. the site can deliver
JS, but not anything it tries to import unless whitelisted). But that seems to
be hard with qutebrowser.

FWIW, uMatrix apparently has a method of doing this.

~~~
pferde
Perhaps I did not express it clearly, but I do not browse WITHOUT ANY
JAVASCRIPT EVER. I browse with JS disabled by default, but enable it (using
the excellent uMatrix Firefox addon) for certain sites that I trust. Although
even then, I try to find the smallest set of uMatrix permissions that enables
the functionality I need from a site.

That said, I do not use many of the obvious mainstream sites - e.g. I ditched
Github like dirty socks the moment Microsoft grabbed them.

But yes, the modern web (not the Internet, mind you) is very damaged, and I
fear it will take decades to fix the damage, once (I hope) smarter people take
the reins after high-visibility security and privacy incidents become more and
more frequent, and, well, more visible to the general public.
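For anyone curious, the default-deny-plus-allowlist approach pferde describes is expressed in uMatrix as one rule per line, of the form `source destination type action`. A minimal ruleset might look roughly like this (sketched from memory, so treat the exact lines as approximate):

```
* * * block
* * css allow
* * image allow
* 1st-party script allow
```

The last line is the "first-party JS only" behaviour dijit was asking for; individual third-party exceptions then get their own rules, e.g. a hypothetical `example.com cdn.example.net script allow`.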

------
phantom_oracle
CF appears to be looking at the user agent and deciding to block based on
that.

I'm guessing it's very basic checking, because deep browser fingerprinting is
supposed to be against the law in some countries (I stand to be corrected on
this).

I'm not personally a fan of CF because of the amount of data they can
potentially obtain (or do obtain), but there's a lot of crap out there, and
their firewall is robust enough to protect John's Cowboy Store from
contributing to some dude's Monero mining botnet.

------
hn_lurker1
I think the site went down. Here's the archive:
[http://archive.is/dVfYE](http://archive.is/dVfYE)

~~~
dijit
Ironically, Cloudflare would be quite useful for me right now. :)

------
big_chungus
ReCaptcha is a necessary evil, but there's something that needs to change: the
internet-wide rate limit. If I try to scrape a site and start getting
captchas, when I go back and try to browse a completely different site, I'm
suddenly trying to drag the points to outline a plane for ten minutes. Keep
track of reputation on a site-specific basis.
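Site-scoped reputation would just mean keying the strike counter by (client, site) instead of by client alone. A toy sketch (hypothetical, not how ReCaptcha actually works):

```python
from collections import defaultdict

# Site-scoped reputation: misbehaviour on one site raises the challenge
# level there only, instead of contributing to one internet-wide score.
class SiteReputation:
    def __init__(self, captcha_threshold=3):
        self.strikes = defaultdict(int)   # (client, site) -> strike count
        self.captcha_threshold = captcha_threshold

    def record_abuse(self, client, site):
        self.strikes[(client, site)] += 1

    def needs_captcha(self, client, site):
        return self.strikes[(client, site)] >= self.captcha_threshold
```

With this scheme, hammering one site earns you captchas there, but a fresh site still sees you with a clean slate.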

------
sct202
Random web hosts also have sloppy anti-bot 'AIs'. Siteground's is the most
annoying, because for some reason it thinks my work's network is a bot, and it
doesn't identify itself unless you reverse-image-search its little anti-bot
picture.

------
risyachka
So basically the point of the article is this: there are no perfect tools. Not
sure why this article got so much traction.

It's like complaining that airport security checks your bags when you have
gun-shaped objects inside it.

~~~
dependenttypes
More like asking why you are being kicked out of the airport when you have a
bob hairstyle.

------
OrgNet
Google's ReCaptcha is a much bigger problem than Cloudflare's service for me
currently, but that could change at any time.

------
xacky
This is the death of alternative browsers it's only a matter of time before
Firefox and friends gets blocked from major websites for "security" reasons
and you will only be able to browse the internet in a browser that enforces
full tracking and ads only. The MPAA and similar organizations is probably
working on this already.

------
asdf21
Is he using a headless browser? Interesting that he won't give specifics..

Edit - Whoops, read too fast.

~~~
dijit
I link the exact browser I use.

It's called qutebrowser, it's powered by QtWebEngine, which is chromium
underneath.

Basically Chromium wrapped in python with vim-keybindings.

~~~
RyanK24
We're currently investigating. Should have more info soon.

------
ycombonator
Cloudflare is the new big brother.

------
m463
The root cause is hammering the site(s) with lots of traffic in a short period
of time.

Reloading the tabs looks like an attack.

One solution might be to close tabs when you're done with them. (Although I've
been surprised to find that some people routinely keep many, many, many tabs
open virtually forever.)

------
core-questions
It's okay, don't worry. Cloudflare would never use this incredible power for
political purposes...

...oh wait, yes, they have and they will again. We've given them the power to
censor and with the power increasingly comes the onus to do so, to satisfy the
political whims of an elite that doesn't like anything counter to their
narrative.

~~~
bluetidepro
> ...yes, they have and they will again.

I don't doubt this, but do you have more info on what you're referring to on
"they have"? I'm a bit out of the loop on that.

~~~
ecnahc515
There have been a few occurrences of them terminating service for sites
promoting extremist views, e.g.: [https://blog.cloudflare.com/terminating-service-for-8chan/](https://blog.cloudflare.com/terminating-service-for-8chan/)

I don't entirely disagree with it, but it's censorship nonetheless.

~~~
mschuster91
It's not censorship. The Internet is a vast space, someone will host your
stuff.

~~~
kube-system
Universality is not a defining characteristic of censorship. Most censors have
very limited scope.

