Given that this uses multiple Cloudflare services, is anyone very familiar with their pricing? The readme lists a "generous free tier", but I always have to wonder what happens once you fall outside of that tier. I remember that R2 is cheap (one of its distinguishing features), but wonder if any of the other services suddenly become $50 a month or something.
Cloudflare's free tiers have hard limits: services stop working once you exceed the free usage (at least for compute; storage is different).
Pages is essentially totally free. R2 is within the same pricing magnitude as S3. D1 is a very new product; afaik it's still in beta and doesn't have production pricing yet, so be weary there. Zero Trust is priced less by usage and more per-user, I think $7/mo/user.
> be weary
I've seen this mistake three or four times recently, and it's starting to make me wonder why I keep seeing it so often now, when I didn't before. "Wary" (like "beware") means "cautious"; "weary" (like to "wear" someone out) means "tired."
Putting a quote and a response on one line makes the whole thing render as a quote. Hope you find this contribution to the conversation helpful as well, as will all other readers.
R2 storage might be within the same order of magnitude as S3, but once you start serving content to users, it becomes many orders of magnitude cheaper.
They definitely have sensible fallback mechanisms for their free tiers (e.g. with Pages, they still serve static assets but disable compute/Functions).
Also, their docs are written pretty well, and openly hint at dangerous price traps.
We launched a big presale this year with an e-commerce app hosted on Workers, and we just barely hit the $5 threshold for the paid plan. We didn't use other services (besides DNS/proxy) though, so YMMV.
I've been looking for affordable hosting solution to serve multimedia files, e.g., podcast audio, which is typically the biggest part of cost for hosting.
Cloudflare R2 is an attractive solution. R2 has no egress fees, and the free tier includes 10 million read requests/month [1]. Is 10 million a big number? Yes, at least in the podcast industry. If your podcast has 10 million listens per month (or, to be conservative, "only" 1 million listens/month), you could already make big money [2] while still within R2's free tier! In other words, by the time you have to pay for R2, you'll already be a very successful podcaster :)
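To make the arithmetic concrete, here's a back-of-envelope sketch. The prices and free-tier numbers are my own reading of Cloudflare's published R2 pricing at the time of writing (storage, Class B reads, zero egress), not something from the readme, so double-check the current pricing page before relying on them:

```javascript
// Rough R2 cost estimator for a podcast feed. All rates below are
// assumptions based on Cloudflare's published pricing -- verify them.
const R2 = {
  freeStorageGB: 10,        storagePerGB: 0.015,     // $/GB-month beyond free
  freeReads: 10_000_000,    readsPerMillion: 0.36,   // Class B ops (e.g. GETs)
  egressPerGB: 0,           // R2's headline feature: no egress fee
};

function r2MonthlyCost(storageGB, reads, egressGB) {
  const storage = Math.max(0, storageGB - R2.freeStorageGB) * R2.storagePerGB;
  const readOps = (Math.max(0, reads - R2.freeReads) / 1e6) * R2.readsPerMillion;
  const egress  = egressGB * R2.egressPerGB; // always 0
  return storage + readOps + egress;
}

// 1M listens/month of ~50 MB episodes, ~5 GB of audio stored:
console.log(r2MonthlyCost(5, 1_000_000, 50_000)); // -> 0 (all within free tier)
```

The striking part is the third argument: ~50 TB of egress that would cost thousands of dollars a month on a per-GB-egress provider contributes exactly $0 here.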
For the other Cloudflare products that we use in microfeed:
* Zero Trust (provides logins to the admin page) [3]: 50 users for free. How many admins do you need? Is 50 enough? :)
* Pages [4]: 100,000 functions requests/month for free. This should be enough for personal/small-business type websites.
* D1 [5]: No pricing info yet. But they'll likely charge by # of reqs/month.
* CDN / Firewall / Cache / DNS etc.: there are other Cloudflare products you can use for free.
Probably the biggest price you'll pay when using microfeed is your time :)
There are (almost) one-click SaaS solutions that save you time, but you'll pay with money.
There are multiple-click self-managed solutions that save you money, but you'll pay with time.
Forestry has been on my radar for a long time but never had a need to use it https://forestry.io/
The big draw for me is it's just Hugo/Gatsby/Jekyll underneath, and the output files can be delivered anywhere that will host static files (CloudFlare pages does this really well, as does Netlify).
This looks fantastic. I just started a podcast and was planning to test publishing with cloudflare since it seems much cheaper than the alternatives (and planning to backup to storj).
With a traditional CMS you do the setup and plumbing on the server, because you have full control.
With static website hosting you prebuild the website and push it up, all compiled and done, in a standard supported format, so it can easily move between providers.
There's not much in between, because you can't (easily) write code that is happy to run in two different systems controlled by other people with different plumbing under the hood. It's certainly doable, but the more you generalize, the more complicated your code gets, and the less you can benefit from vendor-specific features (e.g. removing features that one side doesn't support, or adding code to use them when available, which makes the code even more complicated).
Edit: and on the point of code complexity: if you did all this yourself and got to an alpha release, you'd likely end up building a company/product like Forestry (mentioned elsewhere in this discussion), because you'd have made something some people would consider somewhat or very valuable.
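The generalization cost described above can be sketched in a few lines. Both "providers" here are hypothetical in-memory stand-ins, not real SDKs; the point is only that every vendor-specific feature forces either a lowest-common-denominator API or a capability check:

```javascript
// Two hypothetical storage providers with the same core API but one
// vendor-specific extra (presigned URLs). Names are made up for illustration.
class S3Like {
  constructor() { this.objects = new Map(); }
  put(key, data) { this.objects.set(key, data); }
  get(key) { return this.objects.get(key); }
  presign(key) { return `https://s3.example/${key}?sig=stub`; } // vendor extra
}

class R2Like {
  constructor() { this.objects = new Map(); }
  put(key, data) { this.objects.set(key, data); }
  get(key) { return this.objects.get(key); }
  // no presign() here -- generic callers must feature-detect
}

// The "use it if it's there" branch the comment mentions: the generic
// code grows a conditional for every such vendor difference.
function shareUrl(store, key) {
  return typeof store.presign === "function"
    ? store.presign(key)       // use the vendor feature when available
    : `/files/${key}`;         // fall back to a path our own code serves
}
```

One feature, one branch; multiply by every feature and every provider and you get the complexity curve the parent describes.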
I use middleman[^1] + BulmaCSS + FontAwesome, but host on GitHub using the `github.io` domain and upload podcasts to archive.org[^2]. I chose this setup to make sure the content will survive as long as possible.
P.S. You might want to consider a donation to the Internet Archive, see [^2] - no affiliation, just a happy user.
Nice setup! I've been following Vercel and Cloudflare trying to sync up their V8 isolates runtimes: CF is going to be a great option for running these "edge-style" React apps once things stabilize.
I just use PicoCMS and upload my markdown files through FTP. I don't think there's anything out there that can compete with the simplicity of this solution, not even static site generators.
It uses templates with Twig, and it's very simple PHP so you can host it anywhere.
I once hit the front page of a big aggregator from my country, and a very cheap shared host handled it like it was nothing.
From my Linux Mint laptop, all it takes is to open my FTP on Thunar (which involves a click in my sidebar), and I'm navigating it like any other file system. Right click, create file, write & format my post, and save the file.
Voilà, it's published.
Oh, and you can mix Markdown with HTML and JS in your posts. Less stuff you need to learn.
There are different definitions of simple. I found WordPress simple because I could access the UI to post from any device. I could upload images, save drafts, and preview posts. My static site basically requires a desktop/laptop PC, and to preview I have to build the site. That's far less simple to use, even if it's simpler to run.
I would argue that using something like Nuxt Content[0] is even simpler. I create a new markdown file in my website's local repo, write the content, and commit when it's ready for publishing. No need for the FTP step, and version control is built in.
This setup is also completely free since the content lives on GitHub and my static site on render.com (but any static site hosting will work).
And since it’s Nuxt based, it automatically also supports more advanced features such as tagging, advanced queries and filtering.
Looks like the free tier for render.com static sites is 100 GB egress per month. While this is sufficient for a low-traffic site, I'd probably want to protect myself by putting Cloudflare (or some other free CDN) in front (yes, I realize render.com has its own CDN network in front, but it looks like that traffic still counts against the 100 GB free-tier limit).
There are tons of very cheap hosting providers who are stable and have been around a long time. You could look at buyshared.net which is the webhosting arm of buyvm.net.
What is a good solution for hosting a domain anonymously? I want to start blogging but don't want to do a personal one just yet. I mean anonymous in a lightweight, general sense, not political-activist super-security.
WHOIS[1] masking hides your information from the world. It's a service provided by some domain sellers who, obviously, will know those details (given to them by you), as well as your financial information, since you usually can't lease domains using untraceable transactions.
"Self-hosted on Cloudflare" feels like an oxymoron. I know it's common for people to consider hosting on a VPS and the like to be self-hosting, but being attached to proprietary services seems a step further; I think portability between providers has to be a minimum bar for the term.
I like the idea of having web content you can put up with minimal configuration and cost but we have to draw a line for that definition somewhere.
Most people consider self-hosting to be "running service X yourself", which I think is fair, considering most people treat self-hosting as an alternative to SaaS offerings.
No, Cloudflare is a company that offers many products. And the services utilized by the project (Pages, D1, R2) would most likely be classified as PaaS by most.
I think the novel hack here is taking something typically hosted on IaaS or lower (whether that's a VPS, colo, under your desk at home, etc.) and instead running it on PaaS.
No. Self-hosting refers to running a service where you completely own the source code. You can run this on a computer in your room, or a rented VPS, or a serverless hosting option like Cloudflare Pages.
For example, if you run a Nextcloud instance and upload pictures there, then you are self-hosting your photos.
Self-hosting means the ability to host and run the code on your own server (home or webhost). Since microfeed is built on the cloud services offered by Cloudflare, you technically can't run it independently on your own server; you will still have to use the Cloudflare services:
microfeed uses Cloudflare Pages to host and run the code, R2 to host and serve media files, D1 to store metadata, and Zero Trust to provide logins to the admin dashboard.
@jacooper is correct that the term has been wrongly used by the creator.
So what do you want to call it then? "Running on a lower level general compute abstraction with provided networking infrastructure" instead of self-hosted?
This isn't directly related to the post, but it's a CMS-related question: does anybody know of a simple "static file" CMS that also lets you deploy server-side JavaScript functions? It's a weird use case, so I suspect there may not be one. I've solved it by using a static site generator and an nginx reverse proxy to a node server, but it's a lot of complexity for my use case (using Stripe to sell stuff; Stripe Checkout requires server-side session creation).
Edit: Netlify Edge Functions seem to be pretty much exactly what I'm describing (drop js/ts files in a directory, deploy with your static site). The only problem for me is that it's not self-hosted. Serverless functions would work, though the separate deployment adds a layer of complexity, and I'd have to finally learn CORS lol
Cloudflare Pages works well as a serverless option for this, and without the separate deployment for functions, so no CORS required. Couldn't recommend more.
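For the Stripe use case, a minimal sketch of what that could look like as a Pages Function, assuming the `onRequestPost` entry-point convention and Stripe's Checkout Sessions REST endpoint; the price ID, URLs, and file path are placeholders, and `STRIPE_SECRET_KEY` would be a Pages environment binding:

```javascript
// Hypothetical functions/api/checkout.js in a Pages project. In a real
// Pages Function, onRequestPost would be exported.

// Pure helper: build the form-encoded body Stripe's REST API expects.
function checkoutBody(priceId, successUrl, cancelUrl) {
  const params = new URLSearchParams();
  params.set("mode", "payment");
  params.set("line_items[0][price]", priceId);
  params.set("line_items[0][quantity]", "1");
  params.set("success_url", successUrl);
  params.set("cancel_url", cancelUrl);
  return params;
}

async function onRequestPost({ env }) {
  // Create the Checkout session server-side, as Stripe requires.
  const res = await fetch("https://api.stripe.com/v1/checkout/sessions", {
    method: "POST",
    headers: { Authorization: `Bearer ${env.STRIPE_SECRET_KEY}` },
    body: checkoutBody(
      "price_XXX",                      // placeholder price ID
      "https://example.com/thanks",
      "https://example.com/cancel"
    ),
  });
  const session = await res.json();
  // Redirect the browser to the Stripe-hosted checkout page.
  // Same-origin endpoint next to the static site, so no CORS setup needed.
  return Response.redirect(session.url, 303);
}
```

Since the function deploys alongside the static assets from the same repo, there's no separate deployment step.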
Maybe don't focus so much on the static-site part, and have a very slightly dynamic site where the majority of your content is just paths that lead to stuff in /static. Rather than trying to bolt a dynamic bit onto static hosting, make the simplest possible site with something like a server.js.
Not exactly what you asked for, but if you use S3 + CloudFront, you can use javascript functions in CloudFront that can modify the request to/from the client & to/from the S3 bucket. Does things like URL redirects, header manipulation, auth... I use it to redirect ~/path/to/item/ to ~/path/to/item/index.html for my static site.
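The index.html redirect mentioned above is the classic viewer-request pattern; a sketch of what such a CloudFront Function could look like (written against ES5-era string methods, since CloudFront Functions' older runtime doesn't support everything modern JS has; not copied from any particular project):

```javascript
// CloudFront Function (viewer-request) sketch: rewrite directory-style
// URLs so S3 serves /path/to/item/index.html for /path/to/item/.
function handler(event) {
  var request = event.request;
  var uri = request.uri;
  if (uri.charAt(uri.length - 1) === "/") {
    // /path/to/item/  ->  /path/to/item/index.html
    request.uri = uri + "index.html";
  } else if (uri.indexOf(".") === -1) {
    // /path/to/item (no file extension)  ->  /path/to/item/index.html
    request.uri = uri + "/index.html";
  }
  return request;
}
```

Requests for actual files (anything with an extension, like /app.css) pass through untouched.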
Please don't use this unless you wish to exclude those who would like to enjoy your content but also respect their privacy. Cloudflare continues to block Tor traffic by default. It is discriminatory against innocent users and harmful to freedom of speech (because speech also requires the ability to listen).
There's also a centralization issue too. Remember when Cloudflare went down a few months ago and like 30% of the internet went dark? Letting one company control this much of the internet is bad.
Are they blocking you outright or are you required to do captchas?
In a 2016 post, Cloudflare states that they don't block Tor traffic and don't let their customers block it either, but that they will put restrictions such as captchas in place [1].
Sometime last year I discussed this in more detail. I use a text-only browser with Tor. Most websites (still >80%, maybe) load well, but the overwhelming majority of those that do not are served via Cloudflare. To me, their 2016 post claiming not to "block Tor traffic" is disingenuous, since they are exceedingly hostile to it. I only ever experience Cloudflare as an obstacle and nuisance online.

I also deeply dislike their attitude and PR stance, which is essentially victim blaming and disrespectful of those who make different technological choices. Their message seems to be:

"We make an effort to sound sorry to those who are harmed by our business model. But you are a minority, and we make a lot of money. If you want to make an omelet, you gotta break some eggs. Now get out of the road."
The protection Cloudflare provides is antithetical to privacy because they are in the business of detecting malicious users. To do so, they need to distinguish users (including and especially users taking active measures to fuzz their identity, since there is heavy correlation between such use and malice... i.e. there are a lot more people fuzzing their identity because they want to do something bad than there are fuzzing their identity because they're simply privacy-conscious).
This is a subset of a larger push-pull on the privacy needs of users vs. the integrity needs of service providers; the modern threat model is a lot more complicated than it was in the era where you could deal with an attack by black-holing an IP range. For example, Google's login flow requires (required? this may have changed) JavaScript because there are attacks possible in non-JS HTML that Google cannot protect against without using JS to do DOM inspection. Does enabling JS also allow for various privacy risks? Yes. But it increases user security.
Fascinating. Thanks for your thoughts. You've stated the problem, comprehensively touching on all the well-known talking points and value balances, even sequencing them as if to imply causality. That's not bad. But it doesn't advance the argument to restate the very circumstances and reasoning to which I object.

Maybe your contribution will bring clarity to others. More so if I also add that this is precisely the moral arithmetic, and the conclusion that technical necessity excuses harms, that I find unacceptable.
It's harm tradeoff, not harm excuse. It does harm user privacy.
But it's also harmful to the Internet at large (as in "all the users of the Internet") if service operators can't keep a service online because it's swamped by malicious users (or, arguably worse, it is online but the nature of its use is so badly understood by its operators that it's serving as a springboard for larger, more coherent attacks).
Services like Cloudflare allow operators to outsource the knowledge of how to mitigate those issues. This increases the total services that can be provided online by lowering the knowledge floor via specialization, which makes the Internet "bigger" (in terms of more things you can do with / on it).
> It's harm tradeoff, not harm excuse. It does harm user privacy.

I am looking from the viewpoint of someone whose privacy and opportunity are harmed, so of course I have my biases. :)

> But it's also harmful to the Internet at large

A good argument to try, but I'm not sure this "nebulous" harm, as J.S. Mill might say, really works, for many reasons. "The Internet" hasn't been a coherent, level entity for some time now. No doubt you've heard the term "splinternet", something to which I actually think problems like Cloudflare contribute. And there's an implication that a "service provider" somehow outweighs a single user, which seems nonsense, since many "services" are one-man shows with a handful of users, while some individual users are of great prominence, power and value. Besides, the Internet in its "virgin" (most unharmed) form might be said to be purely peer-to-peer. The nebulous harms you propose really apply to a certain "kind" of internet, supporting certain kinds of interests.

> Services like Cloudflare allow operators to outsource the knowledge of how to mitigate those issues.

They are outsourcing action, not just knowledge. Like a private police force, Cloudflare are actively (and literally) intervening in third-party business and taking punitive actions against individuals based entirely on their judge-jury-and-executioner logic. That is a lot less innocent than you make it sound. The users are outsourcing their judgement while swerving their responsibilities as netizens.

> This increases the total services that can be provided online by lowering the knowledge floor via specialisation, which makes the Internet "bigger" (in terms of more things you can do with / on it).

As we've discussed in these pages many times, and under many topics and titles, growth is not an unqualified good. Scale is not unquestionably desirable. Quality is rarely commensurate with either. So I am not swayed by the argument that having some of the network avoidably broken is justified by extending its size.
I see your concerns. But when the system was built, at the protocol level, to be heavily trust-assuming, yet many individual users are untrustworthy, and you can't distinguish them without collecting information that could be considered privacy-violating, what is the solution?
I, for one, have a blog that I don't use Cloudflare for. There's a risk that my site gets hugged to death and I won't know until my service provider either notifies me or cuts me off. And from a certain point of view, I might be considered a negligent actor, because I'm not collecting enough information to know if somebody has breached my blog engine and turned it into part of the Low Orbit Ion Cannon. But I've chosen to value user privacy.
Point is, trade-offs. I don't think I'm in some kind of moral right space for my decisions, I've made them based on the kind of reader I expect to get.
Solving some company's CAPTCHA Sudokus, with no compensation for your time or for training their model, is not freedom. I get this BS all the time because my IP isn't located in the West.