On the web today, we don't use servers, we use services (misc.l3m.in)
49 points by sodimel 41 days ago | 41 comments

>On the web today, [...] In order to submit an edit for this page, I hosted all my code on microsoft's servers.[...]

I assume you're talking about people using Github instead of hosting their own server?

But before Github, people also avoided hosting code on their own servers or paying $10/month for a datacenter VPS -- by using previous services such as SourceForge or Codeproject.com.

If a bunch of people need to do activities <X> and <Y> but <Y> has higher priority in life, then some other service conveniently doing <X> will emerge. That's what Github is... it's a natural emergent phenomenon arising from people not wanting to mess around with running self-hosted Gitlab on a laptop, or Raspberry Pi, or on a $10/month Digital Ocean droplet, etc.

Agreed; perhaps the reason so many people use this service is that it solves a problem.

But another approach (my point in the rant) is the "I don't want to know how things work, I just want them to work" approach, which I think more and more people are taking these days (we can relate this to car mechanics, where everything is getting more complex). Because things are getting more complicated under the hood but seem "easier" to understand (because we're adding piles of abstractions on top of piles of abstractions), fewer people take the time to create things with the "old" technologies and really learn how things work.

(ps: this exact txt file is self-hosted on a Dell OptiPlex FX160 with 4GB RAM and an old processor, bought for $20 on eBay in 2017)

I know exactly how to self-host - I ran Trac for years - which is why I keep all of my projects on GitHub now.

I know full well how much work it is to keep something like that maintained and secure and properly backed up.

I also know how many things can go wrong - including nasty things like missing billing emails when a credit card expires and losing the associated instance.

"GitHub, by default, writes five replicas of each repository across our three data centers to protect against failures at the server, rack, network, and data center levels" - I can't compete with that! https://github.blog/2021-03-16-improving-large-monorepo-perf...

I don't really know what to think for big, critical applications because I've never had to manage one myself. I can agree that looks very difficult and time-consuming, and that indeed companies devoted to the sole task of maintaining all the stack up and running are amazing.

However, for small-scale projects, personal blogs, or websites, I really like the idea of self-hosting. Knowing that all the data is in my home is reassuring (apart from the fact that my house could catch fire).

(I realized that you're one of the co-creators of Django; thank you very much for this masterpiece)

> apart from the fact that my house could catch fire

Your house feels like a fortress, until something happens to you or someone you know. Not just a natural disaster, like a hurricane, tornado, flood, or earthquake, but small disasters too: a house fire, as you mentioned, a burglary, some crazy power company glitch, or something as mundane as a child with a cup of water in exactly the wrong place. Even if you are meticulous enough to drive backups off-site, that somewhere is probably close enough that a single disaster could take out both your house and your backup site. Encrypt your data and save it somewhere geographically far away, if it's really important to you. It's doable manually, but for everybody else, imo, an online service (eg Dropbox or Google Drive) is easier, which means it's more likely to happen and not fall off the bottom of the todo list.
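The "encrypt, then store far away" step can be a couple of commands. A minimal sketch, assuming GnuPG is installed; the paths and filenames here are made up for illustration:

```shell
# Bundle the important files, then encrypt the archive symmetrically so
# whatever remote service stores it only ever sees ciphertext.
tar czf backup.tar.gz ~/important-docs
gpg --symmetric --cipher-algo AES256 backup.tar.gz   # prompts for a passphrase

# backup.tar.gz.gpg is now safe to upload to Dropbox, S3, a friend's
# NAS, anywhere. Keep the passphrase somewhere that isn't your house.
```

The trade-off versus a sync client is exactly the one above: this works anywhere, but nothing reminds you to run it.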

I have thought about this some.

I do prefer having local copies of everything, because I don't trust The Cloud. Malicious actors can get your account shut down, and then all your stuff is gone, locked up and the key thrown away.

My solution has been to run a RAID 1 NAS device for all my local stuff (including all my media I want to stream in the house - nice having movies and TV when internet is down). 4TB drives, rated for NAS usage.

Important documents, and sentimental photos (many GB's worth from the last few decades) I periodically back up onto an external SSD and put in my fire safe with physical documents of importance.

FYI, fire safes aren’t necessarily waterproof. They’ll protect your documents from the fire but not the firefighters’ water, or the drop through the burned out floor to the basement. Mine, for example, requires an additional treatment to be waterproof. Something to keep in mind if you haven’t already.

(I keep my backups in three places: local drives, safe deposit box rotated quarterly, and BackBlaze. I’ve only ever needed the first, but I feel safer knowing I have the other two.)

Crap. I guess I had not thought about water proofing.

Probably important, as I live next to a river.

I also no longer self-host a number of services myself, but thanks to "the Cloud", it's fairly trivial to "write five replicas of each repository across our three data centers to protect against failures at the server, rack, network, and data center levels". Uploading data to AWS S3 or other cloud storage (with some additional, but fairly trivial, scripting on top) gets you that same level of redundancy - or better. Not wanting to be the person on call when my self-hosted server's hard drive dies in the middle of my vacation is, among other reasons, why I use eg, GitHub.

There's an informed decision to be made when graduating from a computer in your basement, to a colo'd server, to using a VPS (or dedicated server) in the Cloud, to using a consumer-facing online service. But if you're willing to have your data in the cloud, replication that used to take setting up multiple colo'd machines across the world (which I couldn't compete with, either) is fairly trivial these days. (An HA failover setup is left as an exercise for the reader.)

> Because things are getting more complicated under the hood but seem "easier" to understand (because we're adding piles of abstractions on top of piles of abstractions), fewer people take the time to create things with the "old" technologies and really learn how things work.

That isn't necessarily true.

I suspect the same number of people are taking time to really learn how things work, but the abstractions are allowing more people to get involved in writing and contributing to open-source software, developing services, etc., so the apparent fraction of people doing stuff on their own is lower. But if it weren't for the abstractions, it's not that we'd have more people doing things under the hood, it's that we'd have people not doing them at all.

I'm one of the folks who insists on knowing how things work under the hood, and it's an enormously valuable (and fun) professional skill, but it also frankly limits what I can do. I won't write projects in React because I don't really understand what React is doing and I don't feel comfortable with that, so my dashboards are cronjobs that template out some HTML and upload it to static hosting. (Which is fine, my job isn't making dashboards, my job is debugging prod systems.) Meanwhile my coworkers are throwing together incredibly impressive UIs very quickly. If it weren't for React, neither of us would be writing these UIs.

I remember open-source development in the pre-GitHub days. There was, frankly, a whole lot less of it.

That's why we need open-source software like GitLab that you can self-host.

For blogs there's Wordpress. For stores there's Magento. For files there's OwnCloud. Sadly, for communities and collaboration (Web 2.0) there isn't that much. Matrix, Mastodon? They are nowhere near the functionality of Facebook, Telegram, LinkedIn et al.

I spent 10 years reinvesting all our profits into building an open-source competitor to those centralized services, and it still needs some work to rival them: https://qbix.com/blog/2021/01/15/open-source-communities/

But I'll tell you something. Having your OWN servers and data is very attractive. That's why everyone moved away from America Online "keyword NYTimes" and towards hosting a web site on nytimes.com using the open HTTP protocol. Today, Facebook's "NYTimes" page is analogous to AOL Keyword NYTimes, Mark Z is analogous to Steve Case, and notifications are analogous to "You've Got Mail!" Nothing new under the sun... now we just need something like the Web to come along and disrupt them by letting everyone self-host their own stuff on a service that's almost as good. The Web Browser in the beginning only had Bold, Italic, etc. but publishers switched in droves, and users followed.

> Having your OWN servers and data is very attractive.

This is true for businesses that can justify the cost of self hosting for a variety of business reasons.

This isn't true for most end-users, who just want reliable services. Hence, Wordpress.com vs. self-hosted Wordpress.

I don't want to be personally responsible for my data; I want a professional organization to make sure it is safe.

I think I'm getting downvotes because people don't want this to be true. However, I'd like to hear an argument that makes my statements above not true.

For example, my parents are never going to self host their own photos unless they're confident that it is safe and reliable, and takes almost no effort on their part. They don't want that responsibility, and frankly, there's no need.

OSS can be built around businesses that take responsibility and host for others. Part of why so many of these platforms fail to gain mainstream support is that they're not built to support a solid business case, and instead somehow expect everyday people to learn far more than is realistic, just to replace a service that is free and easy.

That doesn't have to be true. Wordpress, as mentioned above, is a very good example, as it provides both options. (Self hosting, vs. paying someone else to take that responsibility.)

We have an initiative in France where local non-profit orgs set up and maintain such OSS services (for free or for a small fee). It's called "chatons"[0] (kittens), and it's great :)

[0]: https://www.chatons.org/

That's not self-hosting. That's getting services from a third-party organization that happens to be a non-profit. The only difference between that and GitHub is that GitHub is a for-profit company, which sets up and maintains services for you (for free or a small fee).

So if the actual complaint is "stop supporting for-profit companies," sure (but then we have to ask why - there are reasons to expect that a for-profit company is likely to be more stable long-term, more likely to be secure, etc.). If the complaint is "GitHub / Microsoft in particular is bad, and chatons in particular is good," sure (but then, what about for-profit competitors like GitLab?).

But that's unrelated to the original complaint. As you say, chatons is maintaining OSS services for you to interact with.

I am pretty sure you get downvotes for the reason you stated. On HN it is a signal that people disagree with what you're saying (though I strongly feel it shouldn't be, even if pg said it's a valid reason).

Anyway, I am for actually discussing substance so here it is:

With Facebook, you only have one possible landlord - a monopoly. With Wordpress, you have a choice of many landlords who compete on price, location, and so forth, knowing that you can take your business elsewhere.

In short, it is like going from digital feudalism to capitalism. See this for much more info:


That is why I started Qbix 10 years ago. But there is something even better than capitalism and privately owned hosting companies: autonomous, self-balancing networks where dumb pipes carry encrypted messages. Blockchains are just the beginning. That’s why I started Intercoin.org but some projects like MaidSAFE and Matrix.org are ahead of us in many respects (for storage).

If you aren't personally responsible for your data, does it really exist? Is it really yours? What if your professional organization says "we've installed some big data analytics", or "whoops we lost all your data"?

When I give a dress shirt to a professional dry cleaner, I'm not personally responsible for it while they're cleaning it. It's still my shirt. I'd consider using a different dry cleaner if they started keeping notes on what they thought I did to make my shirt dirty. If they lost the shirt, I'd be angry and might even get some compensation, but the shirt is still gone and someone else probably has it.

I completely understand that the scale and scope of abuse in tech can be much higher than with other services due to automation, but I don't see how it's reasonable to question whether my stuff is still my stuff because I've entrusted it to a third party.

The laws around data are completely different, at least in the USA. In general, if someone else has "your" data, it's actually their data.


> If you aren't personally responsible for your data, does it really exist?


> Is it really yours?

Yes, though obviously it depends on the service, and my agreement with them. Most of my photos, for example, are backed up in OneDrive, which works quite well for my purposes.

> What if your professional organization says "we've installed some big data analytics", or "whoops we lost all your data"?

Good question! My photos, for example, exist on my local computer, Microsoft OneDrive, and on Google Photos. The odds of losing them are, frankly, as non-existent as possible. Multiple storage locations, IMO, are a good way to mitigate the relatively low risk of losing cloud data. Also, almost no effort on my part is required.

I think small businesses don't want to spend X days per year managing a self-hosted solution; it's cheaper to just use a GAFAM service.

(and I think that's a problem; OSS software today is ~reliable enough for end users, but not really for companies)

That is why we have IPFS and soon, SAFE Network!

Look them up.

After seeing how big of a hard-on business drones have for OpEx vs CapEx expenditures, I don't think the cloud is ever going away completely.

Like we have an entire cloud "revolution" (note the mocking quotes) because of a bullshit accounting rule!

The 2017 Tax Cuts and Jobs Act (aka Trump Tax Cut) actually permits (or just increased limits for?) taking a full deduction, up to $1 million, for computing hardware, rather than following a depreciation schedule. See https://en.wikipedia.org/wiki/Section_179_depreciation_deduc... and https://www.irs.gov/newsroom/new-rules-and-limitations-for-d...

We'll see if that catches on and makes a difference. Accountants, like software engineers and every other profession, tend to follow trends.

I agree, and I would like to recommend Nextcloud instead of OwnCloud for anyone who needs more features than a NAS. Nextcloud has excellent support for calendar and contact synchronisation. There are a ton of other apps you can install too: apps.nextcloud.com

https://zotsite.net/ or https://friendi.ca/ approximate Facebook, Telegram, LinkedIn on features (ignoring network effects).

I had a choice a few years ago.

I could have 5-8 NUCs/RPis, install a slew of conventional services I was curious about, and consider things like power, cooling, cables, storage.


Or I could pay GKE for a k8s cluster, install most of the things I was curious about, and avoid dealing with, e.g., corruption of the SD card. Or the cat knocking the desk-rack of NUCs off.

Since rack, stack, and on-prem issues were not what I wanted to sort out, I chose to deal with what I wanted to learn rather than rat-hole down data center admin issues.

I've been doing this work for years and years and years. I was scrounging boxes from the CS department for research 15 years ago (or more). Servers are a fundamentally flawed unit of reliability and development. Moving to a design world where you design for someone to knock over the box and _the computation keeps going_ is paradigmatically better.

When you deal with forward looking people in the server-centric world and you show them a service / "keep going" world, their mind is blown and they want it IME.

I will say that for the most part I agree with you. It is better to live in a civilisation, pay tax, have the litter collected, the roads paved, rather than trying to roll your own.

However, being able, to varying degrees, to opt out of said civilisation when it runs counter to your ideals, becomes too authoritarian, or means that existing within it could compromise what you believe in, is important. The cloud is great, but we must strive to have the skills to be self-reliant, so that it is truly a choice to participate in it to the extent that we do.

If you wanted to submit a change to a page back in the day, you e-mailed someone a patch. That required "signing up" for an E-mail account (unless you were a weirdo who telnetted to an smtp port and hand-crafted a mail delivery) and using someone's e-mail service, after you used a local application to craft the patch.
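That old flow still works today, and git itself does the crafting and the mailing. A sketch with made-up file names and addresses:

```shell
# ...after editing files in a local clone and committing:
git commit -am "Fix typo on the about page"

# Turn the last commit into a mailable patch file:
git format-patch -1 HEAD   # writes 0001-Fix-typo-on-the-about-page.patch

# Send it; git can talk to an SMTP server directly, no forge account
# needed (recipient address is hypothetical):
git send-email --to=maintainer@example.org 0001-*.patch
```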

The big difference today is that we don't use native apps, so we have to have accounts, to use somebody else's apps for free, because they want to track us and make money from us.

On the web today, we don't use apps, we generate revenue for others via their apps.

I think the interesting question is: what really qualifies as a service?

IMHO, a service is just a program someone else manages running on a server, that I can use remotely with a REST API.

In college, in the '80s, I used the symbolic math program Maple on an MTS mainframe. Is Maple a service? I say no, only because services tend to be web-related: they have a REST API accessed with HTTP. But if Maple supported a REST endpoint, bam, it's a service.

In this OPs example, the "service" API is email, not HTTP. But it achieves the same goal.

What I think the OP misses is: who the heck wants to log on to the server and tweak the site content from the unix prompt when there is an API?

I think it depends. For small, fast-written rants, txt files are great (my crap-computer-that-is-running-next-to-my-box didn't even burn when the post reached the front page of hn).

I really see the point in not having to manage a whole big ecosystem by ourselves (I use gitlab, sometimes github). The rant was about using an interface on another website (not the email API) in order to submit a PR or an issue for updating a website. It's a standardized way of doing an edit (and it can be painful when you don't know github), and more and more websites are using it.

> the rant was about using an interface on another website

oh, i see. i overlooked that. thanks.

That's a very specific definition for service. What's going to happen to that definition when technology eventually moves past using REST APIs?

A program running on your school's mainframe that you access via some sort of terminal is a computing service from before the Internet.

Plenty of people use a Unix prompt to access both local and remote computers. That its not your preferred method is fine, but there are people out there that prefer it! If those command line tools are hitting the same API (eg using curl), what then?

Yes, that is why I pointed out that "email" is also an API. Do you think a person reads the email in the OP's note? (not being sarcastic) I'm pretty sure it triggers an update through an IFTTT-like flow / post-hook on receive.

In most cases, if you want to change something you can also send an e-mail with your desired changes, without registering on any platform.

"On the web today, we don't use protocols, we use applications"

Servers vs services is an odd way to frame this discussion.

Seems like the specific problem is that pull requests are not a core git feature, in that they cannot be conveniently accomplished in a secure and decentralized way.

How did you come to that idea?

Pull requests are a common feature for maintainers in the Linux world, and they don't use github to send them to Linus.

And on top of that, they are secure (signed tags) and decentralized (a maintainer can publish them anywhere).

And yet everyone uses the Github or Gitlab tools because it's still not convenient enough for most teams, sadly.

It's an annoying problem because you need decentralized identity. Signatures handle that but the UX is not as nice as the centralized github/gitlab account.

The design of decentralized PR hosting also doesn't play nice with the github/lab UX. If you were trying to create a PR to a github repo, you'd have to handle authenticating the github pull from wherever you hosted it. If the repo isn't public, that seems very cumbersome even if it were supported.

It's just much easier to do this entirely inside github/lab, annoying as that is.

Yes, indeed we are.
