There's a reason GitHub is the de facto host for OSS projects. I'd posit that open source maintainers who deliberately choose an unconventional host like GitLab already raise the barrier to contributions (not just through network effects, but through an unfamiliar UI, feature gaps, poorer performance, etc.). This change would have tipped the balance in my view and made it even clearer that GitLab is not the right choice for OSS.
There may have been reasons to do this in the past (e.g. lack of integrated CI), but I'm not convinced those arguments really hold anymore in 2022. If you're a maintainer prioritising interoperability and the lowest barrier to entry for potential contributors, and you don't care about the platform being proprietary/owned by MSFT, you pick GitHub. If you're prioritising a FOSS stack, good tooling, and lightweight pages with excellent performance, you pick something like SourceHut.
For everyone reliant on software published by maintainers who have already made this decision historically, it's reassuring that the right decision has been made here, but a bit worrying that they needed to consider a step this drastic in the first place. Combined with their poor pay for engineers (discussed to death here on HN already), it does make you think a bit about their finances.
After Copilot, and their complete refusal to explain their stance on software license compliance with Copilot, I don't think sticking with GitHub is the right thing to do.
And GitHub can afford to offer all of these things for free because they are owned by Microsoft, and Microsoft is playing the long game.
Copilot is probably one of the monetization steps to come.
I have been planning to switch to GitLab because of it, even though I don't have any important public projects currently.
If you believe that it's important for free software to use free software infrastructure (and you should[0]), then there is no choice but to do so. Infrastructure like GitHub is primarily a function of network effects and the only way to overcome them is to cast your lot appropriately. A maintainer who "doesn't care" if their infrastructure is proprietary is not doing a good job.
Note that GitLab is not really the answer to this problem. It's open core, subservient to its investors, and pulls a lot of stupid shit like this. If you want the GitHub-style workflow and UI, go with Codeberg or another Gitea instance.
Seems an awful lot like perfection being the enemy of good.
Github is and always has been a closed platform. It's also responsible for the largest explosion of open source activity from any single source in the last twenty years. Your argument relies on the assumption that the cost of migrating away from Github is a cost a FOSS project can pay and survive, which is a pretty heavy assumption as far as those things go. Even if the alternatives were comparable in UX, and they're not, there are entire ecosystems (Node, Golang, RubyGems, brew, etc.) that explicitly depend on Github. If you measure your software's success by any metric other than how well it aligns to your ethics, it's hard to justify anything but Github for a FOSS project.
Most people, IME, write FOSS software so people can find and use it. I remember the era when SourceForge was more-or-less the best thing you could hope for as far as central repositories were concerned, but they weren't much more than free static web hosting. I remember how difficult it was to contribute to Apache projects because figuring out their esoteric source control and patch process was so daunting. I remember googling for software packages to use and not having an easy route to quickly gauge whether its quality was worth investing further research. I wouldn't like to go back to a world where everything was hosted on free software if it means losing all the knowledge that exists in that one place.
There definitely seem like more important windmills to tilt at than pointing at the engine that drives the entire FOSS ecosystem and saying it's not good enough by the most stringent of ethical standards.
Many projects, some of them among the most successful projects in the world (Linux, PostgreSQL, Debian, others), are not on GitHub and are doing just fine. I reject the axiomatic statement that projects cannot survive outside of GitHub -- it's plainly ridiculous and trivially disproven. Thousands of successful projects do. Arguing otherwise contradicts the facts and serves only to justify the status quo.
Many people write FOSS for many different reasons. Chasing popularity is among the worst, and using GitHub is a poor way of achieving it regardless.
GitHub is not good enough. Placing all of our FOSS eggs in a proprietary basket which is incompatible with FOSS values is an absolutely foolish thing to do. The alternatives exist and are equal to the task, if not exceptional.
> A maintainer who "doesn't care" if their infrastructure is proprietary is not doing a good job.
This isn't to mention the issues regarding code ownership, with GitHub now training its Copilot AI on code under any license.
> Note that GitLab is not really the answer to this problem. It's open core, subservient to its investors, and pulls a lot of stupid shit like this.
Even if GitLab walked back its paid-tier changes now, it's too late. This has been coming for years with the divergence between paid-tier GitLab features and the free version, which always ran behind on functionality.
We are currently considering our options to move away from GitLab. Ideally we look for tickets, merge/pull requests, CI and the ability to allow the public to contribute.
@ddevault We have been looking at SourceHut as an alternative. Specifically some questions for you:
1. Will you have a non-email merge/patch system for SourceHut in the near future [1]? The problem isn't so much for our team, but to encourage external people to help with our project who are used to the likes of GitHub and GitLab.
2. With regards to pricing for SourceHut, how will the pricing look in the future [2]? We have a small project that could pay per person/group, but would struggle to pay what GitLab is asking [3]. GitHub's pricing is better, but obviously your code then becomes the product.
Great work in any case, I really enjoy seeing your blog updates.
Regarding the Copilot training issue, I wonder if they can just train on any open-source projects they can access, e.g. projects on SourceHut.
I remember their argument being that the training is considered fair use. Legal/moral questions aside, doesn't that mean they can also justify using projects hosted on alternative platforms for training?
If your code is open source and matters at all, people put it on GitHub anyway. I refused to host any of my stuff on GitHub myself, but there are a ton of forks and mirrors of old versions there anyway (and then people get confused, unable to find the real copy, because there seriously are asshats out there who search for code repositories exclusively on GitHub instead of Google).
While I am all for hosting elsewhere (and do so myself) I will admit to being one of these asshats. Lots of projects have semi-common names and Google/DDG will often surface irrelevant results. I often find it easier to find projects by just searching GitHub instead.
i am training my brain on the back of the hapless and poor authors like lee child and j.k rowling that i got off of piratebay. why do those "pesky rights" holders release their dogs after me?
1: We are working on a web tool which uses emails under the hood but is otherwise quite similar to pull/merge requests. We have already rolled out the contributor side of the process, which you can see demoed here:
2: TBD, but we will consult with the community before making any final calls. There will be at least 90 days notice before any pricing changes once the long-term pricing is finalized. It definitely won't be anything near GitLab's extortionate pricing.
I would argue that choosing a closed platform like GitHub, where users must hand their data and more over to a platform that (a) has absolutely no provision for contributing in any way without signing up (and thus giving them a wider "user base"), (b) is an entirely proprietary, for-profit platform built on the backs of open source, and (c) is owned by a company with such a long history of ill will towards open source, is worse.
(Not that I think GitLab - at least in its cloud form - is much better, but alas.)
I've had open source projects be dormant for 10 years only to get a question pop up out of the blue from someone who found it useful.
I hope that Gitlab and other hosts of the open source community think hard about this before implementing any policy like this. There are projects that are useful, and stable, but for whatever reason do not have a lot of users. There is no reason to let those projects die, or force the maintainers to have to do a song and dance to keep them around.
Yes, there's a cost, but there's a benefit too, and I hope these hosts of the open source community remember that.
It would also punish programming languages with good long-term support. For example, Ada has been deliberately designed for long-term maintainability. Many Ada packages haven't been modified for years because they are finished. They work as intended. Unless someone finds a bug, there is no need to update or change them.
Not everybody considers constant feature creep desirable.
Ditto. Sometimes stuff reaches a point where it's mature or stable and doesn't need more development. Sometimes life gets in the way and years down the road someone stops in to pick up the torch. That's open source.
I have a lot of projects from >10 years ago which are mature, stable, used, and the only time they see active development is if there's a breaking change elsewhere (a lot of them interact with other tools).
I also have some amount of open science stuff. I count on github/gitlab being archival for things like recording how I process data in a piece of research.
Literally all I really need of a git host is to support the git protocol, and to keep stuff around for me forever. I appreciate having pull requests, conversations, and what-not, but the #1 requirement is having my stuff there, reliably stored, and available to the world.
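That "git protocol plus reliable storage" requirement is easy to satisfy redundantly: git lets one remote carry multiple push URLs, so a single push updates every host at once. A minimal sketch (the two bare repos below stand in for two hosts; swap in real remote URLs for actual mirroring):

```shell
# Two bare repositories standing in for two hosting providers.
git init --bare /tmp/mirror-a.git
git init --bare /tmp/mirror-b.git

# Inside an existing project checkout:
git remote add origin /tmp/mirror-a.git
# Register BOTH locations as push URLs for "origin",
# so one `git push` updates every mirror:
git remote set-url --add --push origin /tmp/mirror-a.git
git remote set-url --add --push origin /tmp/mirror-b.git
git push origin main
```

Note that the first `--add --push` replaces the implicit push URL derived from the fetch URL, which is why the primary location is re-added explicitly.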
I'll mention: I don't blame gitlab for a decision they didn't make. If you've ever worked in a corporation, you'll know a lot of stupid decisions ALMOST get made all the time. gitlab tends to be more open than other organizations, so we see the inner workings. I've always seen them pull back when a decision is particularly numbskull.
I use this small utility in Linux called mdump to test multicast on a semi regular basis. It does what it is supposed to and I never ran into any issues. I just happened to look at the source code again the other day and it was last updated in 1994!
The crux of the issue is that GitLab is a commercial enterprise, so at some point it is reasonable for them not to want to host dormant projects for free when doing so costs them money.
In fact, even if they weren't a commercial enterprise they would still need to balance the books in order to survive.
A few years ago, I tried un-squatting a username from both GitLab and GitHub (yes, GitHub did this back then), both times with success. During the attempts I studied the policies of both services and noticed a difference between the two: GitHub leaned towards not releasing the username, especially when the account was valid and not "empty"; GitLab, however, would release the username if the account had been inactive (no pushes, issues, logins, etc.) for a period of time (see [1]).
That knowledge about GitLab left a weird taste in my mouth. It gives me the impression that if a free-tier user becomes inactive, they're no longer valuable to GitLab and can be treated as second-tier.
For that reason, I cannot seriously bring myself to use GitLab. If you knew that all of your investment would rot to nothing as soon as you stopped paying, you'd probably reconsider investing at all.
Sadly, this idea of removing "dormant" projects follows the same train of logic: if you're not constantly contributing (paying/publishing content) to our platform, you're second-tier, and we can just kick you out.
From the previous article where they announced the plan:
> GitLab is aware of the potential for angry opposition to the plan, and will therefore give users weeks or months of warning before deleting their work...
GitLab doesn't seem to understand its users well. "Angry opposition" is what might happen if GitHub decided to remove dormant repositories, not GitLab. GitLab users will just feel disappointed and leave (to self-hosting, or maybe host it (back) on GitHub).
GitLab should seriously re-learn what's more important for their service: user engagement or the code hosted on it. Tip: GitLab is not running a social network business.
"That knowledge about GitLab left a weird taste in my mouth. It gives me the impression that if a free-tier user becomes inactive, they're no longer valuable to GitLab and can be treated as second-tier."
GitLab has an explicit program for approved Open Source projects:
This is their philanthropic effort. They need to remain financially solvent to provide anything long term, so of course they need to be careful how much they're giving away and for how long.
If you're not in the approved open source program, you should consider your usage of the "Free Tier" as part of their freemium business model whereby they hope to convert you at some point into a paying customer.
> GitLab should seriously re-learn what's more important for their service: user engagement or the code hosted on it.
They're a business that needs to pay the bills (including the long-term costs of storage). If they haven't approved your projects into their open source program, you should consider paying them to host the code.
I have my own issues with GitLab, but I won't judge them for the need to pay the bills.
I do understand that they need to pay their running costs, but to be completely honest, I don't think they're running the business in a smart way.
When GitHub took off, they were not offering private repositories at all. Instead, they added them gradually as they could afford to. This is one of the reasons GitHub has now become the #1 platform for open source projects around the world.
GitLab tried a different approach by focusing on offering repository hosting as a service, but as time progressed and GitHub grew, this business model became less and less attractive. As a result, nowadays GitLab mainly attracts people who think GitHub isn't the right option for them, and that's not a big market. No matter how many costs they cut, if this remains unchanged, GitLab is already on its deathbed (by which I mean it will stop growing, not close for business).
GitHub was smart because at the very beginning (before allowing private repositories), they knew that if they wanted to attract good programmers and projects, they must also accept and respect garbage, because that's what it takes to build trust.
Deleting user data, however, always destroys trust. If storing those repositories creates unbearable costs for GitLab, then maybe introduce a reasonable cap and reject new commits once the cap is exceeded (while asking the user to buy more space). Making the entire repository just disappear is too much.
I'm glad that they did not end up choosing that route, but at the same time, not choosing that route should be the default, not something you only realize after it goes to the media.
I was a paying GitHub customer, but they had such a crummy private repo policy at the time (I think only 5 private repos were allowed) it forced me to make poor technical decisions: "I'll delete this github repo, keep my local copy, to open up a slot." I switched all my private repos to gitlab and stopped paying for github and was happier for it.
Later github changed its policy to something saner, but I never switched back to using github for private repos. This decision by gitlab, however, even if it's rescinded, might give me the impetus to do so.
I don’t know if this issue has been resolved, but as late as four years ago, GitLab didn’t have a notion of a service user. If you wanted a server to authenticate and pull from a private repo, you had to use a licensed account to do so. This meant your 5 user tier became a 4 user tier.
I had forgotten why I default every little hello world type repo I create on GitHub to public. I knew there was a reason I started doing that but had forgotten.
The five slots were precious and I didn't want to waste them.
Except that projects are namespaced under their owner, so you'd need to hijack the owner's account. They don't appear to have been proposing deleting users (or groups), just projects.
Other than that, you can fork whatever you want today...
> It gives me the impression that if a free-tier user becomes inactive, they're no longer valuable to GitLab and can be treated as second-tier.
You can't, without paying, sit on a namespace for years and do nothing with it if there is interest from a user willing to pay. And even if that is what you're doing, GitLab will contact you before taking back the account to make sure you don't suffer data loss. I don't know, this seems pretty reasonable to me, especially considering GitLab is a business providing you with a free service.
You just have to log in more than once every two years!
I don't blame Gitlab for not understanding the levels of self-entitlement of some of its free users. And maybe, just maybe, Gitlab is better off without those users. In my experience, the cheap customers were always the worst and most demanding.
Regardless of reasons (I'm sure there are some good ones), this was a "small" mistake that reveals a lot.
I don't know if the people running Gitlab are dev/ops guys who found themselves running the ship or some high-pedigree managers, but this reveals an enormous blind spot in this group of people.
One of the major driving forces for humans is loss aversion. It doesn't matter if the project is abandoned or some garbage you don't value: these guys are in the business of banking code + services, and they just announced to the world that they can't be trusted as bankers.
I'm sure that wasn't what they were looking for and their monthly bill is big ( perhaps too big ), but setting fire to the whole thing isn't the right move.
>I'm sure that wasn't what they were looking for and their monthly bill is big ( perhaps too big ), but setting fire to the whole thing isn't the right move.
It's owned by Microsoft, a for-profit company with a long history of hostility toward open source anything.
Literally setting fire to the building and servers IS the right move if it's what brings the most money in the door and keeps as little as possible from going out.
They don't and never will care about customer goodwill for something like their paid service; they'd stop offering it tomorrow if doing so made them money.
The fact that they walked this back so quickly suggests that they either are strapped for cash or have extremely out-of-touch leadership. Both are worrying.
Is the title really accurate, though? They never made a public announcement saying they would do this. Someone just leaked an internal discussion and it blew up.
A lot of things get discussed and evaluated that never eventuate.
Indeed. The fact that they're also limiting free organizations to 5 members (counting members of sub-groups)[1], which feels truly arbitrary, seems to agree.
IIRC, GitHub always had this limit. They removed it sometime after the acquisition, which could be because they no longer have pressure to justify their valuation by revenue.
While they have 884M in cash/equivalents, they did have a 157M yearly net loss, compared to a 192M net loss the year before. That's a good improvement, but it means there's a lot of work to be done before they reach profitability. So I don't think they're strapped for cash, but there is likely pressure to make enough changes to reach profitability within 3-4 years.
They burned $129MM and forecast a loss of $142MM next fiscal year.
Given that they recently went public I’d imagine they would be trying to become profitable. The IPO appears to have netted them around $600M in cash, or about four years of runway.
I was kind of surprised that the hosting of such stuff was a significant enough cost that removing it was seen as worth it - but I have no idea what their annual revenue is, and a million dollars is a million dollars.
That said, they were talking about deleting dead projects from unpaid accounts, and it does seem kind of shitty that as a community we seem to have decided that once a company has hosted some content that they should be required to host it forever at their own expense.
> it does seem kind of shitty that as a community we seem to have decided that once a company has hosted some content that they should be required to host it forever at their own expense.
This is a really great insight that I hadn't thought of. Sure, if they, from the start, had promised to do so, that would be one thing. But users of free plans are essentially freeloading (sure, there are useful network effects of having free users on the platform), and it's a bit entitled to expect that free services are free in perpetuity (especially from a company that isn't profitable), even for projects that might be abandoned.
The thing is, policies like this make it really hard to manage open source projects, which (a) typically do not make a lot of money, and (b) in the very long tail, are sometimes "dormant" according to this metric. And the cost goes up the more projects you have and the more services they're spread over.
If they really wanted to save only on costs, surely they could have gone for the biggest repos first? E.g., politely email owners of large-but-inactive repos to consider spinning them down, and then if that fails to move the needle, then put in a policy (starting with big repos first).
There's no reason for someone's 1k LOC project to get swept up in something like this. The costs for that must be trivial.
> as a community we seem to have decided that once a company has hosted some content that they should be required to host it forever at their own expense.
We need something like public (book) libraries for code. Or a Library of Congress. FOSS has become critical infrastructure and we need to act like it. Done well, this could also help foster people learning to code. Sure there will be some political issues with it around funding and what code we should or shouldn't host, but the benefits would outweigh the drawbacks I think. I've seen tax money go to worse things, at least.
We might just start having a more open landscape where code can roam freely to wherever you want to host it. In other words, more than just two full-service providers to choose from, who lock you in by network effects and FOMO, made worse because the entire software development tools vendor ecosystem is built on them too.
When it comes to open source and free software projects, neither of these platforms is your friend. The "free" services are just an enabler for their revenue models, part of the business plan. Microsoft especially is well-known for its long game and EEE strategies. If it were up to them, we'd move to GOSS, GitHub Open Source Software, so dependent on nice GH services that it can never move away.
I have my hopes in more open technologies, whether low-level like DVCS, or taking care of the social aspects of software development, like ActivityPub. On this last front, cool developments are taking place: extensions to ActivityPub that will allow code forge features to federate with any vendor that supports them. We'll get decentralized software development if that is done well.
Some FOSS projects and communities are taking first strides here. A report of their activities is in this blog post of the Forgefriends project:
Gitlab projects are way bigger than you'd expect. The actual git repo is usually not the biggest part; it's the build artefacts and docker images. The average project I have worked with on Gitlab has been in the tens of gigabytes, while active repos like the gitlab repo itself are in the terabytes.
I have one project which I use all the time which is basically finished, once in a great while I have to fix some minor thing when python changes their C-API or I get bored and add some minor feature because I wanted to figure out how to do something in python. From the outside the project is “dead” as I barely ever touch it, just recompile when I do an OS upgrade and forget about it until the next python version bump.
I honestly don't think anyone else has used it; it's a python wrapper around boost::property_tree which I use to transform RSS XML to JSON. But it contains a lot of magic doing things with the python C-API that are pretty hard to figure out because there's no documentation, so I think the real value is having examples to look at. I spent many wasted hours figuring out how these things worked by reading other people's code and adding them in the easiest way possible. Might someday save someone a bit of time... maybe.
It's not costing them 155M to run hosting; it's maybe a tenth of that. Online business costs are inordinately weighted toward people, marketing, and sales acquisition. When someone wants you to look at X to save money, it's because Y is more important to them. I don't know GitLab's finances, but it's a pretty common play when someone is looking to boost their figures so someone else won't notice - either for executive payouts or an acquisition, someone's looking to goose their margins.
In some sense, this has the same problem as TFA: who is going to seed those disused repos? (It's the same problem with IPFS, linked in the sibling comment: one can pay a "pinning node", but pay one must.)
Gitlab is great but they don't make money out of me and I'm not sure why.
There is no reason for me to go to a paid account. There is no way that I could justify the cost to my boss. We have 5 slots for the team and only need 3. There is nothing that we'd gain from paying.
If they said tomorrow that we had to pay $60 a month to get what we're getting now, then I'd say to my manager we need to pay for this or we won't be able to keep developing our software without deploying infrastructure we have to maintain. At that point we'd start paying.
>If they said tomorrow that we had to pay $60 a month to get what we're getting now
0 to 60 overnight sounds like a very disastrous thing for a company to do, like a big ol bait and switch. I would immediately stop using that company and find alternatives, and I am guessing it will be the same for a lot of people too.
At $60 a month for 5 people, a developer should be able to get a few servers (~4) for less than $30-$40 and self-host something like Gitea with Drone.
Then set up deployments, figuring out how to replace any missing Gitlab features
Then update the docs that describe how the repo structure works for onboarding
Etc etc
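The first of those steps is the quick part. A Gitea-plus-Drone setup along the lines suggested above can be sketched with a compose file roughly like this (illustrative only: image tags, ports, and hostnames are assumptions, and a working Drone install additionally needs OAuth client credentials created in Gitea - see both projects' docs):

```yaml
version: "3"
services:
  gitea:
    image: gitea/gitea:latest
    ports:
      - "3000:3000"   # web UI
      - "2222:22"     # SSH clone access
    volumes:
      - ./gitea-data:/data

  drone:
    image: drone/drone:latest
    ports:
      - "8080:80"
    environment:
      - DRONE_GITEA_SERVER=http://gitea:3000
      - DRONE_RPC_SECRET=change-me           # shared secret with runners
      - DRONE_SERVER_HOST=drone.example.com  # placeholder hostname
      - DRONE_SERVER_PROTO=http
    volumes:
      - ./drone-data:/data
```

The later steps in the list (deployments, feature gaps, docs, onboarding) are where the real time goes, which is the point being made here.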
Moving infrastructure is a couple of days' work at least for a small team. If you factor that cost in, it's rarely worthwhile. And that's before any time lost to investigating and fixing outages, updating the software, and so on.
People severely underestimate how complicated it is to host a private server, even if they go with self-hosted GitLab, which is essentially the same.
The amount of downtime, sudden server crashes, problems with SSL behind NAT, setting up the GitLab runner, upgrading, etc. You'll need someone experienced enough, and maybe even a dedicated infra role, to handle those things.
We've been running Gitlab for over half a decade with over 100 repos, runners on Windows, Linux and Mac with minimal need for maintenance.
Same story for the primary free software project we rely on (who we have a support contract with). Neither us nor them spend anywhere close to 10 hours a year on maintaining selfhosted Gitlab related infrastructure.
> People are severely underestimating how complicated it is to host a private server
People consistently underestimate how complicated it is to build reliable things, hardware or software. There is a world between "it works for me" and "it works for all".
And dysfunctional organizations. I worked as a contractor at one place where we simply used the shared file server as a git remote.
The reasoning was that if the file server went down, it would come back up quickly because it affects everyone. I didn't understand this reasoning until I got emails from the database team saying the database servers had stopped working and every development team now needed to change their connection string from machinename-abc.corp to machinename-bcd.corp
How in the world someone could write an email like that is beyond me. OK, machines crash all the time, but why can't they reuse the same name for the new machine?
Outside the ~4 hours I spent setting up GitLab (for the first time) 3+ years ago, I've spent an average of perhaps 10 minutes per month maintaining it (and the runner on a separate server).
(And that's literally only because I choose to update it manually since it's usually offline for about 15-30 minutes during updates...)
I actually recently did just that, moving over everything from my self-hosted GitLab instance into Gitea, Drone CI and Nexus: https://blog.kronis.dev/articles/goodbye-gitlab-hello-gitea-... (the same would apply when moving from a cloud GitLab instance to the other self-hosted solutions)
Edit: fixed the link now.
Honestly, it wasn't so bad, given that I hadn't coupled what I had too tightly to GitLab's particular functionality - I had an external issue tracker - though moving CI and build artifact storage systems can indeed mean porting them over (which was easy, given that I mostly use Dockerfiles and shell scripts for builds anyway) and cause additional work.
> And then set up the users; Then set up the permissions; And move repos to it
Thankfully, nowadays neither is too hard to do, given that there are pretty good migration scripts that work unless you're doing something very non-standard and peculiar. I migrated close to 100 repositories without a single failure, though perhaps that's also because I don't mess around with Git LFS after being burnt trying to sync GitLab and GitHub repos, with LFS data going into a black hole.
> And the hooks; Then set up deployments, figuring out how to replace any missing Gitlab features
The good thing here in particular was the fact that I use Docker for builds as mentioned before, so getting "docker build -t ... -f ... ." working across different build systems is generally pretty easy (except for Jenkins which is needlessly hard sometimes), especially when the software for the CI runner nodes themselves runs in Docker containers (with trusted code you can also just share the Docker socket, otherwise you probably want a DinD setup).
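As a sketch of how little CI-system lock-in that leaves, a Drone pipeline wrapping the same `docker build` might look roughly like this, using the shared-socket approach mentioned above (trusted code only); the registry name and tags are placeholders:

```yaml
kind: pipeline
type: docker
name: build

steps:
  - name: build-image
    image: docker:24
    volumes:
      - name: dockersock
        path: /var/run/docker.sock   # shares the host daemon; trusted repos only
    commands:
      - docker build -t registry.example.com/app:latest -f Dockerfile .

volumes:
  - name: dockersock
    host:
      path: /var/run/docker.sock
```

Since the build logic lives in the Dockerfile, porting this to GitLab CI, GitHub Actions, or Jenkins is mostly a matter of rewriting the thin YAML wrapper.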
Deployments are also just telling another system (Portainer/Docker Swarm in my case; sometimes Helm for Kubernetes) that it should retrieve the latest build artifacts from some OCI compatible repo and run the new version of the application with any given configuration.
> Then update the docs that describe how the repo structure works for onboarding
I don't think I've ever actually needed to change this, since while the UI itself does change (e.g. organizations vs groups), the actual contents of a particular repository remain the same. Might need to update a URL or two, but for the most part I saw no blockers here.
> Moving infrastructure is a couple of days' work at least for a small team. If you factor that cost in, it's rarely worthwhile.
It took me less than a day, total. But maybe that's because I've built a bit of a pipeline around running containers in my infrastructure and have the ingress/reverse proxy setup around them all established. Not to sound cocky - there are also caveats, and I do agree with you about some of the pain points (below).
> And that's before any time lost to investigating and fixing outages, updating the software, and so on.
This is probably the bigger issue here, though. All of a sudden, the stability of it all is your responsibility. Now, in my case that wasn't much of a change in circumstances, because I started out with GitLab, and on my limited hardware moving over to Gitea and friends was actually a win from a resiliency standpoint versus an integrated GitLab Omnibus install (which loves to eat RAM). But for anyone moving from the cloud to on-prem infrastructure, this is definitely one of the main concerns.
Then again, for many out there storing their code and other stuff in the cloud is a non-starter (e.g. certain government orgs or enterprises), which is where self-hosting can indeed be very helpful, though in most of those cases I've seen GitLab instances used rather than Gitea. That said, Gitea is also an excellent option for anyone who just wants to embrace self-hosting out of principle or for other reasons.
Being in control of your data is pretty cool (at least until you misconfigure something and everyone is in control of your data). That said, if you don't buy into the balkanization of software platforms nowadays too much and focus on the common standards of the systems instead, overall it's not as painful as one might think.
Of course, you can also pick a platform that's too hard to actually use and administer by yourself (which is why I switched away from GitLab), but then the question becomes why you're even using it in the first place. Honestly, GitLab is an excellent platform with amazing features, but handling updates and its hardware requirements wasn't fun: https://blog.kronis.dev/everything%20is%20broken/gitlab-upda...
Oh, also, sometimes migrating data can be downright hellish, like when I tried migrating SVN to Git: the migration scripts didn't work due to non-standard repo layouts, which meant that I had to rewrite the layouts and history on the SVN end so things would actually work properly and those scripts wouldn't hang. That was ages ago, though, and I haven't used SVN much recently. TortoiseSVN was a nice piece of software, RIP.
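For what it's worth, git-svn can cope with non-standard layouts if you point it at the actual directories rather than relying on `--stdlayout`. A hedged sketch follows; the SVN URL, layout paths, and authors file are all hypothetical, and the command is printed rather than executed since git-svn may not be installed:

```shell
# Hypothetical SVN repo whose trunk/branches/tags live in unusual places.
SVN_URL="https://svn.example.com/repo"
echo git svn clone "$SVN_URL" \
    --trunk=project/main \
    --branches=project/forks \
    --tags=project/releases \
    --authors-file=authors.txt \
    converted-repo
# With a standard layout, `git svn clone -s "$SVN_URL" converted-repo`
# would be enough.
```

The explicit `--trunk`/`--branches`/`--tags` flags are what let you avoid rewriting the SVN side first, provided the layout is at least consistent over its history.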
Taking on all the administrative burden and operational risk, whilst incurring the additional expense of SRE time, makes the business case here look dead in the water.
Yes, you could, but that is a shift in responsibility. At that point we are responsible for keeping that infrastructure running, and any failure ties up internal resources. At the scale of 1 developer and 2 freelancers it doesn't make much sense. There are far more valuable things we can do for the business than remove that $60 USD cost.
> $60 is like 30-60min at a developer’s hourly rate.
More like 4 hours of my salary (~2100 euros per month after taxes), given that I live in Eastern Europe and work for a local company.
I think there is definitely an interesting difference in attitudes towards spending on software, platforms and tools. The poorer a particular nation (and its developers) is on average, the more inclined everyone is towards a mindset of "doing instead of buying", given that their own time is comparatively cheaper than paying someone else to do it.
Then again, the same applies to the software they use. Personally, I pay for the Ultimate package of all the JetBrains products (their IDEs are really good), but a lot of my acquaintances only use free software, or other means of acquiring it. That's kind of why much of Russia and the countries further east run on pirated Windows machines.
Curiously, that's also why something like AWS is out of reach for me and instead I use local platforms like Time4VPS, or something like Hetzner/Scaleway/Contabo.
Makes sense. I am from an even poorer ex-soviet country. I think we have zero (or very close to zero) subscriptions, use no external services for anything critical, and host everything ourselves. It's not only much cheaper, but faster as well (when the closest datacenter is 150 ms away, the difference is obvious). It also protects us in case the dictator does something stupid and we get cut off because of sanctions or inability to pay. I've seen an even stronger drive towards self-hosting since February of this year, just in case.
Yep. We use it self-hosted but the only reason for us to pay would be to import groups (instead of just users) from LDAP. Hate to say that (and basic stuff like enforced approval flows) isn't worth $19/user/month...
Unfortunately, they know this, and as a result they've actually been making more and more user-hostile changes (like bumping features up to higher plans, introducing virtually all of their new features as premium, and just generally ignoring their supposed pricing and monetization strategy in favor of $$$).
The annualised billing winds me up - I wanted to build an integration with GitLab recently and needed group-level webhooks. ~$20 / month, but I'd have to pay for a full year upfront, most of which I'd then not use.
and here I was wondering when people would build a scheduled pipeline job to commit to a repo every month to keep it active, costing them even more money.
This comment chain has not received any replies in the last 4 hours and is therefore now marked as stale. Reply with "!stale no" to remove the flag, otherwise the comment chain will be deleted in 8 hours.
I think what they were trying to plan is fine. Maybe offering free repos is a loss leader in competition with the likes of GitHub, but useful services should have a direct cost attached to them.
I have a bunch of repos on there I haven't touched for years. I was thinking of archiving them on my home server instead, and maybe setting up cgit or something for a web interface if it mattered. I don't really need 80% of GitLab's features (or GitHub's, for that matter). It might be nice if someone saw a project because of the navigability of these platforms and the visibility they afford; however, that hasn't happened. If they're saying it has a cost, I might as well self-host it for myself ... or not.
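Archiving like this can be as simple as a bare mirror clone, since `--mirror` copies every ref and the result is exactly the kind of bare repo cgit can serve (via its `scan-path` option). A small runnable sketch, with all paths hypothetical and a throwaway local repo standing in for the hosted one:

```shell
set -eu
work=$(mktemp -d)

# Stand-in for the hosted repo; in real use the source would be
# something like https://gitlab.com/<user>/<project>.git
git init -q "$work/project"
git -C "$work/project" -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "final commit before archiving"

# A bare mirror: complete history, all branches and tags, no worktree.
git clone -q --mirror "$work/project" "$work/archive/project.git"
git --git-dir="$work/archive/project.git" log -1 --pretty=%s
```

A cron job running `git --git-dir=... remote update --prune` would keep such a mirror fresh for repos that are still alive upstream.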
When Google Code was closing, GitLab had a project to support quick migration from GC to GL. Most of the projects I copied were dormant for years, and then someone forked some of them months later and worked on them.
“Today we have the SaaS Free User Efficiency - Data Retention Policy in the internal handbook for our team members. legal has requested that we add a full definition for the 'last_activity_at' attribute.”
Throw in a feature like epics or roadmaps and some CI time for $5/mo and I would have been on it, but $20/mo is too much for how few of the premium features I'd use. I'd just always assumed this tier didn't exist because the opportunity lost with long-tail consumers was weighed up against the possibility of business users downgrading to a lower tier. But then they could just go with the JetBrains model and have the exact same tier at two different prices depending on whether the buyer is an individual or a business, if they're worried about that. Call it Gitlab Premium Personal or something.
That said, the youtube-dl github drama did push me into self hosting my private projects and mirrors of at-risk projects on Gitea, so that window has closed now. Also Gitea is a lot nicer than Gogs was the previous time I compared gitlab against other options. But gitlab is still for now my host for public projects.
I second that. I could probably convince my manager to pay $5 per person even if we get no extra features. It's too hard a sell for $19. I think they could do a lot to milk their current free tier users.
We buy compute time periodically and that is a small amount I guess.
We happily pay Bitbucket for our backup storage. I'm sure we'd pay for our code storage if that was required.
> I think they could do a lot to milk their current free tier users.
Maybe $5/person/month is not a lot for active projects a business relies upon, but the 90% of hobby and (semi-)dormant projects would be deleted or moved to GitHub if Gitlab tried to ask for money for them. Which would reduce the attractiveness of the platform.
Instead of arguing "we need to cut costs", why don't they work on an entry-level paid tier? Seriously, it is like they're not even trying to compete with GH at this level.
I have a few personal projects on Gitlab. Mostly I use them because my employer uses the self-hosted version. Their products are proprietary and probably need to be stored internally for compliance reasons. I thought the self-hosted product was their USP-- believed in it enough to buy a few hundred bucks of GTLB shares (-40% :sigh:)
The most successful of my projects has a potential audience of a few hundred users-- custom firmware for specific obscure hardware. For the others, the user base is likely one.
I suspect that's what a lot of hobbyist projects are going to be-- an itch scratched for a single user and/or a very small community, perhaps cast into the void with the thought "someone else might find some value in it" or "this helps as a portfolio for future job interviews-- lets me showcase technologies I don't work on 40 hours a week"
The thing is, this means that there's a fair chance you're going to reach endgame. You don't need any more features because your itch is scratched. There will be no new commits, but it's nice to have the archive for when someone actually needs the code. So activity is an awful proxy for value here.
can we just normalize software being done and finished?
there are plenty of things that don't require dependencies and don't encounter breaking changes. the javascript community's problems with attention deficit and hyperactivity don't represent software development as a whole.
IMO Git is already quite resilient to this kind of breakdown. Even if your host shuts down, everyone who has a clone has the development history and can become the host, continue with another host, accept patches over email...or patches on paper napkins.
The main problem with services like GitLab is that they are not only hosts to repositories. They're issue trackers, they're CI/CD platforms, they're release artifact hosts, they're discussion platforms, they're review platforms, they're their own bespoke pull/merge request workflow; all eggs in one basket "as a service". If they shut down your repo, the code is the last thing to worry about unless they have the only copy of the current canonical history.
Federation is under development by some smaller git server projects, like Gitea. It should provide a good middle ground on the spectrum of decentralization.
Check out progress at the latest Forge Friends monthly report:
Well, for a moment there I thought we had another Google Code on our hands. I’m all for having multiple providers and such, but every time I find a really useful project on Gitlab I mirror it to my home Gitea instance lest it vanish.
A peer-based way to search code. That's direly needed. Currently all takes on this are centralised, be it Microsoft's github, gitlab, STOF, whatnot.
I once thought DOAP might be a thing, but a search engine bringing up code results (it needn't be constrained to code) would be really nice.
I understand their intentions for doing this - I don't agree with them. My initial reaction was that a 'sleep mode' may be better: alert people to log in and "activate" their repo.
Once again, I don't agree with deleting dormant projects; I'm just curious what an alternative solution would be.
> My initial reaction was 'sleep mode' may be better and alert people to either log in and "activate" their repo.
That's what they ended up deciding to do
> We reached a decision to move unused repos to object storage.
Once implemented, they will still be accessible but take a bit longer to access after a long period of inactivity.
Is saving static data really a huge cost sink for them? Seems kind of weird to have a business model of storing and handling highly dynamic data only to come down hard on the easiest use case of the product.