For people wondering what makes GitLab any different, the answer is that GitLab is an open source product at its core. This means that anybody can run their own instance. If the company ends up moving in a direction that the community isn’t comfortable with, then it’s always possible to fork it.
There’s also a proposal https://gitlab.com/gitlab-org/gitlab-ee/issues/4517 to support federation between GitLab instances. With this approach there wouldn’t even be a need for a single central hub. One of the main advantages of Git is that it’s a decentralized system, and it’s somewhat ironic that GitHub constitutes a single point of failure.
In theory this could work similarly to the way Mastodon works currently. Individuals and organizations could set up GitLab servers that would federate with each other. This could allow searching for repos across the federation, tagging issues across projects on different instances, and potentially failing over if instances mirror content. With this approach you wouldn’t be relying on a single provider to host everybody’s projects in one place.
GitLab is a lot like Firefox, or "Linux on the desktop" in that way. It's what a lot of us want to use, but the less-open but more-polished option has always seemed the more pragmatic choice.
But that can change.
Recently (just as most of the world has apparently moved on from desktop computing, haha) Linux has become pretty much fine for traditional desktop computing. I have current Mac, Windows, and Ubuntu machines on my desk, and they are essentially interchangeable except for a few special-case purposes (say, Final Cut video editing, or opening one of those weird wonky Excel sheets that only open on Windows).
Firefox, too, is suddenly performant and I've switched to it as my main browser (for default website use) — something absolutely, utterly unimaginable two years ago.
I hope that GitLab is reaching the same kind of transition point. I've heard horror stories from people that used it 2-3 years ago and are happy to be back on GitHub. But I don't hear much recent grumbling. I moved a toy project to it and it seems nice. As fast as GitHub for me (though I am in Japan, and GitHub is slow in Japan). More features. A nice, sort of adult/professional aesthetic. And — yay! — the open source core it's always had.
I might be wrong, since I haven't used it for reals, but it looks like they might have hit that critical usability threshold?
Open source and worse isn't a very compelling sales pitch. But an open source tool that is broadly equivalent to a closed source one is generally more attractive, especially when you're talking about services that will be used as part of your core infrastructure.
As a faithful Fx user since 1.5, I'm not sure about the Firefox analogy.
Its market share was actually much higher in the past and has only gone down (the introduction of Quantum didn't really help, market-share-wise).
And in terms of polish... Firefox was MILES ahead of any browser for a very long time. When Chrome launched, it was about as bad as you can imagine in terms of functionality, while Firefox at that time was already as solid as it is today. But Chrome advanced very fast. People often attribute Chrome's success (merely) to Google's push, but Chrome's technological development played a bigger role IMO.
I'd argue Firefox was easily far more fleshed out, but Chrome was simply faster when it first came out, and I made the switch. It turned out all those features didn't mean jack when a faster option was out there. Of course, Chrome has added a lot of features now... and Firefox has gotten vastly faster than it used to be.
When I left Chrome a few years back, the sluggishness of Firefox was quite palpable. I eventually moved to Edge, which, while crashier, overall performed better. But after both the Electrolysis project, and Quantum, Firefox won me back quite solidly.
Long story short: Speed > features when it comes to web browsers.
I share the same thought. I used Edge on Windows when it came out simply because it felt so much faster and smoother. However, it has such a high tendency to glitch out and freeze that sometimes I wonder how Microsoft can release such a glitchy browser and expect users to use it as their main browser.
In 1998 I purchased a Windows computer for the first time in 14 years. My thoughts were "They've had 10 years. Surely they've worked their bugs out". As it turned out, Windows 98 and Internet Explorer 4 actually did have some stability problems, to put it mildly. So I would say that releasing very buggy software without blinking is not a new concept to Microsoft.
> "They've had 10 years. Surely they've worked their bugs out".
That's quite a weird thing to think in 1998 with respect to a browser, given that they'd only had any sort of a browser for less than 3 years at that point.
The question was posed "I wonder how can Microsoft release such a glitchy browser and expect users to use it as their main browser."
To which I gave a related anecdote and responded
"So I would say that releasing very buggy software without blinking is not a new concept to Microsoft."
Since they released an entire OS that was massively unstable (win98), while claiming their browser was an integral part of it, it doesn't surprise me that they would release a buggy browser at a later date and expect people to use it.
I wasn't only addressing browsers, but rather my perception of the general quality history of Microsoft products at that time. It affected my perception of Internet Explorer, which at that time was not as popular as Netscape Navigator.
When Edge actually works, it's wickedly fluid. That's IF it actually works, though. I haven't been using Windows enough lately to know the current state of Microsoft Edge.
It usually works. When it doesn't, it really doesn't, though. And there are some annoying quirks: for instance, Edge always restores tabs after a crash, and you literally have to create nonexistent registry keys to disable that functionality.
The reason this is a problem is because when a malicious webpage hijacks your browser, and you have to forcibly close it to escape... Edge helpfully reopens the malicious webpage each time you relaunch Edge until you find out you need to hack on the registry to fix it.
I often try to use Firefox but there are certain things that just became irritating...that were entirely Google's fault. Like Google Meet ONLY working on Chrome.
You click enough links in Slack that open in Firefox and don't work, and then you have to copy and paste the Meet link into Chrome... eventually your commitment wears down and you just start using Chrome as the default again.
And I hate that. It's a very IE6-type move on Google's behalf. Short of applications / my system giving me a clear way to say "always open links on this domain in this browser", it makes the workflow a pain.
Maybe the Facebook stuff will make that type of thing more popular. If I could make Firefox my default browser, but always open Facebook and Google links in Chrome I'd be pretty happy. Currently running Linux Mint.
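One common workaround for that "per-domain browser" wish is a small dispatcher script registered as the system's default browser. This is only a sketch under assumptions: the script path, the domain list, and the browser binary names (`google-chrome`, `firefox`) are placeholders you'd adjust for your own setup, and you'd still need to point a `.desktop` entry at it and mark that entry as the default browser, which varies by distro.

```shell
#!/bin/sh
# Hypothetical dispatcher: save as e.g. ~/bin/pick-browser and register
# it as the default browser via a .desktop entry. It routes URLs by domain.

choose_browser() {
  case "$1" in
    *meet.google.com*|*facebook.com*) echo google-chrome ;;
    *) echo firefox ;;
  esac
}

# When invoked with a URL, hand it off to the chosen browser.
if [ -n "$1" ]; then
  exec "$(choose_browser "$1")" "$1"
fi
```

The nice part of this design is that every "open URL" request from Slack or anywhere else goes through one place you control, so adding another domain is a one-line change.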
Chrome made a big thing about unit testing at the beginning of their development. I'm not sure if their definition and my definition of "unit test" matches up, or even if they still have that drive, but I've often wondered how much that has made a difference in their development process. My own experience has been that an early commitment to unit testing can make a massive difference to sustained development pace (pays off more as time goes on). My inner confirmation bias would love it to be true :-)
The interesting thing is (and without meaning to denigrate your point of view) I actually think that automated regression testing is not particularly valuable when you have unit testing. I actually want unit testing for the ability to help you reason about the code. One of the things that I would dearly like to discover is whether or not my view holds any water. It certainly feels that way from my perspective, but I'm biased ;-)
However, I think you are probably correct in that I think it is far more likely that Chrome has a good automated regression test suite than it has a good unit test suite.
Libraries benefit very heavily from unit tests, built applications or frameworks derive more benefit from regression tests.
For a small unit of code, or a library, the unit tests effectively prove that the code/library does what it says on the box.
For a continuously worked-on application, regression tests hold the guarantee that the application continues to work correctly for whatever thousands or millions of use cases built up over its lifetime - even when the implementations, algorithms and libraries used change underneath.
Good regression testing is a nice way to keep unit tests honest. If you never catch anything with your unit tests but boatloads in regression testing - you are inclined to pause and ponder.
I actually test my unit tests by changing the behaviour and measuring whether the tests fail :-). A great metric is fuzzing the behaviour and measuring the number of unit tests that fail. If very few of them fail, then you know you need to improve your unit test game :-)
The problem with unit tests is that they test things you know are problematic. The larger issue is the ones you were never aware could be a problem in the first place.
I highly recommend reading Michael Feathers's Working Effectively with Legacy Code. He has the best description of unit tests that I've seen. Briefly, he describes it like a clamp. When you are working on a woodworking project, you clamp part of your project so that it doesn't move. Then you work on the bit you are interested in. Later you clamp that part and work on another part. The purpose of unit tests is not to test the behaviour (unfortunate nomenclature aside) -- it's to immobilise it. This allows you to work on another part of the system and be alerted if you've caused something to slip.
Acceptance tests are incredibly important. They tell you if the system is working. No amount of unit tests are going to help you with that. Once you have accepted the behaviour, what you're really interested in is whether or not the behaviour has changed. You do not need your acceptance tests for that -- your unit tests will tell you.
I'll write it a bit more concisely because I think it is important: acceptance tests tell you whether or not the code is working correctly. Unit tests tell you whether or not the code is doing the same thing it was doing the last time you ran the tests.
The reason I don't favour a large suite of acceptance tests is because they are unwieldy. It's fine for a few months, but once you get a few tens of thousands of lines of code, you will end up with a lot of acceptance tests. These acceptance tests are extremely hard to refactor. It's extremely hard to remove duplication. Over time, they get more and more problematic until you are spending more time trying to figure out how to make your acceptance tests pass than you are trying to figure out how to make your production changes.
Unit tests, when written in specific ways, have less of a problem with this. Some people think about a "unit" as being a class. But really a "unit" is anything that you might want to isolate in your clamp. It can be a function. It can be a class. It can be a module. Your unit tests should probe the behaviour in the unit (and by "probe" I mean, expose the internal state). Michael Feathers has a great analogy of a "seam" which runs through your code. You try to find (or make) that seam and you insert probes to show you the state in various circumstances.
IMHO, you should write unit tests the exact same way you write any code. Your "circumstances" (or scenarios, I guess) consist of creating the data structures to give your initial state. Your "tests" consist of probing the state along the seams and comparing it to expected values. This is simple code. You should be able to maintain this code using the same tools you use to maintain any code. You should write functions. You should write classes. You should write modules. You should use all the tricks of your trade to reduce the complexity of your "test" code. Your goal is to create specificity when tests "fail" (the probe detects behaviour different than your expectation -- or the clamp detects that your wood has slipped). When behaviour changes, only a few tests (ideally one) should "fail". It should report the "failure" in a way that immediately describes the difference between the state you expected and the state that you probed. It should be easy to change the probe when the behaviour is intentionally changed (ideally changing only one place). It should be easy to probe new behaviour (just build your data and add an expectation). Finally, it should be easy to reason about the behaviour of the code by reading the "tests". Refactoring your tests and removing duplication is very important here.
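To make that concrete, here's a minimal sketch of the "build the data, probe the seam, compare to expected values" style in Python. The `ShoppingCart` class and the `cart_with` helper are hypothetical, invented only to illustrate the shape of such a test:

```python
# Hypothetical unit under test -- stands in for any small piece of state.
class ShoppingCart:
    def __init__(self):
        self.items = []

    def add(self, name, price, qty=1):
        self.items.append({"name": name, "price": price, "qty": qty})

    def total(self):
        return sum(i["price"] * i["qty"] for i in self.items)

# Helper that builds the initial state: plain code, reusable across tests,
# refactorable with the same tools as any other code.
def cart_with(*entries):
    cart = ShoppingCart()
    for name, price, qty in entries:
        cart.add(name, price, qty)
    return cart

def test_cart_totals():
    cart = cart_with(("tea", 3.50, 2), ("mug", 8.00, 1))
    # Probe along the seam: compare internal state to expected values.
    assert [i["name"] for i in cart.items] == ["tea", "mug"]
    assert cart.total() == 15.00

test_cart_totals()
```

The point is that the scenario builder and the probes are ordinary code: when the behaviour intentionally changes, you update one helper or one expectation, and a genuine slip shows up as a single, specific failure.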
As for acceptance tests, like I said, they are incredibly important. What I don't find particularly useful is a large suite of regression acceptance tests. The unit tests already tell me when the behaviour has changed. When written well, they even tell me exactly where in the code the behaviour "slipped". I often write manual acceptance tests. Once I have tested it, it is not necessary to test it again (as long as I have a good unit test suite).
My personal opinion as to why people find automated acceptance test suites important is because they have never worked with a good unit test suite before. There is a general lack of experience in the industry with these concepts. Quite a few people's experience with well tested code is with green field projects. Often these people leave after a year or so. It's not until you have a lot of experience working with the legacy of various testing techniques that you can understand the advantages and disadvantages. I think this is why Michael Feathers is so respected -- as far as I can tell he specialises in legacy code.
Having said all that (and I'll be surprised if you make it to the bottom :-) ), I do value a small automated acceptance test suite. It's my canary in the mine. If it ever fails, then I know I've really stuffed something up and I launch an immediate investigation. Also, there are some things that can't be unit tested effectively (for example testing a web application across both client and server) -- you end up faking the boundaries, which leads to the possibility of skewing. Again, in those cases, I try to find a few end-to-end tests that will hit the majority of possible problems.
I hope you found that interesting. I've typed up essentially this same message in at least 10 different threads over the past couple of years. I think it's slowly getting better, but I think I still haven't managed to explain the concepts as well as they need to be explained. If you've made it this far, thanks for reading :-)
I'm not sure if it covers the use case you're looking for, but they've already introduced the APIs necessary in Nightly (or Beta?) that allow the Panorama Tabs extension to be re-introduced :)
I have been using GitLab heavily for the last year and nothing terrible has happened in that time. The worst thing was that one day it was pretty slow, but that was fixed the next day and no data was lost.
GitLab, in my opinion, is now a better piece of software than GitHub.
I've been using Linux on the desktop for over 15 years with no intention of ever stopping in the near future.
I had been using Firefox since its first release, under the names Phoenix, then Firebird, then Firefox (I used the Mozilla suite before that), and I stopped, with no intent of ever coming back, when Mozilla finally killed what made Firefox useful to me after a lengthy agony. (BTW, the claim that Firefox wasn't performant enough 2 years ago is totally unsubstantiated: I used it daily with over 250 tabs open concurrently without a single hiccup, despite having 35 extensions loaded, as they were required to put back the useful features Mozilla had removed, to remove the unwanted cruft Mozilla had added, and to add the necessary features Mozilla refused to add.) I have now switched to Waterfox, and its name says it all.
So really, comparing GitLab to Linux on the desktop means GitLab will never happen, and comparing GitLab to Firefox means it will be mishandled into irrelevance by a shady finance operation aiming for market domination.
To me, GitLab seems a better alternative to a proprietary and centralized GitHub that would be bought at some point in the not-so-distant future; that has been my stance on this matter. That Microsoft is the one buying would not have been my first bet, but it's not a huge surprise either, considering their PR shift to jump on the open-source bandwagon as an attempt to extend their agony further.
> Recently (just as most of the world has apparently moved on from desktop computing, haha) Linux is pretty much fine for the traditional desktop computing.
It's been fine for the past decade, it's just the trope of "Linux on the desktop" that's slow to die.
Eh... it's mostly fine. It's just that when something doesn't work, you're screwed if you're not technically confident enough. Emphasis on confident - you have to trust your instincts when digging through internet fora for a solution. Something I can never see my mother doing, for one.
>Eh... it's mostly fine. It's just that when something doesn't work, you're screwed if you're not technically confident enough.
As opposed to Windows or OSX where you're just screwed.
There are rough edges in Linux on the desktop, but people seem to be completely house-blind* about Windows and OS X. If you spend a lot of time on Linux and go back to Windows or OS X, the rough edges in those platforms become immediately obvious.
* If English isn't your first language, house-blind is where you get so used to something being out of place that it becomes part of the decor. e.g. a jumper on the back of a chair that stays there for a week+.
But we have to put this in the context of an elderly lady who is terribly frightened of breaking the computer and always believes it's her fault when software screws up, because the feedback loop of computers is terrible (either absent, or opaque jargon, or marketing lies). That is, my mother.
And keep in mind that I live abroad. Helping her out remotely is a very difficult and slow process - if I lived close by the story would be different. In that context, when it comes to Windows or OSX, my mother has a lot of people who can help her out other than me - my sisters, my father (they're separated but still get along), some of her friends.
Now, my younger sisters are getting into programming (because all professions need it) so maybe I can get them into Linux too - they're definitely capable but the question is whether they consider it worth the investment. But still.
I'm not convinced. Chromebooks seem to be popular enough and simple enough for luddites. It's certainly a flavour of Linux; despite being a bit locked down afaict.
Maybe we're so used to expecting Year of the Linux Desktop to mean Year of the FOSS Linux Desktop that we ignore the successes.
> As opposed to Windows or OSX where you're just screwed.
Let’s not be disingenuous. I don’t know about Windows, but macOS has had absolutely wonderful critical failure recovery for a while now: There is the recovery partition, which acts like a mini-macOS and lets you do various things like drop into Terminal.app, use Disk Utility for drive scanning and repair, or do an ‘archive and install’ (extremely useful for the technically challenged) which keeps all your files but sets up a fresh macOS install. If even the recovery partition is borked you get the option of ‘Internet Recovery’, which connects to WiFi and automatically downloads and installs a fresh copy of macOS (with the aforementioned archive function, if an old install is detected).
Compare this to Linux, where you either get dropped into GRUB or a bare shell.
I quit Microsoft for Linux 8 years ago. I have been doing remote administration of the computer I installed for my old mother for the last 5 years. At home, there are 3 Linux computers, one of which dual-boots to Windows. My cloud website is on bare-metal Scaleway running Ubuntu. Over the last year, I have spent more time on system administration for Windows (which I use once a month, to run one application for 5 minutes) than for the administration of the 5 Linux systems.
If someone asked me to help with the administration of their Linux machine, I would accept, because it is so easy and so little work compared to Windows. I think Linux is perfect for the noob who is willing to delegate administration.
I'm happy for you and your mom! But you cannot compare that to the situation of my mother - I'm trying to make her more confident but it is a very, very slow process.
It's not that I'm not willing to help, but I live abroad. If I lived close by, I would gladly install something like Ubuntu or KDE Neon on her machine (probably Ubuntu though - the mainstream would make it easier for her to find things on her own).
Whenever I'm home I help her with her computer. The whole thing is very educational for me as an interaction designer as well. It often shows how modern interfaces make her feel like she is the dumb one, when honestly it's often the arrogant UI designers who think everyone is on board with modern UX paradigms. Or worse, abuse dark UI patterns for evil purposes.
Ubuntu GNOME is closer to the "simplicity" of Windows XP than Vista, Win7, or Win10 are. The changes in Windows are too big and too fast for old users. My mother is still on Ubuntu 14.04 to keep that stability.
Honestly, I just came home to see that somehow Bing search had installed itself over DuckDuckGo (my mom loved the search engine for the name alone), and that her Chrome browser had turned into a touch edition which hides the mouse cursor. And that was the least offensive change.
Trying to fix her Lenovo Yoga I had to navigate a forest of dark UI patterns, with pre-installed apps trying to trick me into sharing private data every step of the way.
I really fucking hate this user-hostile attitude that can only be explained by greed. I've "fixed" her computer one more time, but I think the next time I'll let her try a bootable Linux distro, and see if she likes it enough to be willing to give it a try.
"Fine"... that seems to have little to spare already.
I switched from Windows to Ubuntu last December, and it has given me a whole new appreciation for Windows. The polish (things just working well) of recent Windows versions is just amazing, in comparison to Gnome/Ubuntu.
PS. Will stick with Ubuntu though.
PPS. Gitlab is an awesome product and company.
>GitLab is a lot like Firefox, or "Linux on the desktop" in that way. It's what a lot of us want to use, but the less-open but more-polished option has always seemed the more pragmatic choice. But that can change.
I wonder if this move by GitHub is motivated by them seeing this writing on the wall.
a16z, Sequoia and friends got their liquidity event from $350+ M invested. Looks like a (probably $3.5+ B) 10x or better ROI, which should help VC’s IRR numbers.
Your numbers are a bit off. The fundings were done at roughly $750M and $2B valuations. Still going to be great for investors, but most likely not 10x or better.
Same here. The CI is easy to use and makes sense, though it lacks some features - for instance being able to automatically run manual jobs as soon as their predecessors complete. Now I have to wait (!) and then click so the manual job starts...
But all in all, good product, I hope they succeed!
I think GP means that you know that for this one pipeline related to commit X you want manual step Y to proceed once previous steps are ok, and you know it right now, so instead of waiting for previous steps to complete you want to trigger the step in a delayed fashion. Kind of like "merge this as soon as tests pass" on MRs.
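For context, this is roughly what a manual job looks like in GitLab CI today (a minimal `.gitlab-ci.yml` sketch; the job names and commands are made up). The `when: manual` job just sits waiting for a click once its predecessors finish, with no built-in way to pre-approve it while they're still running:

```yaml
stages:
  - test
  - deploy

run-tests:
  stage: test
  script:
    - make test

deploy-prod:
  stage: deploy
  script:
    - make deploy
  when: manual   # waits for a click in the UI after run-tests passes
```

The feature being asked for would be something like clicking `deploy-prod` while `run-tests` is still running, with GitLab starting it automatically once the earlier stage goes green.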
> Linux is pretty much fine for the traditional desktop computing
Yeah, on the desktop things are getting better. But ... everybody is moving to the smartphone. And there things are getting worse. For example, my Banking app works only on 2 platforms, which are not open.
> Recently (just as most of the world has apparently moved on from desktop computing, haha) Linux is pretty much fine for the traditional desktop computing.
It's exactly because the world has moved away from desktop computing that Linux on the desktop has become viable: collaboration tools are increasingly web-based (or are at least web-enabled for the 80% use case), and those tools are exactly what tends to anchor an organisation on a single platform. These days, even Outlook has a perfectly usable web interface that works fine in Chrome and Firefox.
Off topic but I couldn't get hibernate to work on Ubuntu Linux. I purchased a laptop with Linux pre-installed; but apparently, if a manufacturer claims to support Linux on a laptop, that does not include hibernate support. Now that we've reached the milestone of Linux on a desktop, I'm looking forward to the Year Of Linux On The Laptop.
People here are deluding themselves. GitLab, without gobs of VC money and the promise of a big IPO payday, is an abandoned open-source project with a tiny fraction of the team that built the code. So yes, you can fork the code, but without the money and resources of the parent company, good luck keeping it up to date! Worst case, you get another MySQL: neglected by an acquiring company, with lots of bureaucracy and infighting and IP tangles to slow things down.
Gitlab is essentially salting the earth for dev tool startups. I had my issues with Github, but at least they had built a business around a dev tool, behaved ethically and gave back generously, and so I wished them well. To see so many people dropping them for a fauxpen-source competitor whose primary selling point is “it’s free!” just makes me sad.
If you want nice things, you have to pay for them. If you aren’t, I guarantee you that someone else is, and they’re the ones with control.
> I had my issues with Github, but at least they had built a business around a dev tool ...
That's a strange claim given that in the current top story - of Microsoft buying Github - the following claim is made:
"The acquisition provides a way forward for San Francisco-based GitHub, which has been trying for nine months to find a new CEO and has yet to make a profit from its popular service ..."
Perhaps it comes down to your definition -- can something be non-profitable for a decade and still be called a business?
The only “axe” I have to grind is the one I clearly stated: Gitlab is salting the earth for dev tools. You can’t build a business competing with someone who is using VC money to give away their product. This is all going to end badly when the music stops.
”can something be non-profitable for a decade and still be called a business?”
This is such a bizarre talking point...do you honestly believe that Gitlab is a better business? Their model is “just like Github, but with even more stuff given away for free!” And let’s not forget that Github has to compete with Gitlab cannibalizing the low end of the market. I’m sure that hurts margins.
Someone has to pay those Gitlab engineers who are writing the bulk of the code. At the very least, as soon as the dumb money dries up, the velocity of development on Gitlab will drop like a rock. In the worst case, you'll get a conflicted corporate hydra, like MySQL.
I understand that you're claiming gitlab is salting the earth, but still don't understand why / how.
You write:
> You can’t build a business competing with someone who is using VC money to give away their product.
This is delightfully worded, given it could apply to both github and gitlab.
Remembering that github started in 2008, while gitlab.com started four years later (first commit to their codebase was 2011).
Github is running on $350m of VC funding.
In response to my question 'can you call a 10yo company that still isn't profitable, a "business"?', you avoided the question, called the matter bizarre, and tried to distract from the question by claiming github is a _better_ business.
Your claim that github has 'built a business and .. gave back generously' is also weird in that gitlab has released the source to their core product, but github hasn't. This also speaks against your claim that you're more likely to be abandoned if you commit to gitlab than github.
Finally, the idea that the 'low end of the market' is where all the money is does not match any other tech startup's experience, is belied by the pricing structure of both companies, and further invalidated by the fact that gitlab is not swimming in cash from their cornering of the frugal user segment.
And what that means is, yeah, either they keep burning $$$ every month and selling more of the control to VCs to feed the war chest until they maybe buy 2nd place, find an acquirer (and with that much ever-increasing VC control, a likely push), or yeah, layoffs will happen. Gitlab is extra interesting because their definition of innovation is biting off even more surface area (e.g., CI), and therefore even more burn.
Keep in mind.. all this says zero about how nice the product quality is or how friendly the people are. But just in the same way you don't get mad at what happens if you stick your hand in a lawn mower (https://www.youtube.com/watch?v=-zRN7XLCRhc&t=34m7s) ... there are financial forces at play from being a high-spending bottom feeder that are hard to escape. Possible, and I wish them luck, but that's a real bet.
AFAIK, Github went for growth. Gitlab went for cash flow. Gitlab is profitable and, imo, their product is comprehensively superior to Github.
>Keep in mind.. all this says zero about how nice the product quality is or how friendly the people are.
Then don't use the term bottom feeder, since that implies the people are making a shitty product with no real drive to innovate. It says the people are shameful hacks and the quality of the product is bad.
In reality Gitlab is a better product and the people involved should be proud of their work.
I don't think their official statements match that? They say their fundraising approach is 2yr runway, which is only 6mo longer than the advice for a regular VC-backed startup, and they've been raising increasing amounts ~annually.
Based on that, having 275+ employees, and their stated IPO targets, I ran the numbers recently. My guess was their costs are ~$40M/year (admirable: I expected way higher, but they focus on non-US hires and pay only the 50th percentile in _local_ markets: super low!). Likewise, their stated IPO and growth targets make me guess they make ~$20M/yr. So two different reasons to believe they're burning... ~$20M/yr. The positive thing for them, which they're not public about but I'd guess, is while they're probably growing OK in regular accounts (hard competition vs bitbucket, github, etc.), they're probably Super Great on retention + internal expansion, so net negative churn, compounding factors, etc. I think they _can_ stop hiring and let revenue catch up, though other forces take hold then: so it does look like they're on the classic growth-over-control VC treadmill (despite saying they're not), and will keep ceding control to VCs.
I think you may be correct and my information was out of date. According to the strategy documents that Gitlab publishes they seem to have changed direction towards growth via SaaS:
"""
During phase 2 there is a natural inclination to focus only on on-premises since we make all our money there. Having GitHub focus on SaaS instead of on-premises gave us a great opportunity to achieve phase 1. But GitHub was not wrong, they were early. When everyone was focused on video on demand Netflix focused on shipping DVD's by mail. Not because it was the future but because it was the biggest market. The biggest mistake they could have made was to stick with DVDs. Instead they leveraged the revenue generated with the DVDs to build the best video on demand service.
"""
The term bottom feeder refers to going after the "leftovers" that premium market leaders leave on the table: lower-paying, more demanding (e.g., requires open source), higher acquisition cost (closeted international markets), etc. Good B2B companies often raise prices as they deliver more value and build brand trust, and as they establish the market, bottom feeders will pop up and spot the missing chunks. However, they are forced to play catchup in terms of features and with less $ (or a LOT of VC $). Says nothing about being nice, smart, and high quality, just the market & financial pressures.
No label is ever 100% accurate, but a lot of that dynamic has played out here pretty clearly.
> In response to my question 'can you call a 10yo company that still isn't profitable, a "business"?'
Gitlab also likely runs at a loss. Gitlab has certainly never claimed to be profitable and some estimates are that as few as 0.1% of their customers pay for Gitlab.
> I understand that you're claiming gitlab is salting the earth, but still don't understand why / how.
It's pretty clear to me at least that neither Github nor Gitlab have sustainable business models. The OSS community is crazy to think that either business will continue to subsidize OSS development while losing millions of dollars a year. All of the anger against Github and the new "faith" in Gitlab is pure delusion. Both these companies subsidize OSS development while losing millions of dollars. This will go on until it stops. It certainly can't go on forever.
Personally I suspect the absolutely best thing to happen to both Github and Gitlab would be being bought out by real companies that heavily depend upon OSS and, you know, actually make money.
It came up before and now the chatter has started up again around Gitlab. I think it still makes a lot of sense for AWS to purchase Gitlab. There's a fundamental strategy alignment there (both Gitlab and Amazon aim to be a "one stop shop"), Gitlab offers the potential to lure a bunch of developers into the AWS platform with a free offering and, ultimately, Gitlab offers the same computational economics as other Amazon products because it is just another hosted product that requires a database. Wouldn't be surprised at all to see such a transaction in as little as 2-3 years.
Wouldn't a company like Gitlab be able to sustain a decent engineering team by just selling a few dozen top-tier subscriptions for their on-premises offering to top Fortune customers who are often still too afraid to have their crown jewels hosted in the cloud?
I would say gitlab is more closely aligned with Google, at least technically, with their auto DevOps targeting kubernetes, and Google cloud having the most 'turnkey' k8s offering.
”Your claim that github has 'built a business and .. gave back generously' is also weird in that gitlab has released the source to their core product, but github hasn't. This also speaks against your claim that you're more likely to be abandoned if you commit to gitlab than github.”
Uh... Gitlab is built upon libgit2, rugged and github-linguist. In other words, the core parts of Gitlab, the ones that interact with git, are built, maintained and open-sourced by GitHub. And these are just the obvious dependencies. GitHub people contribute heavily to open-source projects that most Ruby websites use.
If you’re going to fanboy all over the place, fine, but at least know what you’re talking about when you do it. And don’t try to weasel out of it by talking about “core products”: without GitHub’s substantial technical contributions to the infrastructural code that interacts with git, it’s a safe bet that Gitlab wouldn’t exist. That’s core.
> If you want nice things, you have to pay for them.
And I don't know how that fits in with people releasing / maintaining free software.
I responded to your first rant because you appeared to be 'going all fanboy' over github, declaring them a successful, superior business. I asked you if a company that hadn't turned a profit despite first mover advantage and a decade of trying could be termed a business ... and you weaselled out of that question.
> If you believe Github isn’t a business, then you’re going to be sorely disappointed by Gitlab, whose business model is worse.
The challenge discussing this with you is all your comments about Github are based around comparing them (favourably) to Gitlab.
> I'm done talking to you now.
This is a shame, as I'm consumed with curiosity on your take of today's news that Microsoft spent US$7b buying github.
From what you've described it sounds like they should have just cloned libgit2, rugged, and github-linguist, and rattled up their own gitlab over the weekend.
MySQL is a great example. Bought by Oracle, still a good product, but also forked by some big players as well as some open source groups. I'm sure it is still the most commonly used database on the web today, and MariaDB and Percona both maintain great MySQL forks as well.
The MariaDB story is a bit of a fuck you though - cries about how oracle will close mysql (which hasn’t happened) but then adopts a bullshit license for its own software.
But the damage is already done - people think MariaDB is some bastion of good intentions and open source software now, because they very rarely look deeper.
> The MariaDB story is a bit of a fuck you though - cries about how oracle will close mysql (which hasn’t happened) but then adopts a bullshit license for its own software.
What?
There was strong precedent for fearing what may happen with MySQL. Knowledge of what happened to Hudson, OES, OpenOffice, Solaris ... this would concern the stewards of any bit of software that got swallowed up by Oracle.
(Edit: Also I recall some worrying stories coming out from Monty and other key developers.)
What's this 'bullshit licence' that MariaDB has? I thought the source was (L)GPL all the way down?
I've looked up MariaDB MaxScale ... and found an optional / add-on product that is aimed at Enterprises, seems to require an Enterprise support licence for the Enterprise edition of MariaDB ... and I completely fail to see how any of this demonstrates that the 'MariaDB story is a bit of a fuck you'.
Basically - their formerly GPL proxy for doing HA deployments is suddenly not open source.
They can of course make this decision - it's their code to do with as they wish. But it's quite fucking rich for Monty to claim Oracle will close source MySQL, create a fork and company which then uses that fear to grow in popularity, only to do the very thing he accused oracle of doing: closing an open source product.
Edit:
Also, if you think only "enterprise" customers need database clusters that survive individual node's being offline, you're in for a big shock.
I personally have not and will never use MySQL again because Oracle owns them. That is a company where software goes to die. Plus their atrocious security record.
Indeed the most extraordinary story of the last ten years is how Google, Oracle, Redhat, Microsoft and Facebook have funded open-source software to the tune of billions. This is likely the greatest act of charity the planet has ever known. And while a lot of holier-than-thou types (particularly here on HN) imagine these tech giants as not to be trusted or even the enemies of OSS, the numbers don't lie. Look closely at who actually funds and writes the vast majority of OSS and the same five companies pop up over and over and over...
> Indeed the most extraordinary story of the last ten years is how Google, Oracle, Redhat, Microsoft and Facebook have funded open-source software to the tune of billions.
Definitely not the most extraordinary story over the last decade.
And trumped by IBM's famous first $1b spend on 'Linux' just shy of twenty years ago, and their subsequent announcement that they'd recouped that money within a year.
Coincidentally this speaks to your claim:
> This is likely the greatest act of charity the planet has ever known.
These guys aren't in it for the charity. There's doubtless plenty of positive PR spin from contributing to free software -- but don't mistake pragmatism or happenstance for altruism.
You're confused if you think that just because one benefits from charitable actions, that somehow invalidates them.
And IBM's contribution was, frankly, marketing. It does not compare to the volumes of high quality technology that the companies I mentioned have simply given away for free.
Many on HN and others are perhaps too close to it but I think people will look back upon this extraordinary corporate charity as a decisive event of the century.
> You're confused if you think that just because one benefits from charitable actions, that somehow invalidates them.
I think you're being overly charitable to think these tech corporations had charitable intentions when they contributed resources to tech projects that happened to improve their tech business prospects.
> And IBM's contribution was, frankly, marketing. It does not compare to the volumes of high quality technology that the companies I mentioned have simply given away for free.
Bizarre you didn't mention that up front when you named 'the big five contributors'.
On what do you base your bold claim that IBM's contribution was marketing, and the other corporations weren't?
> Many on HN and others are perhaps too close to it but I think people will look back upon this extraordinary corporate charity as a decisive event of the century.
IBM announced their first billion spend last century.
Maybe we should first ask ourselves if it's a fair comparison. Amazon kept investing the profits into themselves. I don't think that is the case with Github though.
People are on GitHub not just to share code, but also to get in touch with each other. GitHub succeeded because it's not just a free online Git repo service, but also a developer + user community where you can put your code, share it, and 'earn' stars & forks as feedback. And stars + forks can help you stand out in a job interview and on many other occasions.
Bitbucket is another Git repo service, but it sort of failed to build its community. Result? It received less attention compared to GitHub.
So, while GitLab is also trying to be 'yet another' Git repo + you can host it on your own, the benefits of being a community can't be ignored. And federation can help with that by connecting all the GitLab instances together to form a bigger, global community.
Even better, the federation protocol itself could be an open, public standard, so all the other Git repo software could implement it in their products. The potential is huge.
Unlike GitHub, Atlassian isn't just working on Bitbucket but on many other tools for developers as well. This includes JIRA, Confluence and so on. GitHub is mostly focused on repositories. This is why devs had to write a letter to GitHub to tell them: hey, can you guys seriously work on the Issues system? With the resources of Microsoft it will not surprise me to see a much more open and capable GitHub.
Except that Atlassian aren't really working on Bitbucket any more. There's lots of issues with the system that are not being resolved. Last time I had a problem with Bitbucket I found they had known about it for months and done nothing, with no plan to do anything either.
But then, as I understand it, Bitbucket was an acquisition rather than a brainchild of the Atlassian team, so you can expect a certain amount of neglect.
Atlassian has a lot of people working hard on Bitbucket. They shipped pipelines, deployments, code aware search, new UI, git lfs, embedded Trello and a lot more while improving uptime and performance, getting soc2 type 2 certified and a massive amount of other stuff.
They might not be working on the bits you care about, but they're definitely working on it. I agree, companies should invest a few more resources in fixing bugs and not just in adding new features.
> With the resources of Microsoft it will not surprise me to see a much more open and capable GitHub.
Devil's advocate: why would Microsoft invest in improving, say, the issues functionality of GitHub when it could instead integrate and push users towards its existing products and tools for project management, like SharePoint?
Because they develop some of their core open source products right on top of GitHub. Visual Studio Code, .NET Core (and dozens of core libraries for .NET / .NET Core) as well as the less popular things they work on in the open.
Yes, because they were losing out as web development exploded, and realized no one wanted to pay for an editor when there were so many great free alternatives. They did it out of necessity, not out of innovation. Just like everything they do when it comes to their "open source movement".
I think this needs to go even further. If federation is done using an open standard like ActivityPub, then any services using the protocol would be able to federate. At that point it wouldn't matter what people are running, they'd all be able to talk to each other.
I've been reading into ActivityPub over the last week, and I don't think this would really be the case (but I'm happy to be wrong).
For example, if a Gitlab instance posts a Pull Request object over to a Mastodon instance, what is the latter supposed to do with that? Presumably it won't have any UI widgets to display the content, and no way of acting on it semantically.
As far as I can tell, ActivityPub is a way of federating instances of the same application, with the same semantics. But on the internet I'm seeing some dialogue which seems to presume that AP would make it possible to federate instances of all sorts of different applications and have them all Just Work™.
(Apologies if I've misunderstood your post, I'm kinda rambling at this point)
ActivityPub is a very loose spec, and obviously it doesn't just magically work across random services. You would have to implement support for the specific types of notifications for them to be meaningful. An example of this is PeerTube federating with Mastodon, as seen here https://peertube.cpy.re/videos/watch/da2b08d4-a242-4170-b32a...
For example, GitLab projects could publish their activity feed to Mastodon. You could follow it to see what commits they make, issue discussions, and so on. Meanwhile, federating things like pull requests would happen across Git-based services. So, if Gogs decides to implement a compatible ActivityPub protocol, then it could integrate into the federation of GitLab servers.
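To make the activity-feed idea concrete, here's a minimal sketch in Python. The instance URLs, actor names, and the idea of wrapping a push as a plain Note are all illustrative assumptions; GitLab publishes no such protocol today.

```python
import json

# Hypothetical sketch: a GitLab push event expressed as an ActivityStreams 2.0
# "Create" activity, the kind of payload that could be delivered to a remote
# follower's inbox. All URLs and names here are made up for illustration.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://gitlab.example/users/alice",
    "to": ["https://gitlab.example/projects/widget/followers"],
    "object": {
        # A plain Note, so that a generic server with no Git semantics
        # (e.g. a Mastodon-style instance) can still render it as text.
        "type": "Note",
        "content": "alice pushed 3 commits to widget/master",
        "attributedTo": "https://gitlab.example/users/alice",
    },
}

def serialize(act: dict) -> str:
    """Serialize an activity for POSTing to a remote inbox."""
    return json.dumps(act, indent=2)

print(serialize(activity))
```

A server that doesn't understand pushes or merge requests can at least display the Note, which is the same lowest-common-denominator interop the PeerTube/Mastodon example relies on; richer semantics would only work between services that agree on Git-specific object types.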
Yes, federation would be great for git. I just hope that an open protocol is designed rather than something GitLab-specific. GitLab takes a lot of resources to self-host.
Would be nice if Microsoft gets onboard for such a protocol. Especially once they own GitHub. Would also be funny if Microsoft Open Sources GitHub much to everyone's surprise.
I think it's important to clarify that even though GitLab's core is open source, its own business model is built on the idea of an enterprise version that includes proprietary code.
Many useful "advanced" features, such as "squash and merge" are only included in the EE.
In any case, this highlights once again the issue: the community will have to put enough pressure on the parent company to release certain improvements as Open Source. And their interests are most likely not aligned.
But doesn't the fact that the feature you mentioned has actually been open sourced suggest that those interests might well be aligned? IIRC, GitLab's open sourcing policy is something along the lines of "if it's useful to small or open source projects, we open source it, whereas if it's primarily useful to large enterprises with >100 employees, it goes into enterprise edition". That sounds like a pretty viable way to both have a useful open source project and raise the money required to build and maintain it.
If they don't, it can always be forked though. I think that's why they eased up: they realised that GNOME would just fork GitLab if they didn't release this feature.
GNOME already planned to self-host and have enough of a community to maintain a close fork.
I think the bigger picture here is code hosting. You cannot fork that, and TBH the value of moving from Github to gitlab.com is the hosting alternative, which is not really open-source.
I've set up gitea for myself. Both are fairly similar, except that gitea doesn't have a single point of failure. Gitea has a feature comparison chart on their site[1] if you wish to compare the two directly. The fork arose from the single maintainer not working to give some control to the community.
That is a pleasantly honest comparison table, even listing the features they don't (at least currently) have. A refreshing change not to need a third party assessment for that.
gogs was more of a one-person-in-command project (not sure of its status now); gitea is more of a team effort. I use gitea, works great all around, perfect for small to mid-size needs, totally resource-light when you self-host; a $5 VPS will serve well, or a small local VM.
All these alternatives seem to use Github as the main code repository. That's a really bad smell. Why don't they host their codebase in their own service?
Yeah no, they really need another name. Bitbucket is known. Everybody talking about gitbucket will have to make it twice clear that it's not Atlassian's bitbucket.
For me, the one thing keeping me on Github for most of my public projects has so far been discoverability. I use Gitlab for all of my private projects.
To the extent that I think really the only thing I'd want to see to not need Github any more would be some sort of federated social/discoverability/search layer similar to what you're describing.
GitLab has taken VC money too. Including from a Google-connected investment group. There's no guarantee they won't be bought out etc. some day. They too are looking for a big exit for investors.
Their commitment to free/libre/open values is not 100% but is far more than GitHub. The fully-FLO CE part is truly a completely usable product. They have zero proprietary client-side JavaScript. They publish the source for their proprietary stuff (last I checked anyway) but have a restricted license. And they've actively worked with the FSF and volunteers to meet the GNU ethical repo criteria https://www.gnu.org/software/repo-criteria-evaluation.html
So, GitLab is not quite Mozilla (which is itself not quite Wikimedia or GNU level dedication). But GitLab is still a standout in FLO-commitment compared to the mediocre norm.
> The whole point is that some people do want the control that comes with self-hosting, and GitHub does not provide a way to do that.
But they do though. It might be significantly more expensive than GitLab and also only sold in packs of 10 user licenses and also not allowed to be run in a public-facing capacity. But they definitely do have a self-hosted option.
Actually, unless something recently changed, GitHub does: it's the enterprise edition, and last time I checked it was like $10k a year. Yes, not reasonable for most people, but it _is_ an option for people that want the control that comes with self-hosting. Its feature set is also always lagging behind normal GitHub. The place I work used to use it (don't ask why). I think it's per-seat licenses, but I remember it working out to ~$10k for us and we've got < 40 devs.
Easier and faster to evaluate gitlab.com and determine if they like the product or not before setting up a self-hosted version.
In fact I have been literally considering migrating our internal GoGS install to GitLab for the last week or two.
The end of my day on Friday was downloading Gitlab and figuring out where to host an evaluation install.
Migrating my personal account over from Github to GitLab.com is a good chance to get some hands on time. Plus I can consolidate my personal CI setups at the same time, and I don't have to pay monthly for private repos.
Win-Win-Win.
PS: Always interested when your content comes up in my RSS reader. I do wish it was easier to share links without the unsafe content warning though. :P
That sounds like something just waiting for copyright abuse and its ensuing enforcement hell. "This GitLab instance has been shut down permanently by the FBI and ICE"
GitLab's recent performance has been abysmal. We recently moved from a self-hosted git solution to GitLab, and while the CI, 'namespacing' and issue tracking are truly great and well thought out, we've had entire days where the team was unable to deploy because the CI workers did not run (even though we host the workers ourselves), and therefore the artifacts for deployment were never generated. And nearly every day, pushes take minutes to complete, as opposed to a few seconds with GitHub.
If anything, I hope that Microsoft's acquisition of GitHub means that GitHub is going to keep growing in features for varied enterprise uses, and that we're going to see even more competition in this area.
I'm sorry that you had a bad experience with GitLab.com self-hosted runners and pushes. I can't explain the CI runners not working for entire days. Pushes to GitLab.com should not take minutes. They do take longer than pushes to GitHub.com, and we're working on performance improvements, including deprecating NFS for Gitaly and more performant size checks that just got merged.
A big problem seems to be stability/error reporting and averaging of statistics. I've frequently had the following experience:
- I can't push or something in general goes wrong with one of my repos (but not others).
- Gitlab's status page is green
- Other people are having issues on Twitter and tweeting @gitlabstatus about it, but there is no general across-the-board outage
This seems to indicate that Gitlab tolerates (and very often has) a reasonable amount of instability and errors across its platform, but just takes the average of these as a baseline of performance: i.e. it's a very spiky graph with a reasonably high average line fit.
"Errors should be down to normal" - the idea that there is an non-zero error rate that is openly described as "normal" is worrying. Not that I'd expect a constant zero error rate, but at least aiming for it should be a consideration.
It sounds like you've never worked on a global-scale service.
Services at this scale will have errors for all sorts of strange reasons, it doesn't mean the service is poorly engineered. In fact, if users don't notice these problems it usually means the service is resilient and robust when it encounters strange situations.
Consider a really simple example, such as making a breaking change to your service API. Now what happens when a user doesn't refresh their web browser and continues running javascript that doesn't work against the new API? This can happen with smaller services, but the odds of it happening are much higher when you are at global scale.
There are other strange problems that come with large services which means all components should be fault tolerant if possible.
You’re conflating two separate things: internal and user-visible errors. While it’s true that errors are inevitable, robust systems try to handle the latter gracefully with minimal disruption. If the person you replied to is accurately describing their experience a system which has significant unrecovered user-visible errors which aren’t acknowledged has serious robustness issues.
Also, please don’t make disparaging comments about other people’s experience unless it’s highly relevant. It doesn’t add anything and will likely derail the conversation.
OP's post indicates that the metrics are poorly engineered.
As per the really simple example: generally you'd be better off rolling out a second endpoint for the new api and then stop serving responses that use the old one. First this doesn't break everyone who had your page up, and second you can stop rollout safely if you find a problem with the new api.
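A minimal sketch of that rollout pattern, framework-free Python with hypothetical endpoint paths and handler names, just to show the shape of serving both versions during a grace period:

```python
# Sketch of the "second endpoint" rollout: /v1 and /v2 are served side by
# side instead of changing /v1 in place, so in-flight browser sessions that
# still speak the old API keep working. Paths and response shapes are
# illustrative assumptions, not any particular service's API.

def handle_v1(params):
    # old response shape that already-loaded clients still expect
    return {"user": params["name"]}

def handle_v2(params):
    # new response shape, rolled out alongside (not on top of) v1
    return {"user": {"name": params["name"], "id": params.get("id")}}

ROUTES = {
    "/api/v1/user": handle_v1,
    "/api/v2/user": handle_v2,  # once v1 traffic drains, remove v1 from ROUTES
}

def dispatch(path, params):
    """Toy router standing in for whatever HTTP framework is in use."""
    return ROUTES[path](params)
```

The point is that "stop serving the old endpoint" becomes a deliberate, observable step (removing the route once its traffic drains) rather than an accidental breakage of clients that never refreshed.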
> Services at this scale will have errors for all sorts of strange reasons, it doesn't mean the service is poorly engineered.
Of course, and as I said, zero errors is not practically achievable in this type of context. The issue is with metrics though: the idea of taking averages instead of looking at troughs is still problematic.
> In fact, if users don't notice these problems it usually means the service is resilient and robust when it encounters strange situations.
True. But in the case of Gitlab, users are noticing these problems. Constantly. It's just Gitlab's own metrics that could be (I've not done more than browse their Grafana instance a bit, so my comment is somewhat speculative) ignoring the problems because they're focused on averages instead of specifics or thresholds.
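The averages-versus-troughs point is easy to demonstrate with a toy calculation (the latency numbers are made up for illustration):

```python
import statistics

# A mostly-fast service with occasional multi-second spikes: tracking only
# the mean makes the graph look fine, while a high percentile exposes the
# spikes users actually complain about on Twitter.
latencies_ms = [120] * 95 + [4000] * 5  # 95 fast requests, 5 bad spikes

mean = statistics.mean(latencies_ms)
p99 = sorted(latencies_ms)[int(len(latencies_ms) * 0.99) - 1]

print(f"mean: {mean:.0f} ms")  # prints "mean: 314 ms" - looks tolerable
print(f"p99:  {p99} ms")       # prints "p99:  4000 ms" - the real story
```

This is why alerting on a threshold at p95/p99, rather than fitting a line through the average, is the usual recommendation.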
> Consider a really simply example ...
lallysingh has already pointed this out, but I'll reiterate that this is a very apt bad example. You're right that ideally components should be fault tolerant if possible, but frankly that's a big ask. Especially for highly-scaled services supporting many many components of various types - ensuring that all of those components are completely fault tolerant is much more difficult than simply ensuring the old API continues to operate for a grace period while the new one is served from elsewhere.
I think your example is apt, because it's indicative of a common excuse for bad engineering: the assumption that downtime or disruption is necessary because of necessary software upgrades/improvements and poorly planned orchestration.
Do you publicly document your performance improvements? It would be cool to have a chart showing time to push or something, and let people see that trend go down as you are working on it. It would inspire confidence. Like others have said, you have had dealbreaking performance issues for a long time now.
I like your idea. However, few performance problems are global. We have a public monitoring dashboard at https://monitor.gitlab.net/. Embedded in this dashboard are various metrics which will often show a drop in response time if we improve performance on a particular item. We usually find a page or set of pages that hit a particular bottleneck and improve that one point. Also, you will usually see mention of specific performance improvements in the changelog (https://gitlab.com/gitlab-org/gitlab-ce/raw/master/CHANGELOG...) and in our release blog posts.
Yea, I haven't used it myself, but the reports are that it works better than the original PromCache proxy. It's been on my TODO list for a while, but way lower on the priority list.
But you know, when the internet decides it's time for everyone to look at your site, some random new stuff might be better than serving 5xx all day. :-D
At this very minute my team is unable to deploy (and blockers are therefore accumulating) because of issues with Gitlab. We have an on-hold plan to migrate off Gitlab (even though we just migrated to it!) and while I'd love to stay on Gitlab it's becoming very hard to justify.
Sorry to say it like this but you’ve been working on your performance problems for years now and you’re still at the same place. I think your problems run much deeper than that.
Their gitlab.com website is much faster than a year ago. A year ago I moved all my repos from GitHub to gitlab because I had to cut some personal costs. I remember it took a while to load pages when navigating around the site. A week or so ago I logged in for the first time in a long, long time to set up a project to share with someone to test some ideas. I was surprised that I wasn't waiting for pages to load. It was much faster than it used to be. Still room for improvement, but I did notice it was much faster.
So while they still have improvements to make it would be a lie to say they haven’t improved at all.
Even with the influx of users due to this I was able to not only setup a repo, but push all code up, and deploy via GitLab CI all within minutes... Speed is very good. I don't notice a difference between it and GitHub.
> Also, I don't think GitLab has had a long downtime recently.
That mostly depends on whether you're using CI/CD, I'd think; that's had some day-long outages/problems lately. Of course, GitHub doesn't even have its own CI/CD, and GitLab's is amazingly flexible, so it's still the better product. But it'd be nice if it were more stable.
(Note: all this is on GitLab.com. If you self-host, it's presumably much better.)
I don't encounter all of them myself - it depends on what I'm working on and perhaps also the timezone. That said, April 26th was the most recent occurrence for me where I was very happy I wasn't in the middle of a production deployment that I would have had to roll back. See the status updates on that day on https://twitter.com/GitLabStatus
(I am using the free tier though, so this is more informative than that I'm complaining.)
I first tried migrating to GitLab when the public cloud first came out and abandoned it due to performance.
However, I re-evaluated and did migrate about 2 years ago and it has been fine during that time. There have been a few hiccups, but not for more than an hour or so. I've had a team of 4-7 devs working in it all day for the last two years and we have not had performance problems. We run our own CI runners as well, and while the cloud runners do often have delays, I've never had issues with delays to my own runners unless they were all busy.
I love GitLab and it’s UI, but recently the performance of the hosted version is awful (not sure why - just being overloaded?).
In fact, even their own status page reflects it: https://status.gitlab.com/ - the current “project HTTP response time” is around one second which makes me cry when using the UI.
I wish them the best but would be moving to a competitor (or maybe a self-hosted GitLab) in the meantime until they sort it out.
Noticed this too recently, makes me more wary to import my projects there. Just browsing a repo is painfully slow. Although I've noticed the same on github for large source files or large repos.
It reduces the complexity of your ops environment. Not the OP, but we do the same thing (though not with GL). When you only have a couple of developers, it makes sense to keep everything in house because your cost is essentially a couple of hours keeping things up and running as well as having an extra development machine somewhere. When you are a large organisation it also makes sense because you have a whole bunch of ops people keeping things running. Somewhere in between there is an awkward point where you've got enough complexity that you'll need to hire an ops person to handle it, but you don't have the organisational infrastructure to deal with that hire. Outsourcing is actually less risky because you're essentially piggybacking on somebody else's large organisation. A single bad hire isn't going to sink you, for example.
What's the alternative? Writing your own CI/CD system from scratch? You're going to be relying on some external dependency for important things anyway, you just have to pick one that is dependable.
I'd think a happy medium would be using an open source CI system and testing new versions on a test server before deploying them to prod.
Then again, I come from a largely non-web background where external dependencies aren't just accepted as inescapable. I guess if your entire business is producing an add-on for some other company's web service (not saying yours is but many out there seem to be) then what's one more on the pile?
That's exactly what their CI is: an open source CI system that you can deploy on your own server (and plug into either gitlab.com or your self-hosted instance of Gitlab).
The focus is on long term freedom here, not occasional performance issues. People who are migrating projects right now to GitLab are implying that Microsoft will push the site's policies in unwanted directions. And the frogs who did not jump out in time will be boiled to death in the slowly heating water.
Sorry about that. The 502 was only on our public monitoring dashboard for a short time. GitLab.com itself is up and running and the monitoring dashboard is back online now.
It went down again for a short time. We're continuing to monitor and adjust resources as needed. We weren't expecting this traffic to our monitoring dashboard, but it's great that so many people are interested in taking a look.
It sounds like it's a separate, single instance. Definitely doesn't use the same infrastructure as gitlab.com itself (which is a good thing, since that's what it's monitoring), nor is it built to be scalable really. So, no great surprise that HN-level traffic overpowered the instance.
I do hope they're using Prometheus federation to expose this instance to the fickle internet and that they have one or more internal Prometheus instances that aren't directly queried by this instance. After all, that stuff is responsible for paging if something goes wrong in prod.
It is a separate instance from our internal one. We have a cron job that automatically copies the internal Grafana dashboards to the public one, so you still see exactly what we see.
We used to use Federation, but now we just have the public server scrape the same targets as our private one.
I was using GitLab for about a year until a few months ago. The reasons were its sluggishness and how power-hungry it was. My Gitea server consumes considerably fewer resources and is faster.
Many other reasons to be concerned about performance, but there's no evidence that they're withholding essential features like this from their free version.
The Sidekiq memory killer is enabled for both CE and EE by default with the Omnibus package. If you're seeing something different please let us know and we'll see what's going on.
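For anyone who wants to inspect or tune the killer's threshold, GitLab has documented it as an environment variable on source installs (Omnibus wires the equivalent through gitlab.rb). A sketch, assuming the documented SIDEKIQ_MEMORY_KILLER_MAX_RSS variable and a made-up value:

```shell
# Hypothetical tuning sketch for a source install: the Sidekiq memory killer
# restarts workers whose RSS exceeds this many kilobytes; 0 disables it.
export SIDEKIQ_MEMORY_KILLER_MAX_RSS=2000000
```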
The fact that it's acceptable to restart Sidekiq instead of fixing the memory leaks in the first place is a perfect example of everything that's wrong with software engineering in Ruby land.
Reminds me of the classic "The main Rails application that DHH created required restarting ~400 times/day. That’s a production application that can’t stay up for more than 4 minutes on average".
Ironically, the OP currently just waits at spinners forever for me, perhaps because so many people are trying to look at the graphs from the HN post (although one would think that page would rely on cached info and easily scale...)
A lot of this traffic is probably due to the opportunistic advertisement on GitLab's home page for non signed-in users: https://i.imgur.com/ZNikKcJ.png
I think that the topic 'privacy' isn't the main factor in this case. Most stuff on platforms like GitHub and GitLab is public anyway and shouldn't include anything private.
In my case I simply want to move since I really dislike Microsoft and their business decisions. I simply don't want to be involved in any "direct" way with this company. Microsoft doesn't gain much control over GitLab just because GitLab uses their servers. But Microsoft will have a lot of control over GitHub.
One move from Microsoft that really disappointed me recently was prioritizing advertising over user privacy, for example in Windows 10. I read big chunks of the terms and conditions myself and was horrified.
How were you burned by google wave? It was never a finished product and for the longest time (probably forever before being canceled) was a beta product.
imgur has figured out how and when it's safe to redirect PNG/JPG requests to a "JS blob" (of advertising), unfortunately. They tried to pull this a few months ago, completely bungled <img> embeds, and had to turn it off in a hurry. I think they've figured it out this time, sadly!
Time for a new image host... imgur has gone all high-level and "scale"-ey, it would seem (particularly with the new video with sound thing).
I was going to say something about toxicity, but this is sadly just a scaling problem. Now that sound - and competing with youtube - is the new "major consideration", just being a competent works-anywhere image host has been relegated to the region of rounding errors, so it doesn't matter in the same way if they get that right anymore.
Also, they're detecting that I'm on a mobile browser and forcing a redirect to a smaller version of the file with _d appended to the filename. If this were a large file, I would not be able to see the full size without something like changing my user agent.
GitLab is awesome. As a broke amateur dev with no real way to give back to GitLab, the team was still cool enough to send me a t-shirt and stickers. It's come a long way since then, and they seem to be awesome. I don't know if the MSFT acquisition is even generally a bad thing, but it's great there are options, either at GitLab.com or as a self-run service, for people to change over to if they want. Congrats!
I highly recommend people consider self-hosting with Phabricator. It's PHP-based, and if you have an ESXi or Proxmox or DigitalOcean / Linode instance, it's a very simple system to install & update. It uses the same LAMP stack everyone is used to, supports ssh/http-based push/pull workflows and Git, SVN, and Mercurial repos, and offers the same type of authentication as GitHub/GitLab/Bitbucket. Biggest part of all, it's written to a very high standard through & through, and it has the ability to scale to an insane number of workers & cluster configurations. I've never once had it hiccup on me. Yes, if you import a project with a hundred thousand+ commits, it will take a while, because it builds records for the entire commit history of the repo rather than generating them the moment you view; this takes time to back-populate into the Phabricator system. But once you have an imported repo, it has an event system unlike any you may have ever used. It takes some time to get used to, but organizations like CircleCI have a tutorial on how to integrate a simple CI workflow using Harbormaster: https://circleci.com/docs/1.0/phabricator/ Also, as of PHP 7.x, Phabricator runs faster than ever before. It also has its own Trello-style board system, and it has a much more powerful issue system than GitLab/GitHub: tickets can have parents/children, and repos never specifically belong to any one project; rather, projects can have many repos. It also has 2FA integration, a questions & answers section, a wiki, and pre-commit audits. A lot of really big open source communities use it. A fairly large list of big companies using it is shown at: https://en.wikipedia.org/wiki/Phabricator
PHP applications are extremely easy to install and keep running, and Phabricator is probably the cream of the crop in terms of code quality. The devs have done an incredible job with it.
So the answer is "it can be" considered a positive thing.
You can write clean code in any language, and I've written some nice PHP, but the language itself has many fundamental design problems. For example, printf shouldn't have side effects on a datetime object.
I run some OSS stuff on PHP, but in containers to keep it all isolated. Although things have gotten better with PHP7, knowing what I know about the language makes me hesitant to use it on any new project for anything except the most trivial systems.
That should definitely be fixed, but it's not something people will ever experience unless they try to write broken code by using undocumented features/side effects.
What exactly is it you know about the language that makes you hesitant to use it?
Its type system and automatic casting/comparing is a nightmare.
Although there are namespaces now, the idea of everything being in the global scope was insane (and still even with namespaces, much is still in the global scope).
mysql_real_escape_string
There are no type-safe comparisons for greater or less than (you have ===, but no <== or >==).
> it's a very simple system to install & update, uses all the same LAMP stack everyone is used to
I think that has not been true for a while now. Maybe it's just the world I live in, but putting a bunch of files into a directory on a vanilla LAMP configuration as a deployment scenario is not something I encounter a lot nowadays.
edit: specifically referring to the "everyone is used to" part.
I'd like to point out that GitLab is not independent. They are owned by investors, including Google Ventures. So anyone who thinks they're giving GitHub/MS the finger by migrating their repos, please be ready to move everything again when GitLab disappoints you.
And it's worth pointing out that you can run GitLab on your own server, so even if GitLab.com ever ends up disappointing you, you don't need to go through the trouble of migrating to the next best thing.
Windows is still doing a lot of bullshit with ads in the start menu and forced updates. A bunch of their open source stuff has included spyware, and they've been less than upfront about it.
I think what happened to Skype is a case in point. It doesn't matter who's running Microsoft, after that disaster I don't think anyone's going to be trusting them.
Skype has basically become unusable for professional communication. It’s getting worse with every update. I kind of expect the same for Github then. That’s my concern.
I'm just wondering what the atmosphere will be like on Gitlab compared to Github.
Github in general to me seems very bubbly and emoji-filled, or to put it into a word - _fun_. For better or worse, it won't be the same. This is not to say that Gitlab would be bad, just that it would be different.
I'm just incredibly curious to see what it ends up looking like.
You know that feeling when someone makes a service you really enjoy and you know you can trust? GitLab is definitely one of those services.
Self hosted? Check!
Open source? Check!
Community Edition that includes the features small-medium teams need? Check!
I'm a huge fan of GitLab, and with the current business model and how community editions work I'll definitely continue promoting GitLab every chance I get.
That's the trend for today. In two weeks the sentiment will be "Oh screw it -- this is more involved than we thought... we'll stay on GitHub for now and wait and see."
In short: it's the developers' version of #DeleteFacebook
GitHub has something like 75 million repositories. GitLab's spike of ~40k repos imported so far would need to continue for 5 years to match GitHub's current scale.
I just assumed devs were testing it out; most devs love testing alternatives and even occasionally retesting them to see if the alternative has caught up to the de facto answer.
Gitlab is a 100% remote company, so everything they do is communicated online, and a huge chunk of the time they publish their communications and decision making. You can read all the things they considered while working out whether they should remove the CLA, for example, and see live feeds of what is going on when they are trying to fix an issue.
I wonder if there are statistics on how much of that number is forks (which people use like clones) and how much is totally abandoned, README-only, or empty repositories. Also, which repos those 3k are is veeery important: just a handful of high-profile repos migrating can cause a much bigger wave.
Admit it. If Google had bought them instead, there'd be far less freakout. Google is pretty scary, but you have to admit that we trust MS with our free software even less.
Not me. Microsoft has fair products for sale, whereas Google just runs off with your data. I will avoid either if possible, but trying to imagine both scenarios, I think I'd consider my options just as much with either news. (I'll probably move to GitLab, self-hosted if possible, though I hear it's terribly slow.)
Using all of gitlab.com, our self-hosted instance, and github.com with some kind of regularity, I find both GH.com and GL.com equally fast (or "slow" depending on your PoV) since a year or so, and our local instance faster. Neither is rocket fast, but neither is "terribly slow" either, and at any rate they're largely in the same ballpark.
I don't think it's slow, but it is hungry, don't expect to run it on a Raspberry Pi or equivalent VM. Personally I think it's a fair tradeoff considering we use it fully (including CI, etc).
I can run a couple blogs, ftpd, mail server (smtp+starttls, imaps, pop3s), dns server, mysql, torrent server, a factorio server, etc. on rpi-equivalent hardware (some intel atom based mini desktop that uses 20W in power). What is the big deal with a git front-end? I'm a software developer and I've written software in an interpreted language (PHP and Python) that runs on that computer for that computer, and it's plenty fast for this. It might not be able to handle things like realtime video encoding and streaming, or other things involving crypto (I'd like to run a Tor relay but I can't), but a git front-end... that's not heavy.
Gitlab is not just a git frontend, it's a full development "suite". It includes Issues management, CI/CD server, deployment tools with integrated monitoring (using Prometheus), a chat server (Mattermost), a web-based editor, a container registry, and probably a bit more I'm not remembering.
Issues management is a script as heavy as a BBS system from the 80s. CI/CD totally depends on whatever you're building, so they can hardly include that in system requirements. Monitoring I don't use, so that shouldn't impact performance for me. Chat server is Mattermost, that's not a mandatory dependency I should hope? Anyway, wouldn't use it, so not applicable either. Container registry... a registry doesn't sound like much? But again, not applicable. I don't know why 4GB RAM is the absolute minimum for Gitlab.
All I want is a Github alternative: something which allows people to clone (be it http(s) or git smart something), do pull requests, have an issue tracker, view files online and show pretty markdown documents. I'm half inclined to go the usual route and write it myself, but I should first google around. Someone mentioned Gitea to me: that project claims to be lightweight so it might already fit the bill.
I distrust Google far more than I distrust Microsoft, I generally don't know why people see it otherwise, maybe it's simply because when they were surfing Slashdot in 2003 everyone told them to hate Microsoft and in their heart Google is still the "good guy." Imo, the roles have reversed.
It's not exactly about distrust; it's just a clash of interests. Microsoft is literally the personification of closed-source software; them acquiring GitHub is surely an evil plan to take control over the open source area.
This is very true. Paid consumers as well... if I have a problem with a Microsoft product I can call an 800 number and talk to a human and generally get a resolution in a matter of hours at the max. Can't really do that with Google.
You certainly can get responsive support from Google - if you pay for it. We have a silver support contract with Google for App Engine and get good support when we need it.
I think the "if you pay for it" is the differentiator. And because Google supplies far more free services than Microsoft, they get a lot more complaints about non-existent support.
Google is a one-hit wonder, with search and ads as their 90%+ revenue engine. Microsoft has been doing enterprise and devtools for decades. Google also has a habit of abandoning things without any warning. I can't trust them with my life's worth of code.
The "Google abandons everything" meme is getting very tired. Microsoft (and every other tech company) abandon things all the time. For example, Microsoft pulled the plug on an entire eco-system (Windows Phone) after promising developers it would be there for ever (and before that Silverlight). But for some reason Google shuttering a free online service (Reader) causes all the outrage.
Also your comment that Google abandons things without warning is so wide of the mark as to be plain wrong. We've used Appengine for over ten years, and while they do deprecate features, they give you three years warning (and usually a simple migration path to a superior service). They gave over a year's notice that they were shutting down Google Reader.
I would agree with your last point about trusting your code to Google... or anyone else for that matter. As the mass migration from GitHub to GitLab shows, it is easy to keep your code in multiple repositories. To rely on any single provider (even if that provider is yourself) is irresponsible.
You're probably right, but it's because Microsoft understands enterprises; many others, like me, want nothing to do with them. You are not the customer they want, but they really try to make you think you are.
They seem to very deliberately have marketed to middle and upper management for decades. These are levels the majority of companies barely have! Most companies are small, the kind of companies GitHub is 'built' for.
I distrust Google more with privacy. I distrust Microsoft more with 1) not running newly acquired services into the ground, 2) not integrating it into their ecosystem until it's unusable from the outside, and 3) yes, with free software. This is still the company whose founder, Gates, and its former CEO, Ballmer, wanted to rid the world of free software. Yes, there's now a new CEO, but you can't tell me that this culture sat at the top of the company for decades and that swapping out one person suddenly switches everything around.
Far be it from me to question the distrust, but I'd say in recent years, since around the time they started open sourcing ASP.NET a while back, they've been pretty decent.
I mean, some people dislike Windows for various reasons, but this falls more into their development umbrella so I trust it is in good hands as ASP.NET, Xamarin, etc. have been in recent years.
The claim misunderstands how software is different from services, so it's really a nonsensical statement. Also both Google and Microsoft distribute nonfree (proprietary, user-subjugating) software and spy on their users activities.
The freedom of free software one runs on their own computer means it doesn't matter who holds the copyright to the software. If the copyright holder does something you don't like you have a copy you are free to run, inspect, modify, and share.
Remotely hosted services are a different consideration than free software (see https://www.gnu.org/philosophy/network-services-arent-free-o... for more). With remotely hosted services the user doesn't typically have a copy of the service software. So free software concerns are for the service provider. We hope the service provider runs only free software for their own sake, they deserve to control their own computers just as we deserve to control our computers.
In the particular case of free software and discussions you will publish publicly, you should consider https://www.gnu.org/software/repo-criteria-evaluation.html -- the GNU Project's ethical repo criteria. This was written well before Microsoft was known to have any interest in buying GitHub, possibly during the time when Microsoft was chastising "open source" software for existing (and still despising strongly-copylefted free software) and before their current "Microsoft [heart] open source" publicity campaign. Thus there's no chance GitHub's failing evaluation is based on any ad hominem such as whether Google is somehow scarier than Microsoft.
IMHO it's wise to distribute repos to GitLab and also to other hosting solutions beyond just GitHub.
One reason is because acquisitions can sometimes be tumultuous, such as because of employee changes, security changes, alerting changes, etc. A backup git host is prudent.
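Git itself makes the backup host cheap: a single remote can carry multiple push URLs, so every `git push` lands on every host at once. A minimal sketch of the idea, using local bare repositories as stand-ins for the two hosting providers (paths and names are placeholders):

```shell
set -e
d=$(mktemp -d); cd "$d"
# Two bare repos stand in for, say, GitHub and GitLab.
git init -q --bare hub.git
git init -q --bare lab.git
git init -q work && cd work
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty -m "init"
git remote add origin "$d/hub.git"
# Adding a second push URL makes `git push origin` publish to both hosts.
git remote set-url --add --push origin "$d/hub.git"
git remote set-url --add --push origin "$d/lab.git"
git push -q origin HEAD
```

With this in place, fetches still come from the first URL while pushes fan out to both, so either host can serve as the fallback.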
I value having a way for developers to push/pull changes to a well-known agreed-upon place, as well as to read/post/comment on pull requests, tag/branch/integrate for releases, and always-on always-updating availability.
Orgs that are my clients do not have default employee machines that are automatically ready to push/pull to each other.
This can be because of a range of technical areas e.g. employee machine firewalls, continuous integration server configurations, disk space storage size for large files, etc.
This can be because of a range of process areas e.g. a dev team wants to do a git flow that uses pull requests and code review comments, or a legal team wants an audit trail, or a management group wants a dashboard to do updates, etc.
It’s cool that there are alternatives, but github is still the only one providing a search engine over billions of lines of modern code. It’s an irreplaceable software development tool and I hope Microsoft treats it well.
In its current state, the GitHub importer can import:
the repository description (GitLab 7.7+)
the Git repository data (GitLab 7.7+)
the issues (GitLab 7.7+)
the pull requests (GitLab 8.4+)
the wiki pages (GitLab 8.4+)
the milestones (GitLab 8.7+)
the labels (GitLab 8.7+)
the release note descriptions (GitLab 8.12+)
the pull request review comments (GitLab 10.2+)
the regular issue and pull request comments
References to pull requests and issues are preserved (GitLab 8.7+)
Repository public access is retained. If a repository is private in GitHub it will be created as private in GitLab as well.
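The importer handles the metadata; the Git data itself can also be moved by hand with plain Git commands. A sketch of that move, demonstrated here against local bare repositories standing in for the old and new hosts (all names are placeholders):

```shell
set -e
d=$(mktemp -d); cd "$d"
git init -q --bare old.git   # stands in for the GitHub remote
git init -q --bare new.git   # stands in for the GitLab remote
git clone -q old.git work && cd work
git -c user.name=dev -c user.email=dev@example.com commit -q --allow-empty -m "init"
git tag v1.0
git push -q origin HEAD --tags
# The move itself: keep the old remote under another name,
# point origin at the new host, then push every branch and tag.
git remote rename origin github
git remote add origin "$d/new.git"
git push -q origin --all
git push -q origin --tags
```

Keeping the old remote around under a different name makes it easy to keep pushing to both during a transition period.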
Meanwhile GitLab is crashing... but Bitbucket is up. If you haven't used it in a while, probably worth another look. It's gotten a lot nicer in the last few years, and the code review and CI features are especially good.
IDK about bitbucket, maybe it's just me, but the page loads are like watching paint dry. So slow.
Don't get me started on their redesign; making everything blue doesn't magically make it usable. Code review using -u style diffs?! With huge embedded comment slabs! What were you thinking? After three comments, even a small function doesn't fit on one page; absolutely painful and almost impossible to read the code.
It's not like it's very hard to do a visual diff, it has been done zillions of times. Heck I implemented a partial 'visual' diff in SQL once, should be easier than that when using a proper programming language.
Sorry about that. The link from this discussion is to our public monitoring site (separate from GitLab.com). GitLab.com itself is online and the monitoring dashboard is working again, too.
I seriously wonder if Fossil's model of keeping all issues in the clone isn't the way to go - set up a public "FossilHub" or some such thing if it doesn't already exist and then you get a local copy of everything, no worries about providers at all.
Fossil's model is pretty nice in my book, though since it's a much smaller community compared to other SCM options it has received far less polish overall. So, it has some awesome core ideas, but the implementation needs work to be competitive for most users.
No doubt the recent Microsoft takeover news has fueled the spike.
At MS's heart is simply money. Nothing else.
When push comes to shove and a future decision is between doing the right thing for the open source community and making an extra buck, you can count on MS to go with the latter every time.
I’ve been using the keybase.io git repos. It’s cool that it adds encryption. Unfortunately there’s no pull request system so it falls short on teams that depend on that.
Wow! I did not realise people hate/distrust Microsoft so much! Ballmer's Microsoft, sure, but I thought that under the current administration they were seen as a lot friendlier company.
For those who haven't tried MS Windows recently, it now comes with tons of ads you have to opt out of. The settings to opt out are scattered around and unclear ("pipe up thoughts", "show suggestions", "show me what's new"):
They should be opt-in, not opt-out. And opting in should come with some benefits. For example, you can get a Kindle cheaper if you can live with ads. I can, kind of, understand ads if the OS were free. But someone who bought the OS for full price shouldn't have to deal with them.
Well, for me this new Microsoft is even worse than Ballmer's Microsoft. Because their strategy and goals are the same, but they changed their PR to make people think otherwise.
The largest part of their strategy and goals that is the same is "make money". But to say there is no improvement or that it's even worse is just silly. There is little to no hostility anymore to open-source, and they're contributing quite a bit...
I did -- but IMO it is less outrageous (but outrageous nonetheless) than what the top three (Google, Amazon, Facebook) are doing. And there is always a certain amount of risk when you're putting your data on someone else's servers.
Just to be clear, I use Windows only when I have to -- never by choice.
That said, my point is that even if Microsoft's acquisition of GitHub turns out to be a bad thing, I'd wait to see how things go before migrating en masse.
Can you elaborate? Among the giants (Apple, Google, Amazon, Facebook, Microsoft), I’d like to think Microsoft is the 4th on the least trustworthy scale - right ahead of Apple, and behind the other three.
Recent changes in UI and functionality made LinkedIn look like "we're a worse Facebook with more forced ads". Even reddit is a better place to find a new job and learn career-related things. I logged in there last week only to be bombed with ads about "New Shiny Things In Java 8" or "Learn What's New In Java 9". Well, Java 8 will be out of support in September (it's LTS) and Java 9 is already deprecated and unsupported. I just logged in while writing this comment and there are 3 autoplaying videos with sound; one of them is "Trending for: Software Developer - Basics of C++". Seriously? 3 videos with sound at the same time while I'm only scrolling?
The Gitea community is what the Gogs community should have been in the first place. Gitea looks nice, but there are no public instances. I only wish https://notabug.org would move to Gitea instead of sticking with Gogs...
Seems like a knee-jerk reaction. In reality it will probably be more than a year before anyone even notices any change to GitHub associated with the acquisition.
GitLab is so overloaded right now that the nginx server is returning 503 Service Unavailable errors. Do they have the capacity to handle all the projects being moved over from Github?
We've had to do this before. Remember when Sourceforge tried putting adware into open source downloads? Many projects left Sourceforge then. That was tougher. Github to Gitlab is easy.
Microsoft has put adware and spyware in their operating system. We have to expect that they may try something like that with Github. They may say they won't, but that's non-binding and meaningless.
Sorry about that. Was the 503 on GitLab.com (main application) or on the link in this discussion (monitor.gitlab.net)? The latter is to a public monitoring dashboard. It has been offline a couple of times today due to influx of traffic. We're closely monitoring it and attempting to keep it online. At the moment, GitLab.com itself has been absorbing the additional load, though. Let us know if you're seeing other issues.
I really like how the devs from gitlab are active on the HN thread and replying to people facing problems! Hardly see that with mature products anymore! Great going guys!
Turnkey is a great way to get GitLab up and running. However, please note that the latest version of Turnkey is based on GitLab version 8.3 (https://www.turnkeylinux.org/updates/gitlab) which is nearly 2.5 years old. Check out http://about.gitlab.com/downloads for information on various installation methods which make it easy to install and keep GitLab updated.
I'd like to move over to Gitlab, but someone else has been sitting on my nick, "labster", for over two years with an empty group. I'm labster on HN, Github, IRC, forums, etc. but have no way of even contacting the other person to ask for the name. So for now, it looks like I'm stuck on Github.
... until the day arrives when a huge company buys out GitLab.
Hosting your precious code is always an issue, and doing it yourself with something like https://gitea.io/ is cheaper than Github etc. on a small VPS or even a RPI at home.
I really like GitLab and am rooting for them, but if my experience over the last couple of weeks of exporting/importing projects is any indication, a ton of these imports will fail, or even worse, appear to succeed but partly fail.
> Pushing this back to 11.1 since 11.0 is full with major release requirements.
If there's one way to lose new users, it's to have their imports fail. My case was Gitlab -> Gitlab; hopefully their Github -> Gitlab path is more resilient.
I have always wanted gitlab to become a viable alternative to github, but every time I have used their product I have come away disappointed. Currently trying to import a repo from github to try out their hosting ... 3hrs in and the import is still in progress.
Sorry for the delay. Due to the spike in imports things are a bit backed up. Your import should be in the queue and will run eventually. As someone else mentioned it can take a lot of time to import, in general, depending on the number of issues, PRs, etc. Contact @movingtogitlab if you continue to have trouble.
Yes, one of the many bugs there is the progress indicator. Sometimes it really does take a crazy amount of time. Most times, it has failed behind the scenes and you won't know unless you look at the logs. If you're running your own instance, check Admin->Monitoring->Logs->Sidekiq logs
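From a terminal, the same check can be scripted with grep; on Omnibus installs the Sidekiq log typically lives at /var/log/gitlab/sidekiq/current. A sketch against a fabricated stand-in file (the two log lines below are made up for illustration; GitLab's repository imports run through workers such as RepositoryImportWorker):

```shell
set -e
log=$(mktemp)   # stand-in for /var/log/gitlab/sidekiq/current
cat > "$log" <<'EOF'
2018-06-04_12:00:01 RepositoryImportWorker JID-abc123: start
2018-06-04_12:03:42 RepositoryImportWorker JID-abc123: fail: import failed
EOF
# Failed imports surface as failing import-worker lines:
grep -ci 'importworker.*fail' "$log"
```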
Just watch as the crappy small-scale hosted stuff moves to Gitlab while MSFT kills 50% of their on-prem business by migrating it over to the new GitHub... MSFT has the channel, the customer base, the Azure Stack story (kinda); it's a no-brainer for them.
Federated GitLab would be a cool leap for me to trust it, and a strong counter to what I'm about to say...
But man, I wish there was just an accepted, good UX federated system for hosting issues/etc. The solutions exploiting git metadata seem to be the most interesting to me, but then there's an issue of easily getting a single source of truth for a project, etc.
Git blockchain anyone? .. I cringe at the thought, but it sounds like I'm basically describing that lol.
If you're moving from GitHub to GitLab, create an issue if one doesn't already exist [0] and send me a message (here, @JobV on GitLab.com, Twitter). I will make sure it ends up in the product.
So what's the equivalent of github's unlimited users and private repos for a flat price? I don't care about any of the advanced features, I just want remote private repos (browseable from the web for convenience) and I'm willing to pay a bit so I don't have to maintain them. We don't even use the issue tracker, that one we self host.
Wow, GitLab.com offers users insight into and customization of the cookies they set! Check out `div#CybotCookiebotDialog` on https://about.gitlab.com/. I've never seen a website disclose and enable that before!
This is great and all, but I suspect in a year most modules, gems, etc will still be hosted on Github. So shake your fist and proclaim your distrust for Microsoft, but most developers will still have Github as a core dependency of their livelihood.
Gitlab is not ready for prime time; we tried setting it up 2 weeks ago and were welcomed by several 500 errors. Our team was frustrated by the time we had on-boarded 4 members, and we ditched it.
I am very happy with the constant innovation that GitLab is offering. They have data centers across the globe and were the first to run a DC in Asia. FYI, GitHub still does not have a DC in Asia and up/down speeds are terrible. GH refuses to add a DC in Asia for unknown reasons despite my many requests. One would think that we are in a cold war or something, where most companies support only the US and many years later try to expand. Remember this next time when you travel abroad and see how polite and welcoming other nations are. Software service must be the same way. Thank you for being politically correct, GitLab, and shame on you, GitHub, for being racist.
Most of that is a valid complaint. The last sentence is nonsense. Github is not doing this for racial reasons, and I can't even picture how political correctness could possibly be involved for gitlab. (And when political correctness is a motivation to do something, that's a bad thing, because it means it's not a genuine action.)
GitLab is hosted on K8s; Azure probably worked out to be the cheapest solution. If Azure or any competitor changes their prices, you can move your k8s config to another provider within a few hours. At one company I worked at, this process was set up as a Jenkins job: just click on a new job and it would migrate between Azure, GCP, AWS, and IBM.
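For what it's worth, the portability comes from the manifests themselves being provider-agnostic. A minimal sketch (all names here are illustrative, not GitLab's actual config) — nothing in it references Azure, GCP, or AWS, so `kubectl apply -f` deploys it to whichever cluster your current kubectl context points at:

```yaml
# Illustrative, provider-agnostic Deployment. Because nothing below
# names a cloud vendor, the same file can be applied to any cluster
# by switching kubectl contexts.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: gitlab-web          # hypothetical name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: gitlab-web
  template:
    metadata:
      labels:
        app: gitlab-web
    spec:
      containers:
        - name: web
          image: gitlab/gitlab-ce:latest
          ports:
            - containerPort: 80
```

The provider-specific parts (load balancer annotations, storage classes, node pools) are what actually take the "few hours" — the workload definitions themselves move unchanged.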
I certainly hope not. GitHub's purchase has not even been officially announced by Microsoft. I wouldn't trust a project that hurriedly uprooted their repo to move to the flavor of the day. Projects need stability and also need level-headed admins that contribute to that stability.
GitHub hosts open source projects for free and has done so for a long time. What other reason would an OSS project have to move to another hosting platform, other than "ugh, Microsoft bad"?
Sorry about that. We had to scale our monitoring dashboard a bit. It's back up at the moment. GitLab.com is separate infrastructure, though, and we hope it remains online despite the added traffic :)
Complete and utter FUD; this is not the way GitLab operates. More importantly, they are at their core an open source and transparent organisation that values both equally.
I don't know. So what? No different than a fully open source project that might be abandoned by its core developers. Either we fork or we move to another solution.
EDIT: It's not exactly an uncommon model; e.g. Nginx follows the same one.
Has there been any word yet from the big tech companies like Google, Facebook, and the others on where they will move to?
I just hope they don't all go and host it themselves now. It was so nice basically having one site.
The entire software development culture has changed, and MS just had to try to be part of the fun while being about as tone-deaf as you can be. Or they just do not care.
I love GitLab, and I use a private instance at work, but I won't use it to store my public projects for one sole reason: you can't hide your email address shown in commits.
This seems a little off-kilter. If you're that concerned, set a dummy email address in your `~/.gitconfig`, or in your project-specific `.git/config`. I mean, I could just clone the repository if I wanted, and if I wanted to scrape it I could certainly do that too.
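Concretely, a repo-local `user.email` overrides whatever is in `~/.gitconfig`, so commits in that one project record the dummy address. A quick sketch using a throwaway repo (the address is obviously a placeholder):

```shell
# Demonstrate that a repo-local user.email overrides the global one:
# commits in this repo record the dummy address, not your real email.
cd "$(mktemp -d)"
git init -q .
git config user.email "dummy@example.com"  # repo-local override
git config user.name  "Anon"
git commit --allow-empty -q -m "initial"
git log -1 --format='%ae'                  # prints dummy@example.com
```

Drop the `--global` flag (as above) and the setting stays scoped to that one repository, so your other projects keep using your real address.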
You can have multiple addresses associated with your GitLab account as well, so you can add the dummy one to make sure your profile metadata still shows up.
You can't hide it on github either. You can just look at the commits directly, and I've used this to contact people in the past due to the "UI" hiding it. It's nothing more than the thinnest bit of obfuscation, and easily overcome.
It's 100% related, but I would also not read too much into the numbers. There will always be a small minority that hates Microsoft like no tomorrow and this is very much reflected on reddit (https://reddit.com/r/programming) right now.
If you trust Microsoft at all, you don't remember the '90s at all. OTOH, I'm happy that MS is doing well recently, because competition is good for all of us — but there is nothing more terrifying to me than MS becoming even close to as powerful as it used to be.
I’m wondering if it’s a generational thing to some extent.
I’m very wary of Microsoft because their bad behaviour was front and center while I was learning about computers. I remember the lawsuits and the backhanders and the FUD and the Embrace, Extend, Extinguish philosophy.
But modern Microsoft is better behaved to some extent. There are still lots of concerns, but I could easily imagine that someone a little younger than me might have missed the controversy and think of Microsoft in much the same way others might think of Apple or Google.
> nothing more terrifying to me than MS becoming even close to as powerful as it used to be
I've written about this at length in the past, but Git ensures Microsoft cannot be as "evil" as you may think they can become. In the past, Perforce, Microsoft, IBM, etc. were able to control the software development lifecycle, since they "controlled the spice" (watch Dune if you are confused). But with Git, everything has changed, because Microsoft can't dictate Git's direction. In fact, Git forced Microsoft to change its ways with TFS.
Microsoft acquiring GitHub can certainly be scary, but I think more so for GitLab, Bitbucket, and other Git hosting solutions. If Microsoft decides to treat GitHub as a loss leader, this can have a devastating effect on GitLab and Bitbucket. If Microsoft announces that private repos on GitHub are now free, I can see developers migrating from GitLab and Bitbucket to GitHub.
I posted this in another thread, but I can see Microsoft making GitHub free, so they can:
- capture developer data for ML/AI research
- identify new product ideas
- identify ways to improve existing products
- funnel developers to existing MS products
I've been in the developer tools space for quite some time, and what I've noticed is that developers are not loyal. More often than not, they will pick a solution based on productivity and cost instead of ideology. And if Microsoft can remove the cost variable, that could be very scary for GitLab and Bitbucket. It could also be very good for developers, since GitLab and Bitbucket will be forced to focus on innovation, which benefits everyone.
Those of us who remember the '90s know the Embrace when we see it. GitHub has already begun to Extend, with issues, web pages, wikis, and all the social networking stuff. This acquisition brings MS directly to the doorstep of Stage Three. I'd give them a year or two to set the hook before they make their move.
Nowadays they do seem more content with absolutely destroying the usability of every product they own. Except Visual Studio Code — but then, they have always tried to appeal to developers through their tools. They just forgot about it for a bit.
I am reminded of the Ship of Theseus[0]. What percentage of people employed at Microsoft in the 1990s now remain some 20-30 years later? At what point is the company considered changed?
I would say companies change more than people change. Companies are made of people, but generally not the same people decades later, which allows for quite a bit of change.
Now if I could only get myself to believe this about Uber... Then again, it hasn't really been all that long.
I trust Microsoft more than I trust Google these days. I don't remember when that transition happened in my mind, but Microsoft has been actively working at being better, while it seems Google has been actively working at being worse.
As far as I can tell, they have actively been working at looking better to developers; at the same time, they have had to be essentially forced by public opinion not to silently, by default, send your entire life to their servers because you used their OS.
I know Google does many similar things, and it's just as bad, and they're probably getting worse. But Microsoft didn't use to do that; they did a lot of suspect things, but not that. So they are also getting worse.
They have always tried to woo developers — they haven't always succeeded, but it's in their DNA. So if that has stayed there all these years, what else has?
I’m really sick of hearing about “oh, the '90s Microsoft.” You know what’s far scarier right now? Google, Facebook, the American government. But you’re more worried about what some people did ~20 years ago.
Maybe not, but they won't get my money anyway — not yet. Companies rarely change; how they want to be perceived, and how they present themselves, does change.
A point that nobody has mentioned yet is the evolution of Electron after this purchase. I would not be surprised if they were to put their own trackers in the core lib so that they can access the data they don't have from all the other platforms and products (seeing that many popular apps like Atom, Slack, etc. are built with it and are not Microsoft products). Maybe this is their next effort toward the personal-info/Big Brother business model, seeing as almost all other efforts failed (no matter how good Win10 is, it will always carry the stench of Win8/Candy Crush and will always be too slow and insecure for business purposes — no useful data).
Either way, hello Qt5 and Gitlab! By the end of the week I hope that Gitlab will be our new home.
>If you want nice things, you have to pay for them.
Why the hell do I have to pay for something that has already been offered for free? That's called bait and switch. And maybe this is a stretch, but I think that should not even be legal; in my mind it is a form of false advertising. If you don't think it will sustain itself as a free service in the future, then don't offer it as a free service in the first place.