Ask HN: How did Git become the standard when Windows is the majority OS?
22 points by trealira on Oct 26, 2023 | 44 comments
I'm young enough that Git became the standard before I was even into programming, so I didn't witness this happen in real time.

On Windows, Git requires MSYS2, Bash, and some other tools from Linux to run, although the installer is nice enough that it's not too much of a hassle. However, from what I've read secondhand, it wasn't always nice.

According to Statista, 61% of software developers used Windows in 2022, and the figure was similar in previous years; however, it seems some portion were dual-booting or otherwise using more than one OS (Windows + macOS + Linux/other Unix adds up to more than 100%) [1].

Given that, why didn't Windows users establish something like Mercurial as the standard, given their majority/plurality status? Or did something like that happen?

[1]: https://www.statista.com/statistics/869211/worldwide-software-development-operating-system/




Windows used to have a version control system called Visual SourceSafe.

You'd have to select the files you wanted to modify and check them out. Then they were locked. No one else on the team could touch them.

God forbid you forgot to check them in at the end of the day and then were out of the office for a week: those files stayed locked, and they didn't even have your changes.

And that's before we even get to branching.

Better than that was SVN, which was no less horrible.

So the net result was that, around the 2000s, there was pretty much no concept or culture of version control among Windows developers. It was difficult and cumbersome to do.

Then came git. Offline, distributed, instant branching without creating a whole copy of the project, and so on. It came from Linux, but its ease of use and feature set made it the de facto standard in the Linux world.
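
To make concrete what "instant branching" means: this is roughly the whole workflow, entirely offline (the branch and file names below are just made-up examples; git switch needs Git 2.23+, older versions use git checkout -b):

    git switch -c fix-login-bug     # create and jump to a new branch instantly, no server involved
    # ...edit login.c...
    git add login.c
    git commit -m "Fix login bug"   # commit locally, still no network
    git switch master               # jump back; the work stays on its branch
    git merge fix-login-bug         # fold it in whenever you're ready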

But then came GitHub, and that's when, I guess, most Windows developers got to know about it.

True story: I joined a team in the late 2000s and their workflow was to literally FTP the code to production daily. They were all on Windows and didn't know any better.


VSS… Oh the memories. Of dealing with corruption and other issues.


> Windows used to have a version control system called Visual SourceSafe.

> You'd have to select the files you wanted to modify and check them out. Then they were locked. No one else on the team could touch them.

> God forbid you forgot to check them in at the end of the day and then were out of the office for a week: those files stayed locked, and they didn't even have your changes.

Holy shit, this is what lawyers still do--I always thought the developers who built the shitty enterprise software used by law firms knew this was garbage "version control", and wondered what they thought of it--but now you're telling me that the developers of that old dotcom-era software themselves used a similar flow? For software engineering? Insane.


I had an employer who used MS-DOS + Novell Netware. Source was distributed with the application. If you needed changes, you had to pull the source file off the machine and send it to the developers. So the "same" application on different machines could have all kinds of minor differences.

I hacked autoexec.bat to load the network and set some variables that identified each machine, then run another batch file stored on the network drive that did automatic weekly backups, copied new files back to individual machines on restart (without me having to run around with a floppy disk), etc.

Once I got it set up, I did all further administration from my office, except the times I had to use the backups, in which case I would just xcopy all the files back onto the machine and run sys c: from my floppy disk.


Nice lol


We used to use a VCS called Borland (yeah, Delphi Borland) StarTeam, which was basically the exact same flow. Luckily for us they audited us at some point, decided we were using too many licences, and insisted we triple our monthly expense or stop using StarTeam - that was the incentive to persuade business to let us slow down dev for a bit while we changed to Git.


SVN (and CVS) was way better than VSS.

SVN was open source so there were multiple clients. And it was web based. And branching made sense. And you didn’t have to lock out files.

And the VSS client was just really horrible and only worked on Windows. SVN worked everywhere.


And BitKeeper (which many of the concepts in Git and Mercurial came from) was better than SVN and CVS.


First came Visual SourceSafe, then came Team Foundation Server, which was widely used by tons of Microsoft solution providers across the early .NET era.

Git basically took off because of both GitHub and several major open-source stacks gaining serious traction, even in the Microsoft solution space; in particular Node and several frontend-focused stacks such as Bootstrap and Angular.

If that hadn't happened, I do not think git would ever have made it into Team Foundation Server, which is what caused it to become the most popular VCS in the Microsoft ecosystem.


> Then came git.

Not quite. Then came Mercurial, which preceded git. Mercurial has the distributed VCS features of git but with a sane UI. And because it's just a Python program it runs anywhere.

I think Mercurial lost because Linus either didn't know about it or didn't like it and wrote git in response to the BitKeeper scandal. Then GitHub came along and didn't support Mercurial, and that sealed Mercurial's fate.


Thank you, I love stories like this. I've never touched Mercurial, SVN, etc., yet I'm always fascinated by how git became preferred by the masses.


Short answer: It was better than everything else, and the competition really sucked.

Long answer: A big factor was that new devs already knew Git because it had taken off spectacularly in the open source world. In only a few years SVN was legacy and everyone switched to Git (and Github) for open source.

Windows shops that had been using SVN had a very natural transition to Git, with lots of tooling and experience available.

But there were indeed a lot of Windows shops that used other, mostly commercial and centralized version control systems before. While they might have had some advantages over CVS and SVN or when used for enterprise, those advantages disappeared when compared to the new distributed version control systems like Mercurial and Git.

Even though Git is not exactly easy to learn, it was still vastly better than the old version control systems. In many ways the old systems just really sucked in comparison.

As for Git Bash: You could always install Git through Cygwin, but I don't remember Windows people using the command line much. There were many Git GUI applications for Windows, and IDEs provided integrations as well. E.g. Eclipse came with a completely commandline-free Git client and integrated UI that worked flawlessly on Windows.

As for Mercurial vs Git: it wasn't completely clear from the start that Git would "win", as they are very similar. I think network effects, Github, and the Linux kernel halo effect are mostly responsible for the difference in use.


Thanks for the answer. Between what you, seabass-labrax and wg0 have said, I think I understand what happened better now.


What's maybe also interesting is that the whole transition was very developer-driven, at least in my circle. It wasn't management deciding from the top.

Around me it was just everyone that wanted to use Git: potential hires, junior devs, senior devs, etc. Even the older devs with 20+ years of experience liked it well enough to not oppose the change.

And management was also happy because it didn't cost anything to run a Linux server with Git + GitWeb installed.


This matches my experience as well. When I was making the choice for switching my team from SVN to a DVCS ten or so years ago, it was not at all obvious which option was better. Went with Mercurial because Windows support was better.


I think you're right about GitHub. I preferred Mercurial but GitHub was a real draw for the git side =)


I would say, as an entirely personal impression, most Windows application developers didn't use version control systems at all throughout the 2000s.

By being the dominant operating system, Windows would have had a wider distribution of developer skill levels than Linux at the time: both the really unskilled ones and the extremely highly skilled ones, plus plenty in between. The upper end of the competence distribution would be able to install Git even on Windows, and I believe that the majority of those who did use VCS would use a proprietary, Windows-only tool such as Team Foundation Server, especially in a corporate context.

But why didn't Team Foundation Server and its ilk completely take over? I would say that the lack of FOSS licensing was a very big part, as well as the fact that Linux and almost all of its distributions were using Git, making it a sort of mandatory thing to use to participate in the FOSS world.

By the time the Git for Windows installer had progressed enough to be really easy to use in the 2010s, the concept of open source software had become so well understood even in Windows-land that everyone just gave it a go, with GitHub being an especially important factor in that. It's perhaps a bit like how people were installing VirtualBox on Windows just to get a LAMP stack, as that's what all the books and blogs said you should use.

All my own personal perspective, but much as we sometimes lament, I truly believe that the average quality of build tooling in FOSS has always been massively better than that in the proprietary software world through the 2000s and 2010s.


I’d also add that Microsoft’s hostility to open source back then was a factor: they told you that real developers used things like Visual SourceSafe, which were expensive and terrible. VSS had global file locking - literally people yelling over cube walls “who’s working on <file>?” - and was slow and unsafe (the only VCS I’ve ever seen lose data irrecoverably). Needing to run the servers wasn’t something the average developer wanted to do, either.

That doesn’t excuse not using version control, but after working at a couple of Microsoft shops I understood why some people stuck with the “periodic zip file” approach 10-20 years later than the Unix world.


Yes I remember when I applied for my first internships, several companies highlighted that they "used version control". It was a feature back then!


Indeed, "Do you use source control?" was one of Joel Spolsky's 12 questions - in fact, the first of those questions - to judge the quality of a software team in this 2000 article: https://www.joelonsoftware.com/2000/08/09/the-joel-test-12-s...


> On Windows, Git requires MSYS2, Bash, and some other tools from Linux to run

It is not required. Sure, it's the recommended way, but it's not a requirement: you can use it from the Windows command line without MSYS or Bash, and I've seen teams do that without much of a problem.

Not only that, but a lot of developers work within their IDEs, which often have a plug-in that hides away many of the details of whatever VCS they're using. Large numbers of developers just learn the basic menu options in their IDE, many without even knowing or caring about advanced features, how it works, or even what tool is behind it: SCCS, CVS, Subversion, Mercurial, Git, Perforce, SourceSafe, Plastic, StarTeam or whatever.

Having said that, of course, some tools had their particular moments of popularity. CVS was widely popular: shit, but widely popular, because VSS was even worse. SVN replaced CVS to a large extent, and for a while it was quite common.


I may be totally off base, but my impression is that from around 2000-2003 onward, developers using OS X and Linux became disproportionately influential, trendsetters of sorts.

Actually, I think the popularization of Macs in the dev world may have played a big part. Linux devs were using git earlier, but especially from the mid-to-late 2000s onward, Macs started becoming mainstays in tech-hub startups and small companies, and with git working just as well on OS X as it does on Linux, it quickly became the standard for that group of devs, which then spread to the larger dev sphere.

I remember back in the heyday of Rails, some time around 2012, when Windows users would post in community forums about having trouble getting Ruby, git, etc. set up, they'd promptly be told to either go install Linux or buy a Mac. In several dev circles around that time, Windows users were oddballs.


I share your opinion. I started my undergrad CS program in 2004, and familiarity with a Linux / Unix shell was a de facto expectation. The school and CS department servers that we had to use ran Linux. The tools that we had to use were all open-source, Linux-first. I remember asking for help once and having an upperclassman scoff and say that he couldn't help me, with my Windows machine and GUI, because he just used some command line tool.

I don't think I had heard of Linux before that time, and within a few months I realized that I needed to immerse myself by wiping Windows from my laptop and using Linux for all my day-to-day stuff. Many people opted for a Macbook instead because it gave them the same command line interface. So I think that Windows has never been the majority OS among computer scientists and, therefore, developers.


TortoiseSVN is a Subversion client that integrates with Windows Explorer (SVN commands show up in the right-click menu). Version 1.14.5 was released in September 2022, so some Windows users still use Subversion.

https://tortoisesvn.net/


You don't need MSYS2 and Bash. You can just install it with choco, and the PowerShell integration is great.
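
For anyone curious, a minimal sketch of that setup (assuming Chocolatey is already installed, and assuming posh-git is the PowerShell integration meant; other options exist):

    choco install git -y                          # Git for Windows; git ends up on PATH with the default options
    Install-Module posh-git -Scope CurrentUser    # optional: branch/status info in the PowerShell prompt
    Import-Module posh-git
    git --version                                 # verify git works from plain PowerShell or cmd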


The Windows dev culture is way less command-line-centric than *nix; most people use either their IDE or a GUI tool.

Visual Studio has supported Git since 2012ish, I think. TortoiseGit has been around since 2008ish.

So using Git on Windows has never really felt "unixy" for most Windows devs, until something goes wrong and you have to copy commands off the Internet into Git Bash and cross your fingers.

The main reason Git won on Windows is the same reason it won everywhere else, which is GitHub (and other Git hosts).


> Given that, why didn't Windows users establish something like Mercurial as the standard, given their majority/plurality status

Most software developers use Windows and work on proprietary/internal software (and the second part was even more true when Git became the standard for open collaboration). They—or rather, Microsoft—established Visual SourceSafe (succeeded later by Team Foundation Server) as the standard for that kind of use.

SVN and then Git were standards for public collaboration, which (while it still involves lots of devs on Windows) was much more centered in the Linux/macOS world (as what remains of the broader Unix/Unix-like world).

Git was successful enough that places doing proprietary/internal development on Windows use it now, too, but that wasn't how it got established.


The big advantage of Git is that you can fork anything and keep your own forks synced with the world, or not. For small projects, you never have to worry about granting permissions or any of that, and there's no "one true version" problem either.

My first version control system was pkZip and a stack of floppy disks. I went through SVN and Mercurial, but only as a way to interact with existing projects. Git is so much easier to deal with (especially once you grok that it just takes snapshots of everything). The Windows installation process is painless, so why not use it?

Git/Git GUI made it trivial to keep a Lazarus/Free Pascal project on a non-networked computer at work synced with a copy on a thumbdrive; syncing that with a copy on my desktop, and with GitHub, was all a piece of cake.
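
In case anyone wants to replicate that kind of sneakernet setup, this is roughly all it takes (the drive letter, paths, branch name, and GitHub URL are made-up examples; the commands are plain Git):

    # on the offline work machine: a bare repo on the thumbdrive acts as the remote
    git init --bare E:/repos/project.git
    git remote add usb E:/repos/project.git
    git push usb master

    # on the networked desktop: clone from the thumbdrive, then push onward to GitHub
    git clone E:/repos/project.git project
    cd project
    git remote add github https://github.com/example/project.git
    git push github master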


Nearly all devs were using Linux on the server regardless of desktop OS. So it's not like we were unfamiliar with Unix; on the contrary, it's the one technology that binds all developers together.


I don't think this is true. There's a huge Windows server ecosystem but it's "dark matter" to Unix people.


Is that a huge ecosystem of Windows servers, or an ecosystem of huge Windows servers? I know it takes some CPUs to run that tosh, but still...


Several comments note that there were no good alternatives. While Git was picking up, there was not only Mercurial, mentioned by a few others here, but also Canonical's Bazaar. https://en.wikipedia.org/wiki/GNU_Bazaar

I had personally found Bazaar to be better. It lagged behind Git in performance, though, which they improved shortly afterwards. But perhaps it was too late by then.


My guess is that many programmers like git because of the free/libre folk. I personally like git because it has an easy-to-understand GUI that many people can use even if they're not tech-savvy. My favorite GUI is GitHub Desktop, which I use to talk to my GitHub account and make commits to my personal website. I wish the other Git repository hosting sites, like GitLab and Codeberg, had an easy-to-use GUI; maybe somebody will make one.


All the reasons given are good, but I don't know how much I trust this chart. Attempting to click on any of the "source" links does nothing. Statista is a platform for visualizations that anyone can post to. It'd be nice to see the actual survey this claims to be based on.

Note that if you pick a year, let's say 2020, the percentages add up to 153%. That isn't typically the way percentages work as far as I understand them.


It means that many of them developed on more than one operating system. I myself have used Windows, Linux, Android, Raspberry Pi, and VAX/VMS in the last year or so.


Before git, there were systems like PVCS (that’s what I cut my eye teeth on).

That ran natively on both Windows and Unix.


I think because it just worked better than the alternatives. I remember working with subversion when I started at my first job. They switched to git a year later. This was a company mainly specializing in Java and doing development on Windows.


I also wonder this. SVN is probably enough for most use cases. I think Git's adoption was helped by recruiters listing it under required technologies.


I haven't used Windows in a really long time, but I believe Git Bash is still a thing and makes things easy to use.

But git just crushes everything else in capabilities. Hg is decent and maybe more appealing to Windows folks, but its branching model just isn't as nice (at least as I remember it -- it could have changed over the years).


My answer here is very subjective (may be completely wrong). Still, with that caveat stated...

Wondering why git became the standard is a bit like wondering why mammals became big and dominant a few tens of millions of years ago. It was a side effect of other, bigger changes and of being in the right place at the right time. Not that the other factors mentioned here were unimportant, but I don't think they're the high-order bit.

By the early-to-mid 2000s, Microsoft's dominance in the software space was under serious attack. The web was beginning to be a more serious place to build apps, undermining a lot of the market that MS had dominated for in-house apps (Visual Basic, etc.). Things like WHATWG, and then Safari, and eventually Chrome helped to decouple lots of apps from dependence on proprietary Windows technologies. Additionally, cloud computing was taking off in one form or another (VMware and virtualization, then actual cloud offerings like AWS), which made Linux a much more appealing platform for a lot of server-side development. These two factors, I think, helped move Unix/Linux environments into more of a mainstream place in many developers' lives.

Not that they weren't viable before, but endless stories about the titans of the new industry relying on cheap, commodity, open-source platforms and technologies certainly influenced many organizations' mindsets. (It probably didn't hurt that the transition to .NET wasn't the smoothest, not to mention Windows Vista, but that's also probably not the high-order bit.)

Another interesting data point (also probably not a high-order bit) is that in the mid '00s, the Mac became VERY attractive among at least Silicon Valley developers. The presence of a Linux-like development environment (Terminal etc.) with a spiffy UI, coupled with just being trendy (I think I'm allowed to say that, as a multi-decade fanboy :-) ), lured many of the thought leaders away from Windows. I remember being shocked when a brand new coworker demanded a Mac at work (this would have been around 2010) and actually got one... something that I'd never seen happen before. And I know of people who went to MS developer conferences in Silicon Valley in the same timeframe where everyone was using Macs. I remember one observing: "Oh, wow. MS is in trouble".

Anyway, my own view is that no one forced or made git the standard. It is more that Linux/Linux-like development environments became much, much more standard and accepted, and at that point git was "the best" tool available. If MS hadn't lost its control over the software industry, lots of us might still be using Visual SourceSafe (or its successor).


> when Windows is the majority OS.

Define 'majority OS'. :)

Windows is not the majority when you factor out those desktop users.


You can’t just hand wave away 95% of users


I haven't bothered to look at the Netcraft reports recently, but the last time I looked, Microsoft had only about 30% of servers. The other 70% were Unix/Linux.

The Internet runs on Linux mainly. Then of course most phones run Android, a Linux derivative. And Apple OSs are based on BSD, another UNIX.

Microsoft is pretty much relegated to the desktop (as I pointed out above), and the desktop is in decline due to the decline in the number of PCs sold. And a lot of those PCs, mine included, are purchased with Windows, which is immediately blown away and replaced with another OS, usually one of the Linux derivatives.


I did provide a link to Statista, which has statistics on the OS usage of software developers.



