I'm young enough that Git became the standard before I was even into programming, so I didn't witness this happen in real time.
On Windows, Git requires MSYS2, Bash, and some other tools from the Linux world to run, although the installer is polished enough that it's not much of a hassle. From what I've read secondhand, though, it wasn't always that way.
According to Statista, 61% of software developers used Windows in 2022, and the figure was similar in previous years; however, it seems some respondents were dual-booting or using multiple systems (Windows + macOS + Linux/other Unix adds up to more than 100%) [1].
Given that majority/plurality status, why didn't Windows users establish something like Mercurial as the standard? Or did something like that happen?
[1]: https://www.statista.com/statistics/869211/worldwide-software-development-operating-system/
You'd have to select the files you wanted to modify and check them out. Then they were locked; no one else on the team could touch them.

God forbid you forgot to check them in at the end of the day and then weren't at work for a week: those files stayed locked, and they didn't even have your changes.

And that's before we even get to branching.
Better than that was SVN, which was scarcely less horrible.
So the net result was that, around the 2000s, there was pretty much no concept or culture of version control among Windows developers. It was just too difficult and cumbersome.
Then came Git: offline, distributed, instant branching without creating a whole copy of the project, and so on. It came from Linux, but its ease of use and feature set made it the de facto standard in the Linux world.
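To make the "instant branching" point concrete (my own illustration, with made-up branch and commit names, nothing from any particular team's workflow): a Git branch is basically just a pointer, so creating one and switching back takes a second or two and never copies the project.

    git checkout -b bugfix-123      # create a new branch and switch to it instantly
    # ...edit files, then record the work on that branch...
    git commit -am "Fix the thing"
    git checkout master             # jump back; the project was never duplicated

Compare that with the old habit of "branching" by copying the entire source tree into another folder.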
But then came GitHub, and that's when, I'd guess, most Windows developers got to know about it.
True story: I joined a team in the late 2000s whose workflow was to literally FTP the code to production every day. They were all on Windows and didn't know any better.