I would say, as an entirely personal impression, most Windows application developers didn't use version control systems at all throughout the 2000s.
As the dominant operating system, Windows would have had a wider distribution of developer skill levels than Linux at the time: the really unskilled, the extremely skilled, and plenty in between. The upper end of that competence distribution could install Git even on Windows, but I believe the majority of those who did use a VCS reached for a proprietary, Windows-only tool such as Team Foundation Server, especially in a corporate context.
But why didn't Team Foundation Server and its ilk completely take over? I would say the lack of FOSS licensing was a very big part of it, as well as the fact that Linux and almost all of its distributions were using Git, which made Git effectively mandatory for anyone who wanted to participate in the FOSS world.
By the time the Git for Windows installer had matured enough to be genuinely easy to use in the 2010s, open source software had become so well understood even in Windows-land that everyone just gave it a go, with GitHub being an especially important factor. It's a bit like how people installed VirtualBox on Windows just to get a LAMP stack, because that's what all the books and blogs said to use.
This is all my own personal perspective, but much as we sometimes lament it, I truly believe that the average quality of build tooling in FOSS was massively better than in the proprietary software world throughout the 2000s and 2010s.
I’d also add that Microsoft’s hostility to open source back then was a factor: they told you that real developers used tools like Visual SourceSafe, which were expensive and terrible. VSS had global file locking - literally people yelling over cube walls “who’s working on <file>?” - and it was slow and unsafe (the only VCS I’ve ever seen lose data irrecoverably). Having to run the servers wasn’t something the average developer wanted to do, either.
That doesn’t excuse not using version control, but after working at a couple of Microsoft shops I understood why some people stuck with the “periodic zip file” approach 10-20 years later than the Unix world did.