Application Verifier is turned on by process name (e.g. "explorer.exe"). I think most of the time, developers turn on an Application Verifier feature, start just one process by that name, and then verify. But Bruce turned it on for a short-lived process that runs in a large build. So it runs thousands of times, and he hit this problem. It sounds like a great thing to fix, but I don't think it affects most developers using Application Verifier, and it definitely doesn't affect 'normal people' who are just using Windows on their laptop or desktop.
These issues are quite common and seem to grow out of an initial expectation that N stays small, which turns out not to hold for cases discovered later. Scrolling through some of the latest entries, even Chrome itself had some accidentally quadratic complexity.
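A toy illustration (my own, not from any specific Chrome bug) of how an innocuous pattern turns quadratic once N stops being small:

```python
def build_reversed_slow(items):
    """Build a reversed copy by inserting at the front.

    Each insert at position 0 shifts every existing element, so building a
    list of n items this way does roughly n^2/2 element moves: fine for
    n = 100, painful for n = 1,000,000.
    """
    out = []
    for x in items:
        out.insert(0, x)  # O(len(out)) per call, O(n^2) total
    return out


def build_reversed_fast(items):
    """Linear alternative: copy once (O(n)), then reverse in place (O(n))."""
    out = list(items)
    out.reverse()
    return out
```

Both return the same result; only the growth rate differs, which is exactly why the slow version survives code review when N is small in testing.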
Bundled 3rd-party bloatware is one thing, but often the problem would be the actual drivers or utilities that came with the machine.
"Utilities" could usually be removed, but if the drivers were bad there wasn't much to do except try an older or newer release.
The second biggest problem was mandatory antivirus software installed by IT departments. (That is, on my machine. For a lot of people a bigger problem was adware and spyware.)
For the benefit of those who are younger than me: Reinstalling stock Windows used to be standard procedure.
Today the situation feels a lot better, but I still prefer Linux.
Some of them I've tried to mitigate (I'll be looking over my suite of tools to see if any "file watchers" are causing issues now, as well as using the fantastic looking tool from the article to dive into this slowness I feel), but for the most part it's just a cost of working on Windows for me, and it's pushing me away.
With a fraction of the power, I can get orders of magnitude more "perceived performance" out of Linux or MacOS, and I'm now starting to use Linux VMs more to get that performance back (which is such a weird sentence!)
There are things about Linux that I hate as a desktop OS (for me it always seems to "deteriorate" over about six months and need a reinstall to stay stable, and on more than five occasions I've installed something and restarted only to find the machine unbootable...), and there are things I hate about MacOS (the keyboards, the hardware requirements, being Linux-ey enough that it feels familiar, but not enough that stuff just works for me). But I can't deny that when I work in those OSs, I'm more productive and spend less time waiting for my machine to do little tasks, and therefore have much less aversion to them. At this point I've almost developed a phobia of having to move, copy, or rename large numbers of files on Windows, and it shows as a lack of organization in my projects because of how flaky and time-consuming those operations can be.
I know it won't be easy, but MS is going to start losing devs, and eventually users, if this keeps up and the other OSs keep pulling ahead in perceived performance.
(As an aside, and just to jump the gun a bit with the expected replies, I'm fairly certain my Linux-as-a-desktop-OS issues are self-inflicted or come from a lack of understanding of something on my part, but I haven't had the time to dive into what they are, and a slow OS is better than a non-functional one. Hopefully my increased usage of VMs will iron out those issues without as much risk.)
I think this hasn't been an issue on Windows for a while.
npm moved a while back to a "smarter" way of deduping, laying out files in a flatter tree to avoid the massive bloat and tons of copies of the same libraries. Other tools (like yarn) do the same (or better!).
And the long-paths issue is solvable, but you need to use the funky `\\?\` path prefix, or the tool you are using needs to support long paths on Windows (which, annoyingly, explorer.exe doesn't, so you are stuck using hacky workarounds or other tools to delete or modify those files).
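For the curious, here is a minimal sketch of the prefix trick in Python (the helper name is mine; the prefix only has an effect on Windows, where it tells Win32 APIs to accept paths beyond the ~260-character MAX_PATH limit):

```python
import os
import sys


def extended_length_path(path):
    # The "\\?\" prefix bypasses Win32 path-length normalization; it must be
    # applied to an absolute path with backslash separators. On non-Windows
    # platforms we just return the absolute path unchanged.
    p = os.path.abspath(path)
    if sys.platform == "win32" and not p.startswith("\\\\?\\"):
        p = "\\\\?\\" + p
    return p
```

With this, something like `os.remove(extended_length_path(very_long_path))` can touch files that Explorer refuses to delete.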
Linux on the other hand is spectacular here. In my experience it's marginally better than Mac on the same hardware (and way better than Windows!). I threw Linux on a spinning-disk Mac Mini a while back with a really slow disk and found that it didn't matter much. Macs with SSDs are so much faster in part because MacOS seems less efficient on the disk front, and SSDs make that less of a bottleneck.
Linux is open source. Why hasn't everyone copied what Linux is doing? What is Linux doing?
Building a kernel on Linux would bring my machine's interactivity down to zero; even the mouse pointer would stop responding. And this wasn't even two or three decades ago: this was in 2014, and I've never since touched the pile of crap that Ubuntu is.
Maybe because it’s not Linux but a BSD-based Unix?
Needing to update bash and coreutils on MacOS sucks, and even then I can't rely on all the other developers I work with having those same tools installed, up to date, and installed the same way. So I deal with the provided versions and their differences and quirks.
Even OS X would feel like state of the art.
Commercial unixes are basically dead, and for very good reasons. Good riddance.
Does Windows have no equivalent of `mktemp`? Getting unique identifiers is a common requirement.
I don't think there's a built-in way to get unique names in arbitrary folders, other than the idiomatic timestamp method described.
I suspect the other reason not to use some mktemp equivalent is that they wanted the names to have some logical order to them; it just turns out that ascending file numbers are a bad way to do this.
`fd = open("log-$timestamp-$pid-$randomhex", O_CREAT|O_EXCL)`
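Spelled out in Python (a sketch; the naming scheme is illustrative, not any particular tool's), the same collision-proof pattern looks like:

```python
import os
import secrets
import time


def create_log_file(directory):
    """Create a uniquely named log file, retrying on the unlikely collision.

    O_CREAT | O_EXCL makes the create atomic: if two processes race on the
    same name, exactly one open succeeds and the loser retries with fresh
    randomness. No quadratic "scan for the next free number" loop needed.
    """
    while True:
        name = "log-%d-%d-%s" % (int(time.time()), os.getpid(),
                                 secrets.token_hex(4))
        path = os.path.join(directory, name)
        try:
            fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY, 0o644)
            return fd, path
        except FileExistsError:
            continue  # someone else grabbed this exact name; try again
```

The timestamp keeps the names roughly sorted by creation time, while the PID and random suffix guarantee uniqueness without coordination.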
Making Windows Slower Part 0: Making VirtualAlloc arbitrarily slower
Making Windows Slower Part 1: Making file access slower