
Making Windows Slower Part 2: Process Creation - nikbackm
https://randomascii.wordpress.com/2018/10/15/making-windows-slower-part-2-process-creation/
======
copper_think
Application Verifier is a Windows feature that developers use to check
program correctness. Bruce says he turned on Full Page Heap, which makes every
allocation go on a separate page, and tries to line it up so that the next
page is not readable or writable. The idea is, if you go past the end of your
memory allocation, the CPU raises a protection fault, and you catch the memory
corruption bug at its source. Application Verifier can also do other things,
like raise an error if your program ever passes an invalid HANDLE to a Windows
system call.
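The guard-page idea can be sketched in Python as a toy model (the class name and exception are made up for illustration; the real mechanism is CPU page protection, not an exception):

```python
class PageHeapAlloc:
    """Toy model of full page heap: each allocation is aligned so its
    requested bytes end right at a page boundary, and the next page is
    inaccessible. A one-byte overrun faults at the exact instruction
    that corrupts memory. Here the guard page is simulated with an
    exception instead of a CPU access violation."""

    def __init__(self, size):
        self.size = size
        self._data = bytearray(size)

    def write(self, offset, value):
        if offset >= self.size:
            # Real page heap: the CPU raises an access violation here.
            raise MemoryError(f"guard page hit at offset {offset}")
        self._data[offset] = value

a = PageHeapAlloc(13)
a.write(12, 0x41)        # last valid byte: fine
try:
    a.write(13, 0x41)    # one past the end: caught at the source
except MemoryError as e:
    print(e)
```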

Application Verifier is turned on by process name (i.e. "explorer.exe"). I
think most of the time, developers turn on an Application Verifier feature,
start just one process by that name, and then verify. But Bruce turned it on
for a short-lived process that runs in a large build. So it runs thousands of
times, and he hit this problem. It sounds like a great thing to fix, but I
don't think it affects most developers using Application Verifier, and it
definitely doesn't affect 'normal people' who are just using Windows on their
laptop or desktop.

~~~
brucedawson
I do remember a coworker at another company complaining that their hard drive
was filling up and it turned out the problem was the App Verifier log files.
In that case the individual log files were enormous so a few dozen (hundred?)
runs of the program under test was enough to be a problem. Different, but
related, and perhaps more common than the problem which I hit.

~~~
pletnes
If the log files aren't cleaned up, you'll hit this issue even if you don't
run it 30k times in a minute. I ran into a very similar problem with pytest
(Python) creating sequentially numbered temp dirs/files under /tmp on Linux.
Result: a full / partition.

~~~
PSZD
My version of this: OS (Linux) crashes caused by overclock stability testing
filled the EFI NVRAM, and I had to manually delete the crash dumps after
running into puzzling issues later on. Apparently it was possible (on some
hardware, at least) for the automatic dumps to brick your computer:
[https://bugzilla.redhat.com/show_bug.cgi?id=919485](https://bugzilla.redhat.com/show_bug.cgi?id=919485)

------
strmpnk
This reminds me of the Accidentally Quadratic series:
[https://accidentallyquadratic.tumblr.com/](https://accidentallyquadratic.tumblr.com/)

These issues are quite common and seem to grow out of an initial expectation
that N stays small, which later turns out not to be true. Scrolling through
some of the latest entries, even Chrome itself had some accidentally quadratic
complexity.
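A classic instance of the pattern, as an illustrative sketch: a membership test against a list inside a loop looks cheap per item but scans the whole list, making the loop O(N²); swapping the list for a set restores linear time.

```python
def dedupe_quadratic(items):
    """Remove duplicates, preserving order -- accidentally quadratic."""
    seen = []
    out = []
    for x in items:
        if x not in seen:   # O(len(seen)) linear scan -> O(N^2) overall
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    """Same result, but O(N): set membership is an O(1) hash lookup."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

Both functions return the same answer; only the growth rate differs, which is exactly why the bug hides until N gets large.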

~~~
brucedawson
Accidentally Quadratic looks interesting. I've certainly found lots of
accidentally quadratic algorithms throughout my career - and written a few as
well. It's amazing how badly they can blow up.

------
eitland
The biggest source of perceived slowness, in my experience, was more or less
always created by OEMs.

Bundled 3rd-party bloatware is one thing, but often the problem would be the
actual drivers or utilities that came with the machine.

"Utilities" could usually be removed, but if the drivers were bad there wasn't
much to do except try an older or newer release.

The second biggest problem was mandatory antivirus software installed by IT
departments. (That is, on my machine. For a lot of people a bigger problem was
adware and spyware.)

For the benefit of those who are younger than me: Reinstalling stock Windows
used to be standard procedure.

Today the situation feels a lot better, but I still prefer Linux.

------
Klathmon
Microsoft really needs to solve or at least mitigate these issues.

I've been using Windows as my main development environment for mainly
JavaScript and Python for like 5 years now, and I've felt every single one of
these pain points (if you haven't, go read parts 0 and 1 of the same series!
They are great!).

Some of them I've tried to mitigate (I'll be looking over my suite of tools to
see if any "file watchers" are causing issues now, as well as using the
fantastic looking tool from the article to dive into this slowness I feel),
but for the most part it's just a cost of working on Windows for me, and it's
pushing me away.

With a fraction of the power, I can get magnitudes more "perceived
performance" out of Linux or MacOS, and I'm now starting to use Linux VMs more
to get that performance back (which is such a weird sentence!)

There are things about Linux that I hate as a desktop OS (for me they always
seem to "deteriorate" over like 6 months and need a reinstall to stay stable,
and I've on more than 5 occasions installed something and restarted only to
find the machine unbootable...), and there are things I hate about MacOS (the
keyboards, the hardware requirements, being linux-ey enough that it feels
familiar, but not enough that stuff just works for me), but I can't deny that
when I work in those OSs, I'm more productive and spend less time waiting for
my machine to do little tasks, and therefore have much less aversion to them
(at this point I've almost developed a phobia of having to move, copy, or
rename large numbers of files on Windows, and it shows as a lack of
organization in projects because of how flaky and time-consuming it can be).

I know it won't be easy, but MS is going to start losing devs and eventually
users if this keeps up and the other OSs keep pulling ahead in perceived
performance.

(As an aside, and just to jump the gun a bit with the expected replies, I'm
fairly certain my Linux-as-a-desktop-os issues are self inflicted or come from
a lack of understanding of something on my part, but I haven't had the time to
dive into what they are, and a slow OS is better than a non-functional one.
Hopefully my increased usage of VMs now will iron out those issues without as
much risk)

~~~
thrower123
To be fair, the way that JavaScript development works these days, particularly
anything involving npm, is batshit insane. I should not have 400 MB of
hundreds of thousands of tiny files sucked down into my node_modules folder
for a simple web front-end, with 17 different versions of lodash required by
different packages, and dependencies nested so deeply that I can't even delete
the directory without resorting to specialized tools. It's a complete disaster
as far as being able to verify your dependencies, and we're sort of just
blanket operating on the assumption that everything is always licensed with an
MIT-alike license.

~~~
farnsworth
> and dependencies nested so deeply that I can't even delete the directory
> without resorting to specialized tools

I think this hasn't been an issue on Windows for a while.

~~~
Klathmon
It is still an issue, but it's not nearly as common anymore in JavaScript
development.

npm a while back moved to a "smarter" way of deduping and laying out files in
a more "flat" way to avoid the massive bloating and tons of copies of the same
libraries. And other tools (like yarn) do the same (or better!).

And the long-paths issue is solvable, but you need to use the funky `\\?\`
path prefix, or the tool you are using needs to support long paths on Windows
(which, annoyingly, explorer.exe doesn't, so you are stuck using hacky
workarounds or other tools to delete or modify those files).
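For illustration, the extended-length form is just the absolute path with a `\\?\` prefix (or `\\?\UNC\` for network shares). A hypothetical helper, assuming the input path is already absolute:

```python
def to_extended_length(path: str) -> str:
    """Return a Windows extended-length path, which bypasses the
    MAX_PATH limit. Hypothetical helper for illustration."""
    if path.startswith("\\\\?\\"):
        return path                      # already prefixed
    if path.startswith("\\\\"):
        # UNC path: \\server\share -> \\?\UNC\server\share
        return "\\\\?\\UNC" + path[1:]
    return "\\\\?\\" + path

print(to_extended_length(r"C:\very\long\path"))   # \\?\C:\very\long\path
```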

------
gwern
> Microsoft could fix this problem by using something other than a
> monotonically increasing log-file number. If they used the current date and
> time (to millisecond or higher resolution) as part of the file name then
> they would get log file names that were more semantically meaningful, and
> could be created extremely quickly with virtually no unique-file-search
> logic.

Does Windows have no equivalent of `mktemp`? Getting unique identifiers is a
common requirement.

~~~
LeifCarrotson
It does, but it puts them in the %appdata%/Temp folder, which can be deleted
by disk-space cleanup.

I don't think there's a built-in to get unique names in arbitrary folders,
other than the idiomatic timestamp method described.

~~~
electroly
You have it backwards. GetTempFileName() _only_ gives you unique names in
arbitrary folders! If you want to use %APPDATA%\Temp, you have to call
GetTempPath() yourself and pass that path into GetTempFileName().

------
Forge36
I've definitely seen this practice. The solution I saw to "fix" it: after 20
attempts, fail. Silently. It was a static file, but it didn't track the last
file created (of course someone had copied this static file... but that's a
separate problem).

~~~
blattimwind

      import os, time, secrets

      while True:
          name = f"log-{time.time_ns()}-{os.getpid()}-{secrets.token_hex(4)}"
          try:
              fd = os.open(name, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
              break
          except FileExistsError:
              continue

Hypothetically unbounded runtime, O(1) in practice.

~~~
WorldMaker
Don't even bother with the while True. If you get a bucket collision with a
random number, just don't log anything, or just fatally crash because you
can't trust the platform's PRNG or don't have enough available entropy or are
just plain unlucky. Keep it proper O(1).
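That single-attempt version can be sketched in Python (the function name is made up; with 128 random bits a collision means something is badly wrong, so give up rather than loop):

```python
import os
import secrets

def open_log_once(directory):
    """One attempt, strictly O(1): create a log file with a 128-bit
    random name, or give up if it somehow already exists."""
    name = os.path.join(directory, f"log-{secrets.token_hex(16)}.txt")
    try:
        return os.open(name, os.O_CREAT | os.O_EXCL | os.O_WRONLY), name
    except FileExistsError:
        # Broken PRNG, exhausted entropy, or cosmic bad luck:
        # skip logging rather than retry.
        return None, None
```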

------
hyperman1
I'm starting to get the same kind of vibes as with Mark Russinovich's
Sysinternals "The Case of the..." blog posts. Same investigative style. The
blog is unabashedly technical, hands-on, and yet crystal clear.

------
mjevans
The other two blog posts linked at the top are also entertaining and
educational reading:

    Making Windows Slower Part 0: Making VirtualAlloc arbitrarily slower
    Making Windows Slower Part 1: Making file access slower
