Dave Cutler is a legend, but he keeps a relatively low profile, so it was nice to watch the rare new video interview with him linked at the bottom. I especially liked what he said at the end:
"I have this little saying that the successful people in the world are the people that do the things that the unsuccessful ones won't. So I've always been this person that, you know.. I will build the system, I will fix the bugs, I will fix other people's bugs, I will fix build breaks. It's all part of getting the job done."
When I was a kid, I enjoyed reading "Show Stopper! The Breakneck Race to Create Windows NT and the Next Generation at Microsoft" that chronicled his move from DEC to build NT. Jeff Atwood has a quick review of it at http://blog.codinghorror.com/showstopper/
After reading this, there is so much I would give my right arm to ask him (as with other luminaries). His focus on quality is a trait that often seems to go unrewarded at his very institution. I often struggle to reconcile the success I see achieved with the merit-based success that is often insinuated, and the act of "getting things done" often seems blocked by politics and bureaucracy.
Clearly, however, he has found success through this. I can only hope he sticks around long enough that I may selfishly brush shoulders with him and share some of his wisdom. I find there are far too few who have reached that level of notoriety and accomplishment while maintaining the principles he seems to uphold, and I would love to learn from those who have how they went about surmounting their environment. (On that note, thanks for the Show Stopper link; the review was more than enough to motivate a longer read.)
> His focus on quality is a trait that often seems to go unrewarded
In most companies, a focus on quality is something you're precluded from having, and it usually falls into the same category as security and maintainability (refactoring). I've been in various planning meetings where I wasn't able to convince the dev team manager to allow time for such things, because they stubbornly wouldn't accept that these things pay dividends sooner than they believe. As a result, most code bases follow the pattern: prototype -> production -> patch, patch, patch. This is the norm, but there are also places that know better, where you don't have to convince anyone of the benefits.
Refactoring can be its own source of problems. I've seen far too many refactorings that merely exchanged one mess for another. I'm a bit sympathetic to a manager who is skeptical of refactorings.
Definitely, I've seen those kinds of refactorings too, but what I mean is that one isn't allowed to fix the architectural problems that make it hard or impossible to implement what's requested. The existing design didn't anticipate certain things, which are now impossible to implement without compromising the architecture.
One of managers' most important jobs is to keep the devs unbothered by the outside world or misguided product managers. But this can go wrong when the manager filters stuff that would have been great to implement.
I read Show Stopper when it first came out; it's a great read. It's up there with Tracy Kidder's "Soul of a New Machine", which I first read as a Data General apprentice, while spending my free time in college on the campus VAX clone (a Systime 8750).
After reading this article I was inspired to read Showstopper again. Last time I read it was early 2000s. I must have lent it out so I had to order a new copy. Looks like they reissued it a few years back...
Show Stopper is a great book. And Dave Cutler is a very unique person.
If you want to learn more about the WinNT kernel and the Windows operating system in general, look no further than the compatible open-source implementation:
Yes, I agree. It's also a fascinating window into what an achievement Windows NT was, and I say that as someone who does not use Windows anywhere.
By mid-1993, Dave Cutler and his team had released a multi-threaded, pre-emptively multitasking, purely 32-bit version of Windows that supported the Windows API and ran across three different hardware architectures (two of which were pure RISC processors). Apple, often lauded for being ahead of the curve, wouldn't release anything remotely matching those capabilities for eight years, until OS X came out. When Apple developers were just starting to port over to Cocoa and Carbon, Windows had reached deep maturity and stability with NT and its successors (still labelled as NT versions internally), and essentially all Windows software ran on it natively and had been doing so for years.
Windows isn't exactly my cup of tea but Microsoft was way, way ahead of the curve with NT. It was a superb achievement and I strongly recommend reading Show Stopper to get a sense of what the team sacrificed, went through, and achieved.
I ran the very first beta of Windows NT and it was glorious. I had never before experienced anything like it: you could compile and run your C code in a graphical debugger, and it would catch bad pointer errors by breaking the program right where the bug was. Fix, recompile, rerun.
16-bit Windows computers would crash and reboot on errors like that, leaving you with a printf error log to find the bug. Sun and Silicon Graphics workstations didn't have software like that, and were ten times more expensive. Linux was just an experiment then.
I was never able to use Windows 3.1 or 95 after that; I would search for programs that ran on NT for everything. I still do. Poorly written software crashed, but the Windows NT OS itself was immensely stable on the right hardware, from the very first beta.
I really wish Warp had beaten Windows 95. It was astounding. The kicker was that the Windows VM in OS/2 was, in some ways, actually more stable than the native apps [1].
> Ironically, if you never ran native OS/2 applications and just ran DOS and Windows apps in a VM, the operating system was much more stable.
> Meanwhile, Dave Cutler’s team at Microsoft already shipped the first version of Windows NT (version 3.1) in July of 1993. It had higher resource requirements than OS/2, but it also did a lot more: it supported multiple CPUs, and it was multiplatform, ridiculously stable and fault-tolerant, fully 32-bit with an advanced 64-bit file system, and compatible with Windows applications. (It even had networking built in.) Windows NT 3.5 was released a year later, and a major new release with the Windows 95 user interface was planned for 1996. While Windows NT struggled to find a market in the early days of its life, it did everything it was advertised to do and ended up merging with the consumer Windows 9x series by 2001 with the release of Windows XP.
OS/2 had a much nicer API (such as a message loop that did not need a Window), it was very stable, but it was also more expensive. The early versions had no GUI and hardly any applications. I tried an early version, but the only software I could find for it in my student network was a FORTRAN compiler :-). Pricing, lack of application software, and the alternative of Microsoft Office on Windows 95 killed it.
Personally, I hated Windows 95. The changes it fed into Windows NT 4 made that OS much less reliable. It took many years for the consumer market to move away from 16-bit Windows, during which both OS/2 and Windows NT were niche products.
And then Microsoft screwed up NT, over Cutler's objections, by putting in crap code from Windows 95 to make it "compatible" with programs that relied on quirks of Windows 95. It took many years to clean up that mess.
(I started with Windows NT 3.51, which was a very nice system. The 16-bit emulation module was entirely optional, and I configured it off. Worked fine, as long as you bought applications certified for NT. In Windows 95, 16-bit mode was an integral part of the system, and kludges such as 16-32 bit thunking were added so that the two modes were not so distinct.)
Not to detract from the achievement, but the transition to NT wasn't quite as rosy as that.
It took many years for apps written for Windows 3.x/95/98/etc. to catch up and run correctly on NT. In the early years, app and game producers had a distinct bias in favour of Win9x (popular consumer OS) and to the detriment of NT (esoteric, less shiny business/server OS).
The game compatibility situation, in particular, was quite miserable until Windows 2000 ("NT 5.0"). 4.0 had adopted the Win9x UI, but Win2k was arguably the first version that managed to match the consumer-oriented experience of Win9x, in particular with regard to DirectX.
I ran an NT4 / 98 dual boot system for years, and in practice the only applications that required rebooting into 98 were games and things that had a hardware component (scanner, digitizer, ...). So in my recollection the transition was actually quite smooth, games left aside.
> The game compatibility situation, in particular, was quite miserable
Win 2000 already supported DX8, and most games worked fine. The step to WinXP was very small (and interesting for games: DX9 and "compatibility mode" shims). Only the setup routines were sometimes a problem, because they checked for Win9x or even actively detected WinNT and aborted.
Though many with low-end hardware got bad performance; that was the problem. Win95 required 4 MB of memory, Win98 16 MB, WinME 32 MB. People with an old PC tried WinNT4/2000/XP and it ran slowly; no wonder.
The NT line was viewed as a resource hog and as over-architected, with a HAL and Win32 running as a subsystem. In 1996, WinNT 4 already needed 32 MB, and Win 2000 128 MB. (I had a notebook running Win2000 with just 128 MB, and it wasn't flying; it barely ran.) WinXP was also viewed as a resource hog, doubling the minimum memory requirement to 256 MB within a year and a half; only with 512 MB did it run really well. (I bought a new PC with 512 MB in 2001 and never looked back at the DOS-based Win9x line. 99% of all games worked just fine, and for the rest there was DOSBox/Bochs/QEMU/VMware.)
I wish one could buy a Win10 build that came with the Win10 kernel, the Win2000 shell, and none of the spying and tracking crap. A modern OS could still be very fast and consume far fewer hardware resources.
I had a run of dual-processor machines starting in the mid-90s and moved most of my work to NT around NT 3.51 (games and such were still on Win9x). Knowing that one of my CPUs wasn't in use under 9x was a pretty strong motivator to boot back into NT, especially for dev purposes.
That said, I too had issues with the betas of NT 3.1. But that didn't stop us from moving all our products to NT, with our first commercial sales and installs of a 32-bit-clean server application in early 1994. In late '94 another company wanted us to port our wares to a high-availability Solaris system, and I started down that path, but the project got canned when it became apparent that the HA hardware only ran an older version of SunOS/Solaris that didn't support threads, and our application's core was built around multithreading.