The engineer’s engineer: Computer industry luminaries salute Dave Cutler (microsoft.com)
233 points by rozzie on April 15, 2016 | 75 comments



Dave Cutler is a legend, but he keeps a relatively low profile, so it was nice to watch the rare new video interview with him at the bottom. I especially liked what he said at the end:

"I have this little saying that the successful people in the world are the people that do the things that the unsuccessful ones won't. So I've always been this person that, you know.. I will build the system, I will fix the bugs, I will fix other people's bugs, I will fix build breaks. It's all part of getting the job done."

When I was a kid, I enjoyed reading "Show Stopper! The Breakneck Race to Create Windows NT and the Next Generation at Microsoft" that chronicled his move from DEC to build NT. Jeff Atwood has a quick review of it at http://blog.codinghorror.com/showstopper/


After reading this, I have so much I would give my right arm to ask him (similar to my sentiments on other luminaries). His focus on quality is a trait that often seems to go unrewarded in his very institution. I often struggle to reconcile the success I actually see achieved with the merit-based success that is often insinuated, and the act of "getting things done" often seems blocked by politics and bureaucracy.

Clearly, however, he's found success through this. I can only hope he sticks around long enough that I may selfishly brush shoulders with him and share some of his wisdom. I find there to be far too few who have managed to reach that level of notoriety and accomplishment while maintaining the principles he seems to uphold, and I would love to learn from those who have about how they approached surmounting their environment. (On that note, thanks for the Showstopper link; the review was more than sufficient to motivate a longer read.)


> His focus on quality is a trait that often seems to go unrewarded

In most companies, a focus on quality is something you're precluded from having, and it usually falls in the same category as security and maintainability (refactoring). I've been in various planning meetings where I wasn't able to convince the dev team manager to allow time for such things, because they stubbornly wouldn't accept that these things pay dividends sooner than they believe. As a result, most code bases go prototype -> production -> patch, patch, patch. This is the norm, but there are also places that know better, where you don't have to convince anyone of the benefits.


Refactoring can be its own source of problems. I've seen far too many refactorings that merely exchanged one mess for another. I'm a bit sympathetic to a manager who is skeptical of refactorings.


Definitely, I've seen those kinds of refactorings too, but what I mean is that one isn't allowed to fix the architectural problems that make it hard or impossible to implement what's requested. The existing design didn't anticipate certain things, which are now impossible to implement without compromising the architecture.


One of managers' most important jobs is to keep the devs unbothered by the outside world or misguided product managers. But this can go wrong when the manager filters stuff that would have been great to implement.


I read Show Stopper when it first came out; it's a great read. It's up there with Tracy Kidder's "Soul of a New Machine", which I first read when I was a Data General apprentice, whilst spending free time in college on the campus VAX clone (a Systime 8750).


After reading this article I was inspired to read Showstopper again. Last time I read it was early 2000s. I must have lent it out so I had to order a new copy. Looks like they reissued it a few years back...


Show Stopper is a great book. And Dave Cutler is a very unique person.

If you want to learn more about the WinNT kernel and the Windows operating system in general, look no further than the compatible open-source implementation:

http://reactos.org

(ReactOS is to the Windows NT family roughly what Linux is to Unix)


I just started reading this book; I'm maybe halfway through, and it's very interesting!

Some of the technical analogies are a little weak, but all the quotes and anecdotes are great.


Yes, I agree. It's also a fascinating window into what an achievement Windows NT was, and I say that as someone who does not use Windows anywhere.

By mid-1993 Dave Cutler and his team had released a multi-threaded, pre-emptive multitasking, pure 32-bit version of Windows that supported the Windows API and ran across three different hardware architectures (two of which were pure RISC processors). Apple, often lauded for being ahead of the curve, wouldn't release anything remotely matching those capabilities for another eight years, when OS X came out. When Apple developers were just starting to port over to Cocoa and Carbon, Windows had reached deep maturity and stability with NT and its successors (still labelled as NT versions internally), and basically all Windows software ran on it natively and had been doing so for years.

Windows isn't exactly my cup of tea but Microsoft was way, way ahead of the curve with NT. It was a superb achievement and I strongly recommend reading Show Stopper to get a sense of what the team sacrificed, went through, and achieved.


I ran the very first beta of Windows NT and it was glorious. I had never before experienced anything like it: you could compile and run your C code in a graphical debugger, and it would catch bad pointer errors by breaking the program right where the bug was. Fix, recompile, rerun.
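
As a concrete example of the kind of bug in question (a hypothetical snippet, not from the book or the thread): a stray write through a bad pointer raises an access violation that NT confines to the offending process, and the debugger stops on the faulting line.

    #include <string.h>

    int main(void)
    {
        char *buffer = NULL;            /* forgot to allocate the buffer */

        /*
         * Under NT the debugger breaks right here with an access violation
         * (exception code 0xC0000005); the rest of the system keeps running.
         */
        strcpy(buffer, "hello");

        return 0;
    }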

16-bit Windows machines would crash and reboot on errors like that, leaving you with a printf error log to find the bug. Sun and Silicon Graphics workstations did not have software like that, and they were ten times more expensive. Linux was just an experiment then.

I was never able to use Windows 3.1 and 95 after that; I would search for programs that ran on NT for everything. I still do. Poorly written software crashed, but the Windows NT OS itself was immensely stable on the right hardware, from the very first beta.


The only thing that (IMO) came close was OS/2.

I really wish Warp had beaten Windows 95. It was astounding. The kicker was that the Windows VM in OS/2 was, in some ways, actually more stable than the native apps [1].

> Ironically, if you never ran native OS/2 applications and just ran DOS and Windows apps in a VM, the operating system was much more stable.

[1]: http://arstechnica.com/business/2013/11/half-an-operating-sy...

EDIT: From the same article:

> Meanwhile, Dave Cutler’s team at Microsoft already shipped the first version of Windows NT (version 3.1) in July of 1993. It had higher resource requirements than OS/2, but it also did a lot more: it supported multiple CPUs, and it was multiplatform, ridiculously stable and fault-tolerant, fully 32-bit with an advanced 64-bit file system, and compatible with Windows applications. (It even had networking built in.) Windows NT 3.5 was released a year later, and a major new release with the Windows 95 user interface was planned for 1996. While Windows NT struggled to find a market in the early days of its life, it did everything it was advertised to do and ended up merging with the consumer Windows 9x series by 2001 with the release of Windows XP.


OS/2 had a much nicer API (such as a message loop that did not need a window) and it was very stable, but it was also more expensive. The early versions had no GUI and hardly any applications. I tried an early version, but the only software I could find for it on my student network was a FORTRAN compiler :-). Pricing, the lack of application software, and the alternative of Microsoft Office on Windows 95 killed it.

Personally I hated Windows 95. It introduced changes to Windows NT 4 that made the OS much less reliable. It took many years for the consumer market to get away from 16-bit Windows, a period during which both OS/2 and Windows NT were niche products.


That OS/2 2.0 fiasco is one of my favorite topics, with NT originally being "NT OS/2". I have a bad opinion of it.


And then Microsoft screwed up NT, over Cutler's objections, by putting in crap code from Windows 95 to make it "compatible" with programs that relied on quirks of Windows 95. It took many years to clean up that mess.

(I started with Windows NT 3.51, which was a very nice system. The 16-bit emulation module was entirely optional, and I configured it off. Worked fine, as long as you bought applications certified for NT. In Windows 95, 16-bit mode was an integral part of the system, and kludges such as 16-32 bit thunking were added so that the two modes were not so distinct.)


Did any of that crap code from Windows 95 go in the kernel, or was it just in user space?


A lot of it went into the kernel for Windows XP. It took until Windows 7 to clean up the mess inside.


They moved the entire graphics subsystem into the kernel because it was "too slow" in user mode.


Yes. Then Microsoft hired Mark Russinovich, the guy who ran "ntinternals.com" and demonstrated it wasn't too slow, to shut him up.[1]

[1] http://windowsitpro.com/windows-server/did-microsoft-shut-do...


Not to detract from the achievement, but the transition to NT wasn't quite as rosy as that.

It took many years for apps written for Windows 3.x/95/98/etc. to catch up and run correctly on NT. In the early years, app and game producers had a distinct bias in favour of Win9x (the popular consumer OS), to the detriment of NT (the esoteric, less shiny business/server OS).

The game compatibility situation, in particular, was quite miserable until Windows 2000 ("NT 5.0"). 4.0 had adopted the Win9x UI, but Win2k was arguably the first version that managed to match the consumer-oriented experience of Win9x, in particular with regard to DirectX.


I ran an NT4 / 98 dual boot system for years, and in practice the only applications that required rebooting into 98 were games and things that had a hardware component (scanner, digitizer, ...). So in my recollection the transition was actually quite smooth, games left aside.


One of the things that didn't run on NT4 was the AOL client, which was a pretty big thing for consumers at that time.


> The game compatibility situation, in particular, was quite miserable

Win 2000 already supported DX8 and most games worked fine. The step to WinXP was very small (and for games interesting: DX9 and "compatibility mode" shims). Only the setup routines were sometimes a problem because they checked for Win9x or even actively detected WinNT and aborted.

Many people with low-end hardware got bad performance, though; that was the real problem. Win95 required 4 MB of memory, Win98 16 MB, WinME 32 MB. People with an old PC tried WinNT4/2000/XP and it ran slowly; no wonder.

The NT line was viewed as a resource hog and as over-architected, with a HAL and Win32 running as subsystems. In 1996, WinNT 4 already needed 32 MB, and Win2000 128 MB. (I had a notebook running Win2000 with just 128 MB, and it wasn't flying; it barely ran.) WinXP was also viewed as a resource hog for doubling the minimum memory requirement to 256 MB within a year and a half; only with 512 MB did it run really well. (I bought a new PC with 512 MB in 2001 and never looked back at the DOS-based Win9x line. 99% of all games worked just fine, and for the rest there was DOSBox/Bochs/QEMU/VMware.)

I wish one could buy a Win10 build that comes with the Win10 kernel, the Win2000 shell, and no spying & tracking crap. A modern OS could still be very fast and consume far fewer hardware resources.


I had a run of dual-processor machines starting in the mid '90s and moved most of my work to NT around NT 3.51 (games and such were still on Win9x). Knowing that one of my CPUs wasn't in use under 9x was a pretty strong motivator to boot back to NT, especially for dev purposes.

That said, I too had issues with the betas of NT 3.1. But that didn't stop us from moving all our products to NT, with our first commercial sales and installs of a 32-bit-clean server application in early 1994. In late '94 another company wanted us to port our wares to a high-availability Solaris system; I started down that path, but the project got canned when it became apparent that the HA hardware only ran an older version of SunOS/Solaris that didn't support threads, and our application's core was built around multithreading.


If you wanted to see an OS that really was way ahead of the curve, NeXTSTEP was it.

By mid-1993 it was already on its third version (running on a Motorola 68040, no less) and was already doing everything that NT could.

I really love watching the old NeXT presentations on YouTube: https://www.youtube.com/watch?v=H07Xjom_GQA


Which video interview is that quote from?



Thank you :).


I worked on a project (while a Microsoft employee) where we were told that the coding standard was CNF (Cutler Normal Form). Naively I asked what this was, whether there was a document etc. The answer was to read Dave's code and write in the same style.


Do you remember any details from Cutler Normal Form? I'd love to learn more about this.

From a quick search I see that it uses braces at the end. I'd be curious if CNF encouraged subsystem prefixes (e.g. the "Ke" in "KeBugCheck").

I wonder if CNF had any influence on "Systems Hungarian" (e.g. "dwCreationFlags"), which dominates the Win32 API, or if that came about as an accidental offshoot of Simonyi's "Application Hungarian".
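
For anyone unfamiliar with the distinction, the flavors look roughly like this (hypothetical declarations to make the question concrete, not real Windows code):

    typedef unsigned long DWORD;

    /* Systems Hungarian: the prefix encodes the C type. */
    DWORD dwCreationFlags;          /* dw   = DWORD                                   */
    char *lpszCommandLine;          /* lpsz = long pointer to a zero-terminated string */

    /* Application Hungarian (Simonyi's original scheme): the prefix encodes meaning. */
    int ichFirst;                   /* ich = index of a character */
    int cbBuffer;                   /* cb  = count of bytes       */

    /* NT kernel convention: no type prefixes, but a subsystem tag on routine names. */
    void KeExampleRoutine(void);    /* Ke = kernel core    (hypothetical routine) */
    void ObExampleRoutine(void);    /* Ob = object manager (hypothetical routine) */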


I don't remember much Hungarian in CNF, but I do remember a lot of whitespace...

  #include <stdio.h>

  //
  // Comments are always surrounded both by empty comment lines
  // And by whitespace
  //

  //
  // The return type is always on its own line.
  // The function name always starts at column 0 and is whitespace delimited,
  // this (allegedly) was a consequence of search macros that Cutler used
  //
 
  int
  main (
      int argc,
      char **argv)
  {
      //
      // Even a one line comment ends up taking up at least 4 lines
      //

      printf("Hello, World!\n");

      if (argc == 2)
      {
          //
          // All blocks always use braces
          //
      }

      return 0;
  }


Cutler Normal Form has open braces at the end of the line, so it should be

    if (argc == 2) {
   
        //
        // explanatory comment
        //
    }
Also, Dave hates Hungarian notation of any flavor.


Thanks for the example! I especially found the commenting style and return type on its own line to be interesting.

Regarding placement of curly braces, did you see them on their own line or at the end? This article seems to imply that CNF had braces at the end: https://blogs.msdn.microsoft.com/peterwie/2008/02/04/pedanti...

I'm interested because it seems that lower-level systems programming at Microsoft put braces at the end (early .NET BCL, maybe even the first C# compiler's C++ code, the TypeScript compiler source, etc.), but Microsoft documentation tended to put braces on their own line, and that then became the de facto style that shows up in their open-source code.


I also used to work at Microsoft. CNF was primarily used in the kernel and driver code but rarely in userland code, with the exception of certain tools that were developed and maintained by the kernel team (such as powercfg.exe). CNF was one of the few coding styles at Microsoft that I really enjoyed using due to how clean and disciplined it made the code. I felt that the style really reflected Dave Cutler's attention to detail and quality.

You can see more examples of CNF in the now open-sourced Windows-Driver-Frameworks: https://github.com/Microsoft/Windows-Driver-Frameworks


I've recently adopted CNF for Windows-based C work and I absolutely love it.


My recollection is that almost all the code I wrote at Microsoft used Allman style; the only exception I remember was Midori, which used K&R.

https://en.wikipedia.org/wiki/Indent_style


Bell believes Cutler is the only engineer with the confidence to pull off NT as he did. “Almost anyone who would have been good enough to do NT would have insisted on a blank sheet for the spec,” Bell said. “Dave appreciates legacy and compatibility.”

That seems to be what is missing most with modern software development. Everyone thinks they can do it better, and in the end repeats the same mistakes while creating new ones.


Hmmm, I still wonder what Windows would look like today had Microsoft chosen the same path as Apple: create an emulation box for old 'Classic' apps and a clean new system for new-style apps.


That's pretty much what they did, but as a fundamental architecture of the OS. The "emulation box" is the Win32 subsystem: applications interact with the API in the subsystem, which translates Win32 calls into the native NT API. They also created a POSIX subsystem and an OS/2 subsystem.
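
As a rough sketch of that layering (not the exact call path in any particular Windows release, and the file path is just a placeholder): an application makes an ordinary Win32 call, and the subsystem DLLs translate it into the native NT system service.

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /*
         * The program only sees the documented Win32 API. kernel32 translates
         * this call into ntdll's native stub (NtCreateFile), which traps into
         * the NT executive, where the real work happens.
         */
        HANDLE file = CreateFileW(L"C:\\temp\\example.txt",  /* placeholder path */
                                  GENERIC_READ,
                                  FILE_SHARE_READ,
                                  NULL,                       /* default security */
                                  OPEN_EXISTING,
                                  FILE_ATTRIBUTE_NORMAL,
                                  NULL);

        if (file == INVALID_HANDLE_VALUE) {
            printf("CreateFileW failed: %lu\n", GetLastError());
            return 1;
        }

        /* The handle itself comes back from the NT object manager. */
        CloseHandle(file);
        return 0;
    }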


You just jogged my memory about WoW (Windows on Windows) :)


When I was working on Xbox One, I got all the checkin mails. Whenever I had time, I'd look for Dave's changes and see what he did. Nearly all of them were over my head because they were so deep in the OS. The ones I understood were usually fixing other people's bugs or build breaks like he mentions in the video. It was definitely a highlight to see the actual work of someone so disciplined and dedicated to their craft.


I happened to come across this article recently, which I think demonstrates how competitive Dave was in his youth: http://www.lansingstatejournal.com/story/sports/2015/07/23/d...


Little known factoid:

When Cutler came to MS, and worked on NT, he added a cool VMS easter egg in the name WinNT:

W --> -1 --> V

N --> -1 --> M

T --> -1 --> S
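
A throwaway illustration of the shift, just for fun (not anything from the thread):

    #include <stdio.h>

    int main(void)
    {
        const char *wnt = "WNT";

        /* Prints W -> V, N -> M, T -> S: shifting WNT back by one recovers VMS,
           so WNT = VMS + 1, the same trick as IBM = HAL + 1. */
        for (const char *p = wnt; *p != '\0'; p++) {
            printf("%c -> %c\n", *p, *p - 1);
        }

        return 0;
    }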

VMS is/was a great OS. I was fortunate enough to work on it on a satellite communication system straight out of college. I recall adding the TCP/IP drivers, achieving connectivity across large wide area networks, and setting up one of the first wireless ISDN video conference systems, in this case between former East and West Berlin and Moscow. There were many similar projects around the world, and VMS was a lot of fun and very stable to work with.


The Wikipedia article for Windows NT references this, but also references a statement from Mark Lucovsky stating that the original target for NT, the i860 processor, was codenamed "N-Ten" and that the "NT" name came from the target processor.

It would be mildly interesting to get confirmation of the real source.


I can only imagine the hate this guy's going to get here for NT... but I'm a huge fan.

One of my first jobs was running a VMS cluster... then I went on to work with Windows. Enjoyed it all. Dave Cutler is brilliant.

Thanks Dave!


Core NT is, or at least was prior to NT 4 putting graphics in the kernel, very, very nice. It's mostly what Microsoft did on top of it, and their general stewardship of it, that I have a problem with.

For the latter, I noticed a serious drop in quality starting with 3.51 SP2, which, as best I can tell from Google, came out about nine months before NT 4.


Exactly. Windows 8 and 10 include substantial improvements to the underpinnings that would be worth the switch from Windows 7 if Microsoft hadn't basically killed the UI. This stupid desire to change things for every new version, as is done with cars, does nobody any good.

- 3D widgets serve a purpose and make it much easier to see what's a widget and what's not

- Windows XP theme engine was great and it got killed in Windows 7

- the Windows XP and Windows 2000 UI was to the point and didn't waste screen real estate the way 7, 8, and 10 do, along with current OS X, GNOME, and KDE. First we complained that open-source desktops had no designers; now we should curse the day we wished for designers to get involved.

- all the Windows tools and subsystems got much better and more featureful, and there are new filesystems and subsystems, but you cannot take the core and slap a different desktop environment on it like you can with Unixes. Back in the day you could use a different Windows shell; I don't know if that's still possible (a quick way to check is sketched below).
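
A minimal read-only sketch of that check (link against advapi32), assuming the classic Winlogon "Shell" value, which defaults to explorer.exe, is still the one consulted:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        wchar_t shell[MAX_PATH];
        DWORD size = sizeof(shell);

        /* Read the machine-wide shell setting; actually swapping the shell
           would mean writing a different program name here (not shown). */
        LSTATUS rc = RegGetValueW(HKEY_LOCAL_MACHINE,
                                  L"SOFTWARE\\Microsoft\\Windows NT\\CurrentVersion\\Winlogon",
                                  L"Shell",
                                  RRF_RT_REG_SZ,
                                  NULL,
                                  shell,
                                  &size);

        if (rc == ERROR_SUCCESS) {
            wprintf(L"Configured shell: %ls\n", shell);
        } else {
            wprintf(L"RegGetValueW failed: %ld\n", rc);
        }
        return 0;
    }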

I have this suspicion that Cutler has a private build of Windows on his machine which still has the Classic UI, but the product managers would never allow that to ship, because that would mean they were wrong to have killed it off in the first place. This constant desire to change the UI is unbelievable and hard to explain. Imagine if new knives followed the same mantra. Many things work best the way they are, with no need to change, and if change is wanted, something like Windows XP's theme engine is more than sufficient.


Having switched to Windows 10 after using OS X and Linux primarily for the past 10 years or so, I have to say it's been fine for me. The UI is simple: hit the Windows key and type the name of the program you want to launch. Switch virtual desktops with Ctrl+Win+Left/Right. Snap windows to the left or right with Win+Left/Right. And now it's actually usable for dev with the Windows Subsystem for Linux feature.

Oh, and by the way: games just work, Chrome just works, Spotify just works, IntelliJ just works, and all of this can be installed from the CLI with Chocolatey.


Yeah, like I said, many things have been improved, but they come with just as many regressions in the UI, mostly made while chasing the wrong goals (tablet unification) or misguided design (going back to a Windows 1 flat UI). In some ways Microsoft's Server Core variant might be a better Windows for development, as Windows Server 2003 was back in the day, but I doubt it'd be fully functional on the desktop. These are just a few of the things that will hold Microsoft back compared to Linux and BSD when you want to do more than just Visual Studio and Steam games.


Holding back in what sense?

Personally, after a decade of dual-booting between Windows and GNU/Linux, I have focused again on just Windows, with GNU/Linux in VMs.

At work, I seldom see anyone using anything other than Windows on the desktop, except for a few Macs that are shared across the company for the occasional iOS project.

At customer sites there is always a mixture of UNIX and Windows deployments on their server farms.


He's a great engineer and, together with his equally great team, he designed what turned out to be a great kernel powering countless machines around the world.

There is very little to hate about the actual NT kernel.

Kudos to Dave, instead.


I worked on VAX/VMS mostly with FORTRAN and loved the hardware and DCL (the Digital Command Language). I ended up founding and building a small company around RDBMS-based manufacturing applications, with VAX/VMS as the initial platform we supported. Then we moved to UNIX and were glad we did when DEC started to tank.

I went to NT as soon as it was available. It helped me with my productivity but it did take a long time for the applications to catch up with the OS.


No hate. Dave Cutler is the real deal with a long line of successful and groundbreaking systems. NT kernel is the work of a pro.

Why do you imagine that technical people will look down on him despite groundbreaking innovations and steady decades' worth of work going back to VMS? That's hardcore on a level reached by only a few, and only those also mentioned in the article (Alan Kay, Vinton Cerf, and Tim Berners-Lee) are equals to Dave Cutler.


> The small, high-powered team of former DEC engineers and one existing Microsoft employee (Steve Wood) began by spending six months creating a specification for the operating system ­­– the specification that now resides in the Smithsonian Institute.

I had to look for this; information is available here [1], but not an electronic copy. Some Googling turned up this [2], but I'm not sure how authentic it is.

[1] http://americanhistory.si.edu/collections/search/object/nmah...

[2] http://gate.upm.ro/os/LABs/Windows_OS_Internals_Curriculum_R...


Wait a minute, rozzie? You mean, Ray Ozzie is posting about Dave Cutler? Wow.


I just noticed who the OP is. Kind of cool to see Ray active on HN!


One of Ray's last comments was about Dave:

https://news.ycombinator.com/item?id=7540544


The article mentions that the original spec for NT is in the Smithsonian, and indeed it is so:

http://americanhistory.si.edu/collections/search/object/nmah...

I'd pay good money for a digitized copy. I wonder if there are any legal or Microsoft barriers to that happening someday.


One of my neatest hardware acquisitions back in the day was a MIPS Magnum R4000 workstation - running NT 4.0.

It was great, stable, and ran Windows and DOS apps in emulation (sort of like DEC's FX!32 for NT/Alpha).

Per Microsoft: "On RISC-based computers, Windows NT provides a 486 emulator that runs applications designed for 286, 386, or higher processors."


I audited a security course at UW a few years back, and one evening Cutler stopped by as a guest lecturer.


Sadly, it appears that Dave Cutler has not had much influence on the worldwide community of software engineers and programmers. I assume this is because his work is proprietary and he can't discuss it in public.


Dave Cutler is my personal aspirational model. When I am 70, I want to still be writing production code, rather than languishing in middle management.

Engineers who choose to "trade up" to management will always baffle me.


You have to remember that 90% of programmers work on fairly boring business applications.

They don't get to work on cutting-edge stuff like Cutler does.


Cutler: "I will fix the bugs, I will fix other people's bugs, I will fix build breaks."


It's a personal decision. No need to hate on it.

There was a recent entry here on HN where a developer discussed their regret at not having moved to the managerial path. I am trying to find it, but it's taking me some time.

It's a tough decision.


I'd be interested in seeing this post as well. As someone whose intellectual firepower is probably split 50/50 between convergent and divergent thinking styles, it's pretty difficult for me to find a fit. Project manager seems like the closest role I can find that satisfies my skills and tastes.



It's a question of salary. If you are not an engineering superstar, you will often hit a hard limit on your salary as an engineer. You can make more as a manager.


Would you prefer to have a manager who wasn't an engineer previously?


A manager has to have the right skills to manage a project and lead people. That's usually not something you find in engineers, so, as with teachers, it's not the expert knowledge (e.g. tech) but the soft skills that count most for a manager. Therefore, a great manager doesn't have to know how to code at all. I haven't seen a single dev lead promoted from developer to manager who actually managed the team/project reasonably.


I would like to be offered the choice. That is, most managers come from "the business" or somewhere similar, have never been engineers, and don't want to have been engineers. Given a choice, I'd pick the manager who was an engineer, but those folks are pretty rare.


Say what you will but I still prefer the VAX to Windows.


Cutler is by far my favorite engineer. Great story and amazing accomplishments.


I don't really feel competent to judge all of this, but my feeling is that the Unix crowd respects Dave for his competence; they just disagree fundamentally with his design choices. I think this is fair: there is more than one way to skin a cat.



