Here are some of the particular lines that catch my attention each time I read this (I end up seeing this presentation every year or so, through some link or forum or another):
"When did you last see an exciting non-commercial demo?" When did you last see an exciting commercial one ? It's all crap, not just the academic side. I quit the ACM because I got tired of reading "X 3.0 is the new X 2.0" articles. Remember back in the early days of shareware, when you could write a DOS program that did one thing, like a mortgage calculator, and it would get passed around all the BBS's and become famous for a couple of weeks ? Most current programing is just that, but they are re-doing the one-off fadish programs for the newer platforms, web and smart phones and etc.
"Instead, we see a thriving software industry that largely ignores research, and a research community that writes papers rather than software." I'm not sure this is that special to computing science. I think educational and research institutions quit mattering so much several decades ago.
"Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure." I think the development model is very important, as important as the assembly line and interchangeable parts is to automotive engineering. One can design the best car in the world, but if it can't be cranked out at the rate of one every 10 seconds it will never matter in the world.
"invention has been replaced by observation" This applies to a lot of academic stuff theses days. At certain conferences, there are papers presented that are nothing more than graphs of the number of papers in that field, a kind of pitch to convince people a particular field is the new hot thing. I suppose they include their paper-about-papers in the next set of numbers for the next paper-about-papers.
Heh. Predictions of the future are always amusing... especially when you live in that predicted future :)
(I hear nobody is using that old "Linux" thing these days. And Macs; they're totally dead.)
The miracle is that Windows survived. Given that Apple had a focus on hardware lock-in and the high end of the market, that left Windows a big hole to live in; Linux should have copied the OS-X "look and feel" down to every detail, made it run even faster on cheaper hardware, and given Windows a run for its money.
As one of those Linux zealots, I intend this as self-criticism.
Ubuntu is going to eventually do that, I think. It is not OS-X comparable yet, but it is improving.
Biology is presumably older. No need to point out what we've done in the last century.
Herbal medicine goes back much further than we have records. But penicillin (from molds) wasn't discovered until the 1930s.
Ironworking goes back 3000 years. The Bessemer process was developed less than 200 years ago.
Archery and animal husbandry go back to the Stone Age. But less than 1000 years ago, new developments in horseback archery were crucial to Genghis Khan's conquest of most of Eurasia.
Plant breeders have been studying plant heredity and breeding more productive cultivars since the Stone Age. Then 50-some years ago we discovered the structure of DNA, and now we have a remarkable variety of projects based on it.
A "mature subfield" is just one that had its last big development a while ago. In the case of "systems," that development in 2000 had been perhaps minicomputers or C. Pike was right in 2000 but his polemic is not applicable to 2009.
Look what's happened since 2000. MapReduce. Eventual consistency all over the place. Software transactional memory (even if that turns out not to pan out, there are lots of people using ZODB already). Google App Engine. Virtualization (VMware has 3000 employees) and paravirtualization (XenSource's work is based on the RTLinux approach pioneered at New Mexico Tech), which incidentally involve new operating system kernels. Google Native Client. Widespread use of mobile code in AJAX apps. Haskell and OCaml are becoming usable and have been used in a few widespread apps. The inner loops of games are written in Cg or HLSL. OS X introduced a new way to structure Unix graphics (which is now being used by Compiz). JIT compilation has gone mainstream in the JVM, the CLR, and Chrome's V8. We build our GUIs in Flex or DHTML instead of GDI or Quartz or even X. The hot new software is all written in Ruby or new dialects of Lisp. The majority of internet traffic is BitTorrent, which didn't exist in 2000. People develop software in refactoring IDEs. We store our files in decentralized source-control systems that use self-certifying names, like Git and Mercurial. Sun's hottest new products are a filesystem and a profiler. Firefox is having XPCOM ripped out of it largely by means of automated transformations of C++ code. Transmeta failed, but you can hardly claim they weren't applying systems software research in industry. TCP/IP now does service autodiscovery the way AppleTalk used to, so groups of people with laptops collaboratively take notes in real time with SubEthaEdit. Lightroom was largely written in Lua, which came from a university in Brazil; lots of games are scripted in it too. Lots of other apps are written largely in Python, which came from a university in Holland and then CNRI. C and C++ are much less dominant than they were in 2000. OS X runs on CMU's Mach.
-systems grad student (fair disclaimer)
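To make the first item in that list concrete: the whole point of MapReduce is that the programmer writes only a map function and a reduce function, and the framework handles grouping and distribution. A toy single-process sketch of the classic word-count example (the function names and the in-memory "shuffle" here are mine, standing in for the real distributed machinery):

```python
# Toy MapReduce-style word count: map emits (key, value) pairs,
# a shuffle groups them by key, reduce aggregates each group.
from collections import defaultdict

def map_phase(doc):
    # Emit (word, 1) for every word in the document.
    for word in doc.split():
        yield (word.lower(), 1)

def reduce_phase(word, counts):
    # Sum all the partial counts for one key.
    return (word, sum(counts))

def run(documents):
    # Shuffle: group intermediate pairs by key.
    groups = defaultdict(list)
    for doc in documents:
        for key, value in map_phase(doc):
            groups[key].append(value)
    # Reduce each group independently (done in parallel in a real system).
    return dict(reduce_phase(k, v) for k, v in groups.items())

print(run(["the cat sat", "the dog sat"]))
# {'the': 2, 'cat': 1, 'sat': 2, 'dog': 1}
```

The research contribution isn't the word count, of course; it's that restricting programs to this shape makes them automatically parallelizable and restartable across thousands of machines.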
You can't evaluate LINQ statements in the debugger, because that would require incremental compilation, which Microsoft's stack is nowhere close to supporting right now.
GIGA+: Scalable Directories for Shared File Systems, http://www.youtube.com/watch?v=2N36SE2T48Q