Systems software research is irrelevant (uwaterloo.ca)
27 points by badri on July 13, 2008 | 11 comments



Here is the highest compliment I can give any computing science paper: it reminds me of some of Edsger Dijkstra's writing.

Here are some of the particular lines that catch my attention each time I read this (I end up seeing this presentation every year or so, through one link or forum or another):

"When did you last see an exciting non-commercial demo?" When did you last see an exciting commercial one ? It's all crap, not just the academic side. I quit the ACM because I got tired of reading "X 3.0 is the new X 2.0" articles. Remember back in the early days of shareware, when you could write a DOS program that did one thing, like a mortgage calculator, and it would get passed around all the BBS's and become famous for a couple of weeks ? Most current programing is just that, but they are re-doing the one-off fadish programs for the newer platforms, web and smart phones and etc.

"Instead, we see a thriving software industry that largely ignores research, and a research community that writes papers rather than software." I'm not sure this is that special to computing science. I think educational and research institutions quit mattering so much several decades ago.

"Linux's cleverness is not in the software, but in the development model, hardly a triumph of academic CS (especially software engineering) by any measure." I think the development model is very important, as important as the assembly line and interchangeable parts is to automotive engineering. One can design the best car in the world, but if it can't be cranked out at the rate of one every 10 seconds it will never matter in the world.

"invention has been replaced by observation" This applies to a lot of academic stuff theses days. At certain conferences, there are papers presented that are nothing more than graphs of the number of papers in that field, a kind of pitch to convince people a particular field is the new hot thing. I suppose they include their paper-about-papers in the next set of numbers for the next paper-about-papers.


"Linux may fall into the Macintosh trap: smug isolation leading to (near) obsolescence."

Heh. Predictions of the future are always amusing... especially when you live in that predicted future :)

(I hear nobody is using that old "Linux" thing these days. And Macs; they're totally dead.)


That's not entirely fair; Apple's comeback has been almost miraculous. I wouldn't bet on most companies making that kind of recovery.


I don't think Apple's comeback (the OS X-based one) was miraculous. They took a basic Unix that the Plan 9 guys would have turned their noses up at, put a reasonable user interface on it, and worked their way through all the tedious parts of making a stable, useful operating system; and then it did exactly what the Linux zealots had been saying was possible with Linux -- it became very popular, and known for being more stable, easier to use, and more secure than Windows.

The miracle is that Windows survived. Apple's focus on hardware lock-in and the high end of the market left Windows a big hole to live in; Linux should have copied the OS X "look and feel" down to every detail, made it run even faster on cheaper hardware, and given Windows a run for its money.

As one of those Linux zealots, I intend this to be self-criticism.

Ubuntu is going to eventually do that, I think. It is not comparable to OS X yet, but it is improving.


Shouldn't be surprising; systems is a mature subfield at this point. There's still useful work happening, for sure, but it won't be as sexy or as (regularly) revolutionary as it used to be. The discoveries with the biggest impact in any field happen early on, even if later research is equally novel or intellectually rewarding.


Mathematics started maybe 2500 years ago. Calculus wasn't discovered until less than 400 years ago. The crisis in mathematics spawned by non-Euclidean geometry didn't happen until less than 200 years ago. Symbolic logic only arrived in the last century or two.

Biology is presumably older. No need to point out what we've done in the last century.

Herbal medicine goes back much further than we have records. But penicillin (from molds) wasn't discovered until 1928.

Ironworking goes back 3000 years. The Bessemer process was developed less than 200 years ago.

Archery and animal husbandry go back to the Stone Age. But less than 1000 years ago, new developments in horseback archery were crucial to Genghis Khan's conquest of most of Eurasia.

Plant breeders have been studying plant heredity and breeding more productive cultivars since the Stone Age. Then 50-some years ago we discovered the structure of DNA, and now we have a remarkable variety of projects based on it.

A "mature subfield" is just one that had its last big development a while ago. In the case of "systems," that development in 2000 had been perhaps minicomputers or C. Pike was right in 2000 but his polemic is not applicable to 2009.

Look what's happened since 2000. MapReduce. Eventual consistency all over the place. Software transactional memory (even if that turns out not to pan out, there are lots of people using ZODB already). Google App Engine. Virtualization (VMware has 3000 employees) and paravirtualization (XenSource's work is based on the RTLinux approach pioneered at New Mexico Tech), which incidentally involve new operating system kernels. Google Native Client. Widespread use of mobile code in AJAX apps.

Haskell and OCaml are becoming usable and have been used in a few widespread apps. The inner loops of games are written in Cg or HLSL. OS X introduced a new way to structure Unix graphics (which is now being used by Compiz). JIT compilation has gone mainstream in the JVM, the CLR, and Chrome's V8. We build our GUIs in Flex or DHTML instead of GDI or Quartz or even X. The hot new software is all written in Ruby or new dialects of Lisp. The majority of internet traffic is BitTorrent, which didn't exist in 2000. People develop software in refactoring IDEs. We store our files in decentralized source-control systems that use self-certifying names, like Git and Mercurial. Sun's hottest new products are a filesystem and a profiler. Firefox is having XPCOM ripped out of it largely by means of automated transformations of C++ code.

Transmeta failed, but you can hardly claim they weren't applying systems software research in industry. TCP/IP now does service autodiscovery the way AppleTalk used to, so groups of people with laptops collaboratively take notes in real time with SubEthaEdit. Lightroom was largely written in Lua, which came from a university in Brazil; lots of games are scripted in it. Lots of other apps are written largely in Python, which came from a research institute in Holland and then CNRI. C and C++ are much less dominant than they were in 2000. OS X runs on CMU's Mach.
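To make the first of those concrete: the MapReduce programming model is just a user-supplied map function and reduce function, with the framework doing the grouping in between. Here's a toy, single-process word count in Python; the names are made up for illustration, and the real system of course shards the map and reduce phases across thousands of machines:

    from itertools import groupby
    from operator import itemgetter

    def map_fn(doc):
        # Emit a (word, 1) pair for every word in the document.
        for word in doc.split():
            yield (word.lower(), 1)

    def reduce_fn(word, counts):
        # Combine all the counts emitted for the same word.
        return (word, sum(counts))

    def mapreduce(inputs, map_fn, reduce_fn):
        # Map phase: run map_fn over every input record.
        pairs = [pair for doc in inputs for pair in map_fn(doc)]
        # Shuffle phase: group the intermediate pairs by key.
        pairs.sort(key=itemgetter(0))
        groups = groupby(pairs, key=itemgetter(0))
        # Reduce phase: run reduce_fn once per distinct key.
        return [reduce_fn(key, [v for _, v in kvs]) for key, kvs in groups]

    print(mapreduce(["the quick brown fox", "the lazy dog", "the fox"],
                    map_fn, reduce_fn))
    # [('brown', 1), ('dog', 1), ('fox', 2), ('lazy', 1), ('quick', 1), ('the', 3)]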

Today, systems software --- operating systems, networking, languages --- is an extremely active field, with lots and lots of research being done that's highly relevant to and influential in industry. Most of the exciting demos I see these days are noncommercial or come from corporate R&D labs. There's still ossified and insular research, but there's also a huge explosion of new and innovative research. Instead of a high-end workstation with Unix, X, Emacs, TCP/IP, Netscape, C, C++, Java, and Perl, now we have Unix (often in the form of MacOS or Linux+GNOME or KDE), X (with Compiz), Emacs (revitalized!) or Eclipse or SharpDevelop; Firefox with Firebug, Flash, Canvas, AdBlock Plus, and GreaseMonkey instead of Netscape 4, or just as likely some WebKit browser; TCP/IP but with zeroconf, IPv6, and 802.11; C#, Java, Python, Ruby, and above all JavaScript.


Speaking of Microsoft, C# and VB have picked up a host of features from academic languages. This eight-year-old essay is obsolete.

-systems grad student (fair disclaimer)


As much as I love the new features, it only took them three years to add them, 20-30 years after those features were first developed. How innovative!

You can't evaluate LINQ statements in the debugger, because that would require incremental compilation, which Microsoft's stack is nowhere close to supporting right now.
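For the record, the "features from academic languages" in question are mostly old functional-programming staples: lambdas, closures, type inference, query comprehensions. A LINQ query desugars into a chain of higher-order function calls, so it has a rough Python analogue (the list and the query below are made up for illustration, and obviously this isn't C#):

    nums = [5, 2, 8, 3, 6]

    # Roughly the analogue of the C# query:
    #   from n in nums where n % 2 == 0 orderby n descending select n * n
    evens_squared = [n * n for n in sorted(nums, reverse=True) if n % 2 == 0]
    print(evens_squared)  # [64, 36, 4]

    # The same query as an explicit filter/map pipeline, the shape these
    # ideas had in functional languages decades before C# 3.0 shipped them:
    pipeline = map(lambda n: n * n,
                   filter(lambda n: n % 2 == 0,
                          sorted(nums, reverse=True)))
    print(list(pipeline))  # [64, 36, 4]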


Check out F#. Many new C# features are trialled there first.


Are you kidding me? I think it is more important than ever.


Here's some interesting systems software research:

GIGA+: Scalable Directories for Shared File Systems, http://www.youtube.com/watch?v=2N36SE2T48Q




