Ugh, this old troll. Something that took me a while to figure out: when someone says that "nothing interesting is happening any longer in [discipline]", it is virtually always code for "I feel old." Yes, things change and different periods of innovation are (of course) different, but it is wrong to cast value judgement on one era over another. Speaking as an aging technologist, I can say that this is hard to resist -- in particular, it's tempting to view one's late twenties and early thirties as a Golden Age, and everything later as derivative and uninteresting. If it needs to be said: those making this claim are often projecting their own lives onto their discipline, and may be confusing their own personal Golden Age with a broader one that may or may not have existed.
My domain -- operating systems kernel development -- has been particularly vulnerable to this (in part because it has become increasingly specialized), and it is something that I have heard for my entire career. Of course, there has been a ton of innovation in the kernel over the past two decades (and I've been lucky enough to be close to a bunch of it) -- and no one now would seriously go back to the systems of two decades ago. (If anyone disagrees, they should kindly put their money where their mouth is and run only systems software from 1994.)
So yes, systems software is still relevant -- and so are a lot of things that might feel "done." And if you are an aging technologist like me (protected class, baby!) and you feel tempted to tell these youngsters that there is nothing new under the sun, please check yourself: yes, you should educate the rising generation about the problems that have been solved -- but never go so far as to say that there is no room left for new ideas. Aside from the fact that it's a demoralizing thing to say to a younger technologist, history will likely prove you wrong!
Love the work you've done, Bryan, but your response above does nothing to engage the salient points made in the link && elsewhere. Yes, we have better ways of getting performance metrics than truss, circa Solaris 2.5…and thank you!
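(If I understand the newer tooling correctly -- and this one-liner is illustrative, not exhaustive -- a single DTrace invocation can aggregate every system call on a live system by program name, something truss, which stops the victim process at each call, could never do at scale:

    dtrace -n 'syscall:::entry { @[execname] = count(); }'

...so credit where credit is due on the observability front.)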
Sorry, what salient points am I not engaging? The author of the linked piece is decrying what they perceive as a lack of "OS research and innovation" -- and I'm saying that OS innovation remains alive and well, if perhaps more technical and less accessible than it once was. I can of course be more concrete and rattle these things off -- but I feel safe in leaving software systems advances since 2000 as an exercise to the reader...
In the link, you can see a chart showing the hardware capabilities of a high-end workstation, sampled at roughly ten-year intervals. In twenty-five years, hardware capability has changed dramatically, while the kind of software used to get work done has remained mostly static.
Why is that? Would you make the chart differently?
The utah2K paper referred to in the link is admittedly (by the author) a polemic, so readers must understand that the writer may be shading things more negatively than an even-handed account would. Of course, that doesn't make it a trolling attempt, unless you can think of a particular chip the author wanted to knock off of a particular shoulder...
You acknowledge that OS innovation is "perhaps more technical and less accessible than it once was". Why is that? In what manner? I don't think that you and Pike are using "OS research and innovation" to refer to all of the same things at the same time.
Here (from utah2K):
"New operating systems today tend to be just ways of reimplementing Unix. If they have a novel architecture -- and some do -- the first thing to build is the Unix emulation layer.
How can operating systems research be relevant when the resulting operating systems are all indistinguishable?
There was a claim in the late 1970s and early 1980s that Unix had killed operating systems research because no one would try anything else. At the time, I didn't believe it. Today, I grudgingly accept that the claim may be true (Microsoft notwithstanding)."
...I can't see why that isn't as true today as when it was written fourteen years ago. You can say that our Unix systems have improved, but that's a matter of degree, and not of type. We've also made great progress in breeding lean pigs that grow to marketable size in record time...but no matter how fine they are, they are still pigs, and aren't lobsters.
This is a non sequitur: mainstream hardware has likewise only evolved, so why would there need to be a "forget everything you know" revolution in software? Conversely, is hardware research dead?
The biggest change since 2000 is the prevalence of NUMA, and even that was available back then in SGI and Sequent/IBM systems. I don't see why systems software would have to change substantially until something like memristors drastically changes system architecture.
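To make the NUMA point concrete, here's a minimal sketch (assuming Linux with libnuma available; the buffer size and node choice are arbitrary, for illustration only) of the kind of placement decision NUMA pushes onto systems software: memory lives on a particular node, and touching it from a CPU on a remote node pays the interconnect cost.

    /* Minimal sketch: node-local allocation via libnuma (Linux).
     * Build with: cc numa_sketch.c -lnuma */
    #include <numa.h>
    #include <stdio.h>

    int main(void) {
        if (numa_available() < 0) {
            fprintf(stderr, "no NUMA support on this system\n");
            return 1;
        }
        printf("highest NUMA node: %d\n", numa_max_node());

        /* Bind a 1 MB buffer to node 0; a thread running on a CPU in
         * another node will see higher latency touching this memory. */
        size_t len = 1 << 20;
        char *buf = numa_alloc_onnode(len, 0);
        if (buf == NULL)
            return 1;
        buf[0] = 1;   /* first touch actually places the page */
        numa_free(buf, len);
        return 0;
    }

The kernel's scheduler and allocator had to grow exactly this kind of awareness -- evolution of the existing model, not a replacement for it.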
For a while now there has been less emphasis on individual systems and more effort put into distributed systems: first with Hadoop-style clusters, and now Mesos-style clusters that provide OS-like semantics and services over a network of machines.
But there's a lot of good systems research going on today. Some of it is additive, some subtractive, but it exists and is important.