1. Starting a few years before 2000 and continuing today, the software industry is quite profitable, pays well, and has lots of openings, while the academic job market in systems research continues to pay poorly and offers far fewer openings. So if you want to do systems software research while also having an enjoyable quality of life, you might as well go to a company and get paid well instead of spending your days writing a thesis and grant proposals.
2. Computer science is a field where the cost of basic research equipment is low (a computer), but the more interesting research environments are generally beyond the scale of academia (tens of thousands of hardware nodes, hundreds or thousands or more QPS of production load, etc.). That makes it quite different from, say, biology or high-energy physics on one end, where you usually need to be in academia to get access to the equipment, or mathematics (including theoretical CS) and literature on the other, where it doesn't matter where you are; in systems research you only get access to the equipment by being in industry.
That doesn't mean that systems software research, done in industry, is (or was or will be) irrelevant; it means that the narrower definition of "research" as "that which is done in academia" is inaccurate (including industry with the trappings of academia, i.e., people at Google or Bell Labs writing papers in academic journals and hiring people with Ph.D.s). Systems software research happens in industry and is quite relevant to it.
Commercial research needs to keep in mind the existing legacy systems used by the sponsor. Innovations become more evolutionary than revolutionary as the field matures. They may be more tailored to the observable pain points of the research sponsor. They may not be widely shared if they yield results providing a competitive advantage. While it may not demand immediate returns, commercial research does have an axe to grind. All of this hampers advancement in the field of computer science in general.
I also don't know if there's any kind of commercial research on the scale of Xerox PARC or Bell Labs. I can't think of any off the top of my head. Microsoft and Google do some pretty neat research, but I don't think they've shipped anything on quite the same scale.
There's really no organization hiring the best talent to work on the kind of black swan events commercial research may miss. For example, I think it'd be cool to have a microcode-based OS; I've heard it would help with keeping operating systems secure. But who would fund it, and who would work on it? Right now it doesn't look like anybody would, and that might be what Rob is concerned about.
- Linux's read-copy-update synchronization mechanism. It has been described in papers, but you're better off following mailing list posts or LWN writeups.
- Rust's borrow checker and lifetime system. It's built on existing well-known ideas (e.g. affine types), and there has since been some academic work on formalizing it, but the specific system Rust uses has no direct precedent and was developed outside academia. (Note that Rust came out of Mozilla Research, which is far, far smaller than Bell Labs but also an organization that intentionally works on revolutionary rather than evolutionary improvements.)
- libdill and Trio's structured concurrency, a solid theoretical framework for handling async/await-shaped problems without turning your execution into concurrent spaghetti. The techniques are not unprecedented, but https://vorpus.org/blog/notes-on-structured-concurrency-or-g... frames them better. (A minimal Trio-style sketch follows this list.)
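To make the "no concurrent spaghetti" point concrete, here is a minimal, illustrative sketch of the nursery pattern Trio uses. The task names and delays are invented; it assumes Trio is installed (pip install trio):

    # Minimal structured-concurrency sketch using Trio (illustrative only).
    import trio

    async def work(name, delay):
        # Stand-in for real work; trio.sleep yields to the scheduler.
        await trio.sleep(delay)
        print(name, "done")

    async def main():
        # The nursery owns every task started inside it: the async-with block
        # does not exit until all children have finished, and if one raises,
        # the others are cancelled. No task can outlive its scope, which is
        # the core "structured" guarantee.
        async with trio.open_nursery() as nursery:
            nursery.start_soon(work, "a", 0.1)
            nursery.start_soon(work, "b", 0.2)
        print("all children finished before this line runs")

    trio.run(main)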
I think the real impairment to OS research is deployment. If your idea isn't compatible with one of the existing OSs, in such a way that it can run a web browser, then nobody's going to use it. Heck, even Windows Phone couldn't get adoption. OS ideas that require people to completely rewrite applications and interaction paradigms are non-starters no matter what benefits they offer - unless they can fulfil a need that can't be fulfilled any other way. So quite a lot of work goes into bypassing the OS entirely for hardware-specific single-program networking applications, and everyone else sticks with their existing paradigms.
Even a totally plain-looking device could be full of innovative research: a network router which uses a completely new kernel. A new network protocol or a compression algorithm. A new programming language. Automatic verification and/or fuzzing tools. A network of internet-of-things devices which share no code with any of the existing OSs.
True. Although all those kinds of devices tend to prefer "free" over "innovative", and to keep the OS layer as thin as possible.
> A new network protocol or a compression algorithm. A new programming language. Automatic verification and/or fuzzing tools.
To me those aren't really systems software, but that may be a matter of opinion?
Systems software research has come a long way since 2000.
If that line of thought were applied consistently, it would credit Babbage, or maybe Turing, as the last computer scientist to do something useful.
Granted - the tooling improved.
And the number one thing that could have gotten better in the last 19 years but didn't: security.
It's almost like human society is trying really hard to keep developers busy.
That is because the majority of people fundamentally do the same things with computers as they did 20 years ago: browse the web, edit pictures and videos, put together presentations, document layout, spreadsheets, etc.
Of course now your home videos are in 4K instead of 320p, and webpages are 10MB of JS instead of 10k of text... but these are changes in scale, not in kind.
However, shiny features are what get people attracted to your platform, so we get shininess (never mind if functionality actually gets lost in the process).
The perfect illustration of this for me is George RR Martin, a professional writer of indisputable success, doing all of his writing work on a 1980s workstation with WordStar 4.
In 2000, people mostly still used Windows 9x. A single-user system with no sandboxing and no built-in firewall.
- NTFS 3.0 with file encryption support
- Logical disk management for dynamic disks and expansion of a logical partition over multiple physical disks, without a reformat
- Distributed file systems and hierarchical storage management
- MMC with group policy control, Active Directory, a centralized event viewer for OS and application events, and system service management
- Speaking of which, system services were a thing that actually existed and were managed (the systemd fighting still continues, so Linux still hasn't "caught up" on this)
- Plug & Play ACPI support (technically Windows 98 was the first to support this, but it was so broken it was a joke - Linux lagged by a few years and didn't really support it until 2.6)
- User-mode print drivers
- Time service with SNTP support
This is an astonishing claim: what makes you think it hasn't gotten better? It's gotten a LOT better since 2000.
While better than Windows 9x, Windows 2000 was also horrendous with regard to security. That was the era when Windows saw so many exploits and worms, and its security practices were so lax that the firewall started a few seconds after the network interfaces when booting; if you were connected to the internet without a separate firewall at boot (fairly common at that time), it was likely you would be infected by a worm in those few seconds of unprotected networking.
Anyone around at that time will remember the rampant worms infecting large swaths of internet-connected Windows machines. Code Red. Sasser. Blaster. Slammer/Sapphire.
Google Docs (and the subsequent migration of MS Office to web accessible forms) didn't come until even later.
Microsoft's web apps are a grim reminder of how desktop UIs have evolved backwards. (I’m in the midst of evaluating Office 365 as part of some IT transitions at work.) It's missing tons of features even compared to Word 2000. And it's a total pig. I thought Office was a pig before, but moving it to the Web made everything 10x worse. (Google Docs is less of a pig, but that seems to be because it has less functionality than Gobe Productive on BeOS.)
I'll concede that Google Maps is better than what was available in 2010. I bet it would be even better if Google turned it into a Win32 desktop app.
I'd love to go back to the irreverent hacker spirit of the '90s.
Those two are very different claims IMO. Who cares what the basic security models are if you are significantly more difficult to attack?
We can debate whether these were "innovative" or not, but the fact is that in 2000 none of these things existed outside of research, if there at all: ASLR, stack canaries, RETGUARD, pledge, jails, seccomp, fuzzing, ASan/KASan/HWASan (tagged mem), NX, signed bootloaders/secure enclaves. IMO, iOS took huge steps to isolate the different user applications from one another.
EDIT: I deleted a reference to SELinux. It was introduced only a handful of days before Jan 1, 2001 ;)
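If you want to see one of these mitigations on a running machine, here is a small Linux-only sketch (illustrative, nothing more) that reads the kernel's ASLR setting from the standard sysctl file:

    # Illustrative: report the Linux ASLR mode. 0 = disabled,
    # 1 = conservative randomization, 2 = full randomization
    # (the default on modern kernels). Linux-specific path.
    from pathlib import Path

    def aslr_mode():
        raw = Path("/proc/sys/kernel/randomize_va_space").read_text().strip()
        return {"0": "disabled", "1": "conservative", "2": "full"}.get(raw, raw)

    if __name__ == "__main__":
        print("ASLR:", aslr_mode())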
(amount of data to protect * number of systems that store or handle data * level of risk) - mitigation
...you'll probably agree that the mitigation mechanisms improved 100x, but the risk grew even more.
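To make that back-of-the-envelope point concrete, a tiny sketch with invented magnitudes (every number below is hypothetical, purely to show how the factors compound):

    # Hypothetical numbers only: even a 100x better mitigation can be swamped
    # when the amount of data and the number of exposed systems grow faster.
    def exposure(data, systems, risk, mitigation):
        return data * systems * risk - mitigation

    then = exposure(data=1, systems=1, risk=1, mitigation=0.5)        # 0.5
    now = exposure(data=1000, systems=100, risk=1, mitigation=50.0)   # 99950.0
    print(then, now)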
How can operating systems research be relevant when the resulting operating systems are all indistinguishable?
Linux is the hot new thing... but it's just another Unix.
Although they are rooted in FP notions of purity and immutability, I would say that NixOS and Guix try to fundamentally change operating systems.
Does "started in academia" not count? Because that'd give you easy counter-examples, e.g. Scala, Spark.
* "development" as in making a technology usable, not software development
If you look at networking, there has recently been a move towards new protocols (QUIC) that resulted from systems research into the deficiencies of TCP. Another area is consensus algorithms: we now have large-scale real-life deployments of consensus algorithms, for example Spanner and etcd.
The late 90’s and early 2000’s were a weird time when hardware was improving so fast, and taking software along for a free ride, that a lot of software was good enough. Now, as we bump up against the end of Moore’s Law, we will be seeing more research and real-life usage of multicore and heterogeneous computing, and of libraries, languages, and operating systems that try to make that easier.
Would you say that wasn’t the case during the past 20 years (2000-2019)? Or do you consider all that period to be “early 2000’s”?
Granted, whilst it is system-level it is not system software. And it has not yielded demos that people have regarded as cool, rather ones that have been received by some as horrifyingly worrying.
But it has definitely influenced industry.
OS X was modern for its time, but where they’ve really pushed the envelope is with iOS. They can simply move faster at scale than anyone else because they almost entirely own the IP for both the software and all major hardware components and can pivot on a dime compared to market-based coordination.
There was almost nothing innovative about OS X, even when it came out. It was just packaged and marketed very well. Objective-C and NeXTSTEP were a userland improvement over typical C userlands, but that's not saying much.
> OS X was modern for its time
It really wasn't. The Mach "microkernel" was from outdated 80s research. It's bloated, slow and inflexible compared to the state of the art at the time.
iOS was innovative at a UI/UX level, definitely. But I can't really think of anything they did at a systems level that was at all innovative?
Oh look, a paper: "For example, a recent case study found C with MPI is 4.6–10.2× faster than Spark on large matrix factorizations on an HPC cluster with 100 compute nodes"
Does it sound like large data analytics would have horribly stagnated?
I'm amazed at how many comments revolve around "But wait, of course systems research has evolved, see XX and YYY", followed by responses along the lines of "Nah, he was not talking about XX and YYY, rather ZZZ, etc..."
I hate being the "please define xxx" guy, but is there a consensus definition of what "systems software" is?