Still too much Linux, though. Things running in containers or in VMs need far less of an OS, and a lot less OS state.
> System software became about managing large numbers of machines.
Yes, but is academic research on these topics relevant? Most of the progress on infrastructure software has come from internet companies' attempts to cope with their big data problems. The people who implemented these systems all received their PhDs from academic CS institutions, and they used bits and pieces of existing distributed systems research, but they did their innovative work in the context of commercial companies, not academic CS departments.
> Good new languages were developed.
And what are they? A quick look at https://tiobe.com/tiobe-index/ (for lack of a better resource) corroborates the thesis that the bulk of software is developed in the same old boring languages. Yes, they have evolved, but not much. Golang is, well, not exactly innovative. Rust still has a long way to go before it can be called a mainstream language.
> Things running in containers or in VMs need far less of an OS, and a lot less OS state.
Unikernels exist, but is there a compelling non-niche use case for them? Docker acquired Unikernel Systems, and this is all we've gotten so far: https://blog.docker.com/2016/05/docker-unikernels-open-sourc...
These are all great examples confirming the thesis that producing good relevant systems research is hard. On the one hand we are piling up gross inefficiencies on top of decades old technology, so improving the current state of affairs should be easy, right? On the other, software systems are inherently very open systems with a lot of stakeholders, so doing any kind of successful "clean-slate redesign" is almost unthinkable.
First, the title of the talk is "Systems Software Research is Irrelevant", not "Systems Building in Academic Departments is Irrelevant" — e.g., the talk discusses Plan 9. The distinction the talk makes, I think, is between commercial products and R&D, not between industry and academia.
In that sense, I think the stuff out of Google (MapReduce up through TensorFlow) is a good example of that trend reversing.
Second, industry R&D has almost always led the charge on developing large systems in CS. That's not new.
Third, as you noted, "[many of] the people who implemented [and, more importantly, led the design of] these systems have all received their PhDs from academic CS institutions and they used bits and pieces of existing distributed systems research". One role of academic CS research is to build foundational ideas and then crank out competent researchers who are able to build real systems/algorithms/companies on top of those ideas. Just because TensorFlow was developed at Google rather than UW doesn't mean that academic research is now irrelevant.
> And what are they?
There's a lot of OCaml, Haskell, and Scala code in the world. C# and the entire .NET family is a veritable treasure trove of academic PL ideas making it into production languages.
> One role of CS academic research is to build foundational ideas and then crank out competent researchers who are able to build real systems/algorithms/companies on top of those ideas.
The educational role is very important, but do PhD graduates build on top of their research? Instead, it seems, they learn the foundations (which were "research" a few decades ago), do a PhD project on some very specialized and inconsequential thing, and then go off to do real stuff at commercial companies.
> Working on obscure, niche stuff is how you actually contribute to science.
That's what they tell you (my favorite example, BTW, is conic sections: obscure and niche for millennia before it became known that they describe the orbits of planets). But is the deluge of highly specialized, obscure papers really the consequence of the free play of scientists' minds pursuing their own interests? Or is it more a consequence of the sheer number of PhD candidates and postdocs who each need to achieve sufficiently novel results on a reasonable time scale with a fairly certain chance of success (objectives that are obviously in conflict)?
Of course ground-breaking research can grow out of niche results. But for that growth to happen, someone has to build upon and improve these results. It is difficult to build upon a research prototype that works barely well enough to register a minimal improvement on some metric and is thrown out afterwards.
I'm also not from a biology background, but I remember from doing philosophy that it was much more useful to find a paper that made some trivial advance in a very specialized area you were interested in than ten 'general' papers. I mean, if you're building something, and somebody has written a paper that addresses a part of your domain, it's gold dust — even if it's generally too niche for anybody to bother reading, even if it's substandard work.
You're right that there are some perverse incentives in the hothouse production of PhD theses, but I think in general people should embrace the triviality and irrelevance of scientific work. Alchemy set out to answer the big questions — eternal life, gold from lead — and ended up answering nothing. Science set out to answer questions like: how are colors in flower petals passed through generations? If you look at the history of 'big questions', it's way less illustrious than that of small, boring ones.
But then I thought, why not? It seems kind of antiquated for a server OS to worry about actual hardware in this day and age. Funny how things turn out.
I hear lots of anecdotes from colleagues about academic research conferences in certain areas like "big data" where people are excitedly talking about "new" things that industry did years ago, but didn't publish in an academic journal. And then there's the people claiming they're doing "big data" because there's slightly over a million rows in their MySQL database.
Yes, the industry does pump a lot of money into research, but the people doing the real work are mostly doing an MSc or PhD at university XYZ.
So it goes both ways.
Just to cite the two areas that are my main hobby when doing computing related stuff outside boring enterprise work.
As for OSes, the universities are the only place left, it seems, as all major vendors are very careful with anything that doesn't bring short-term profits, e.g. Midori.
The case is even worse when your OS is supposed to run on different hardware. I think this is the main reason that Apple, in general, has a much better UX: They only have to support a limited set of hardware.
Casey Muratori has a great rant about this: https://www.youtube.com/watch?v=kZRE7HIO3vk
People assume this to be the case without actually analyzing what hardware actually needs to be supported by operating systems these days. For internal components, most of the stuff in a PC is pretty standardized: AHCI and NVMe for storage, EHCI and xHCI for USB, HD Audio for sound (and it's almost always a Realtek codec). Apple doesn't have any significant advantage in any of those areas. They still have to support two of the three GPU vendors and the top two or three NIC vendors, and maintain a driver infrastructure that supports loading third-party drivers for any of the above when there are exceptions. (Sure, the expandable Mac Pro may be long-dead, but Thunderbolt docks and enclosures enable the same variety of components.)
Apple's real advantage seems to be that they don't have to try to work around anybody else's broken motherboard firmware. Microsoft could take the same stance and declare it to be Lenovo's problem if your Thinkpad's power management is broken. Microsoft probably should have changed strategy there when the industry switched to UEFI.
I’m sure on paper Linux works with all my devices. In practice it does not.
That’s the gap Apple closes with their hardware.
I think the lack of new operating systems since the late 90s can be (partially) attributed to the simple fact that the modern computing world is so fully developed: we have computers for everything and enjoy all the state-of-the-art features and functionality. You already have, well, all the things you have on your personal computer. Desktop apps are getting crappier and crappier as always (which is another story), but some parts of the underlying infrastructure are created and understood only by the best developers in the world (high-performance TCP stacks, 802.11n, multimedia, 3D graphics, TLSv1.3, PCI-e, USB 3.0, virtualization, exploit mitigation, web browser engines, C10K-free web servers). By creating a new system, you lose everything immediately and render the computer less useful, making it less attractive for hackers and academic/industrial researchers alike.
Back in the early days, not too many things were being done on computers, and you had fewer things to lose.
Want to play Space War on PDP-7, but don't even have an OS? Then Unix!
Want to do some real work on 80s home computers? BASIC! Micro-Soft!
When Linus purchased his computer, he found that it only ran DOS and was effectively useless for him. So he ordered a copy of Minix, meanwhile playing a DOS game to kill time. After Minix arrived he eventually got a usable programming environment, was able to get online to Usenet with a modem, and also got the Minix filesystem, which was suffering from performance issues at the time.
This is everything he had for his computing.
Then he decided to create his own operating system. He wrote the kernel, ported glibc, bash, and gcc, and got the system to boot; later he added modem support. Now he could do everything he had been able to do on Minix. That's it! Naturally, a few years later, it became a community project and gained huge momentum.
Perhaps the early 90s was the last era when hackers could "just" create an operating system in a tinkerer's way.
On the other hand, new OSes are still constantly being created on platforms that don't carry the burdens of workstations/servers; we still see many hobbyist/research/practical systems on microcontrollers every year. But as those platforms become more and more powerful, they too end up running Linux, and the story ends there...
I think this is also one reason that retrocomputing is getting more and more popular: it brings back the "playground" nature of early computer systems, unlike the "factory" nature of modern ones.
The problem is the effort required to build something even half as usable as an existing free OS, so most just give up.
"Nobody uses them / they don't work / where are they?" were the answers, all of which have good replies if you google for 5 minutes.
Hard to discuss with someone who considers what they know to be everything that exists out there. It's a myopic view of the present and the future.
IMHO, this happens when one has more opinion than knowledge. Pike, unfortunately, seems to suffer from a similar problem.
> Be courageous. Try different things; experiment. Try to give a cool demo.

> Measure success by ideas, not just papers and money. Make the industry want your work.