
Systems Software Research is Irrelevant (2000) - Philipp__
http://doc.cat-v.org/bell_labs/utah2000/utah2000.html
======
dang
Recent discussions:
[https://news.ycombinator.com/item?id=19310845](https://news.ycombinator.com/item?id=19310845),
[https://news.ycombinator.com/item?id=18207317](https://news.ycombinator.com/item?id=18207317)

2013:
[https://news.ycombinator.com/item?id=6818231](https://news.ycombinator.com/item?id=6818231)

2009:
[https://news.ycombinator.com/item?id=686840](https://news.ycombinator.com/item?id=686840)

2008:
[https://news.ycombinator.com/item?id=245227](https://news.ycombinator.com/item?id=245227)

------
geofft
"What happened" was two-fold:

1. Starting especially a few years before 2000 but continuing today, the
software industry is quite profitable, pays well, and has lots of openings,
while the academic job market in systems research continues to pay poorly
and has far more limited openings. So if you want to do systems software
research while also having an enjoyable quality of life, you might as well go
to a company and get paid well instead of spending your days writing a thesis
and grant proposals.

2. Computer science is a field where the cost of basic research equipment is
low (a computer), but the more interesting research environments are generally
beyond the scale of academia (tens of thousands of hardware nodes, hundreds or
thousands or more QPS of production load, etc.). That makes it quite different
from, e.g., biology or high-energy physics on one end, where you usually need
to be in academia to get access to the equipment, and from, e.g., mathematics
(including theoretical CS) and literature on the other, where it doesn't
matter where you are; in systems research you only get access to the equipment
by being in industry.

That doesn't mean that systems software research, done in industry, is (or was
or will be) irrelevant; it means that the narrower definition of "research" as
"that which is done in academia" is inaccurate (including industry with the
trappings of academia, i.e., people at Google or Bell Labs writing papers in
academic journals and hiring people with Ph.D.s). Systems software research
happens in industry and is quite relevant to it.

~~~
yingw787
Commercial research is meaningfully different from academic research, and I
think that difference is what Rob may be referring to.

Commercial research needs to keep in mind the existing legacy systems used by
the sponsor. Innovations are more evolutionary instead of revolutionary as the
field matures. They may be more tailored to observable pain points of the
research sponsor. They may not be widely shared if they yield results
providing a competitive advantage. While it may not demand immediate returns,
commercial research does have an axe to grind. All of this hampers advancement
in the field of computer science in general.

I also don't know if there's any kind of commercial research on the scale of
Xerox PARC or Bell Labs. I can't think of any off the top of my head.
Microsoft and Google do some pretty neat research, but I don't think they've
shipped anything quite on a similar scale.

There's really no organization hiring the best talent to work on the kind of
black swan events commercial research may miss. For example, I think it'd be
cool to have a microcode-based OS; I've heard it would help with keeping
operating systems secure. But who would fund it, and who would work on it?
Right now it doesn't look like anybody would, and that might be what Rob is
concerned about.

~~~
geofft
Some specific innovations off the top of my head that are pretty firmly
outside traditional academic research, and seem more revolutionary than
evolutionary (rough sketches of each follow the list):

- Linux's read-copy-update (RCU) synchronization mechanism. It has been
described in papers, but you're better off following mailing list posts or
LWN writeups.

- Rust's borrow checker and lifetime system. It's built on existing
well-known ideas (e.g. affine types) and there's since been some academic work
on formalizing it, but the specific system Rust uses has no direct precedent,
is pretty novel, and was developed outside academia. (Note that Rust came out
of Mozilla Research, which is far, far smaller than Bell Labs but also an
organization that intentionally works on revolutionary and not evolutionary
improvements.)

- libdill and Trio's structured concurrency, a solid theoretical framework for
handling async/await-shaped problems without turning your execution into
concurrent spaghetti. The techniques are not unprecedented, but
[https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/](https://vorpus.org/blog/notes-on-structured-concurrency-or-go-statement-considered-harmful/)
is a better framing of it.
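
To make the first item concrete, here's a tiny Rust sketch of the
read-copy-update *pattern* (an illustrative toy, not Linux's implementation):
readers follow a shared pointer with no locking, while a writer publishes a
modified copy with a single atomic swap. The hard part of real RCU, the
grace-period machinery that decides when the old copy is safe to free, is
elided here (the sketch just leaks it), and the `Config` type is invented for
the example:

```rust
use std::sync::atomic::{AtomicPtr, Ordering};

// Hypothetical shared state that readers consult on a hot path.
struct Config {
    max_conns: usize,
}

static CURRENT: AtomicPtr<Config> = AtomicPtr::new(std::ptr::null_mut());

fn reader() -> usize {
    // Read side: just an atomic load, no lock. (The kernel's
    // rcu_read_lock() is roughly "don't get preempted", not a real lock.)
    let p = CURRENT.load(Ordering::Acquire);
    unsafe { (*p).max_conns }
}

fn writer(new_max: usize) {
    // Update side: copy, modify, then publish with one atomic swap.
    let fresh = Box::into_raw(Box::new(Config { max_conns: new_max }));
    let old = CURRENT.swap(fresh, Ordering::AcqRel);
    // Real RCU would wait for a grace period (synchronize_rcu()) before
    // freeing `old`; here we simply leak it to keep the sketch safe.
    let _ = old;
}

fn main() {
    writer(100);
    println!("max_conns = {}", reader());
}
```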
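
For the second item, a minimal sketch of what the borrow checker and
lifetimes buy you: the `'a` annotation ties the returned reference to both
inputs, so the compiler can reject any use of the result after either
referent is gone.

```rust
// `longest` borrows from both arguments; the result must not outlive either.
fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
    if x.len() > y.len() { x } else { y }
}

fn main() {
    let a = String::from("systems");
    {
        let b = String::from("software");
        let result = longest(&a, &b);
        println!("{}", result); // fine: `b` is still alive here
    }
    // let r = { let b = String::from("software"); longest(&a, &b) };
    // ^ rejected at compile time: `b` does not live long enough
}
```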
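
And for the third item: Rust's standard library has since grown the same
"nursery" discipline the linked post describes, as `std::thread::scope`
(stable since 1.63). No task spawned inside the block can outlive the block,
which is the core structured-concurrency guarantee:

```rust
use std::thread;

fn main() {
    let data = vec![1, 2, 3];
    let mut sums = [0i32; 2];
    // Every thread spawned in this closure is joined before `scope`
    // returns, so child tasks can safely borrow from the parent's stack.
    thread::scope(|s| {
        let (a, b) = sums.split_at_mut(1);
        s.spawn(|| a[0] = data.iter().sum());
        s.spawn(|| b[0] = data.iter().map(|x| x * x).sum());
    }); // all children have finished here; panics propagate to the parent
    println!("sum = {}, sum of squares = {}", sums[0], sums[1]);
}
```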

------
cperciva
Virtualization. Capabilities. Kernel-bypass networking. Static code analysis.
Verified-pointer microarchitectures. Coverage-guided fuzzing.

Systems software research has come a long way since 2000.
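
To pick one item from that list: the core idea of coverage-guided fuzzing
fits in a toy loop. This is a minimal sketch, nothing like AFL's or
libFuzzer's real instrumentation; the `target` function and its hand-rolled
branch "coverage" report are invented for the example:

```rust
use std::collections::HashSet;

// Toy target: reports which branches an input exercised. Real fuzzers get
// this from compile-time instrumentation instead.
fn target(input: &[u8]) -> Vec<u32> {
    let mut branches = vec![0];
    if input.first() == Some(&b'F') {
        branches.push(1);
        if input.get(1) == Some(&b'U') {
            branches.push(2);
            if input.get(2) == Some(&b'Z') {
                branches.push(3); // a "deep" state random testing rarely hits
            }
        }
    }
    branches
}

// Mutate one random byte; xorshift PRNG keeps the sketch dependency-free.
fn mutate(input: &[u8], seed: &mut u64) -> Vec<u8> {
    *seed ^= *seed << 13;
    *seed ^= *seed >> 7;
    *seed ^= *seed << 17;
    let mut out = input.to_vec();
    if out.is_empty() {
        out.push(0);
    }
    let i = (*seed as usize) % out.len();
    out[i] = (*seed >> 8) as u8;
    out
}

fn main() {
    let mut corpus = vec![vec![b'A'; 3]];
    let mut seen: HashSet<u32> = HashSet::new();
    let mut seed = 0x1234_5678_u64;
    for _ in 0..100_000 {
        let parent = corpus[(seed as usize) % corpus.len()].clone();
        let child = mutate(&parent, &mut seed);
        let cov = target(&child);
        // The feedback loop that makes it "coverage-guided": only inputs
        // reaching new branches join the corpus, so progress compounds.
        if cov.iter().any(|b| !seen.contains(b)) {
            seen.extend(cov);
            corpus.push(child);
        }
    }
    println!("found {} branches with a corpus of {}", seen.len(), corpus.len());
}
```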

~~~
jascii
Virtualization: see VMS (1977). Kernel-bypass networking: see microkernels
(1967). Need I go on?

~~~
geofft
I think it is a wild misunderstanding of how academic research works to say
that the first demonstration of a concept is equal to all further work on a
concept. It is like saying that the Human Genome Project isn't recent work
because the structure of DNA was discovered in 1953.

~~~
__jal
This is very common thinking - people seem to equate the first discussion of a
concept with "discovery", and "discovery" with "the important stuff".

If that line of thought were consistent, it would credit Babbage, or maybe
Turing, as the last computer scientist to do something useful.

~~~
sdenton4
And machine learning is still stuck in the sixties...

~~~
mateo411
It's stuck in the 1760s with the publication of Bayes' theorem.

------
tlb
The last 19 years of systems software research have not refuted Rob's thesis.
Industry has made incremental progress, academia has written papers but not
built much that people want to use. Despite massive increases in graphics
processing power, desktop UIs are still about the same as in 2000, just with
more shininess.

And the number one thing that could have gotten better in the last 19 years
but didn't: security.

~~~
wyldfire
> And the number one thing that could have gotten better in the last 19 years
> but didn't: security.

This is an astonishing claim: what makes you think it hasn't gotten better?
It's gotten a LOT better since 2000.

~~~
rayiner
I’d love to go back to Windows 2000 (and Google circa 2000). The software
industry (at least on the desktop side) peaked two decades ago, then spent
most of the last decade or so badly reinventing everything on the web.

~~~
harryh
Gmail didn't come out until 2004. You'd be stuck with Hotmail in 2000. Google
Maps didn't come out until 2005.

Google Docs (and the subsequent migration of MS Office to web accessible
forms) didn't come until even later.

~~~
rayiner
Gmail is lame compared to Outlook 2000. (It also broke self-hosted email for
everyone.) Likewise Google Docs can’t hold a candle to Office 2000 (or even
WordPerfect 6.1). It has extremely bare-bones control over text formatting
and page layout. _E.g._ no kerning, limited styling, no footnote styles,
limited control of header/footer formatting, no section breaks, etc. No
section breaks! The version of WordPerfect I installed from a stack of floppy
disks had section breaks!

Microsoft's web apps are a grim reminder of how desktop UIs have evolved
_backwards_. (I’m in the midst of evaluating Office 365 as part of some IT
transitions at work.) It's missing tons of features even compared to Word
2000. And it's a total pig. I thought Office was a pig before, but moving it
to the Web made everything 10x worse. (Google Docs is less of a pig, but that
seems to be because it has less functionality than Gobe Productive on BeOS.)

I’ll concede that Google Maps is better than what was available in 2000. I
bet it would be even better if Google turned it into a Win32 desktop app.

~~~
eli_gottlieb
> Microsoft's web apps are a grim reminder of how desktop UIs have evolved
> backwards. (I’m in the midst of evaluating Office 365 as part of some IT
> transitions at work.) It's missing tons of features even compared to Word
> 2000. And it's a total pig. I thought Office was a pig before, but moving it
> to the Web made everything 10x worse. (Google Docs is less of a pig, but
> that seems to be because it has less functionality than Gobe Productive on
> BeOS.)

Sure, that's all true, but this backwards devolution also ensures the
important thing: that you don't really own the code you run, a centralized
provider does, and they can change or break it as they please, without having
to remain compatible with your machine. This is a business model problem:
they've decided they're better off turning your general-purpose, user-
programmable personal computer into a dumb terminal that uses 10x bloated-ass
JavaScript frameworks to make AJAX calls to their HTTP servers.

------
RcouF1uZ4gsC
I would argue that systems research has been incredibly relevant. First
consider programming languages. Even though languages such as Java, C++, and
C# are all widely used, they are very different languages from what they were
in the early 2000’s. You can see the influence of academic research,
especially from functional languages (monads), on these languages. Also, Rust
is an exciting new language that is enabled by the systems software research
of the past.

If you look at networking, there has recently been a move toward new protocols
(QUIC) that resulted from systems research into the deficiencies of TCP.
Another area is consensus algorithms. We now have large-scale real-life
deployments of consensus algorithms, for example in Spanner and etcd.
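
For a flavor of what those deployments implement, here is a minimal sketch of
Raft (the consensus algorithm behind etcd), showing just its vote-granting
rule; RPC, log replication, and timeouts are all elided, and the field names
are mine:

```rust
// A follower grants at most one vote per term, and only to a candidate
// whose log is at least as up-to-date as its own (Raft paper, §5.2/§5.4.1).
struct Node {
    current_term: u64,
    voted_for: Option<u64>,
    last_log_term: u64,
    last_log_index: u64,
}

impl Node {
    fn handle_request_vote(
        &mut self,
        candidate_id: u64,
        term: u64,
        last_log_term: u64,
        last_log_index: u64,
    ) -> bool {
        if term < self.current_term {
            return false; // stale candidate
        }
        if term > self.current_term {
            // Newer term: adopt it and forget any vote cast in the old one.
            self.current_term = term;
            self.voted_for = None;
        }
        // "Up-to-date": later last term wins; same term -> longer log wins.
        let log_ok = last_log_term > self.last_log_term
            || (last_log_term == self.last_log_term
                && last_log_index >= self.last_log_index);
        let can_vote = self.voted_for.is_none()
            || self.voted_for == Some(candidate_id);
        if log_ok && can_vote {
            self.voted_for = Some(candidate_id);
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut n = Node {
        current_term: 3,
        voted_for: None,
        last_log_term: 3,
        last_log_index: 7,
    };
    // A term-4 candidate whose log ends at (term 3, index 7) gets the vote.
    assert!(n.handle_request_vote(42, 4, 3, 7));
    println!("vote granted in term {}", n.current_term);
}
```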

The late 90’s and early 2000’s were a weird time where the hardware was
improving so fast and taking software along for a free ride that a lot of
software was good enough. Now, as we bump into the end of Moore’s Law, we will
see more research on, and real-life usage of, multicore and heterogeneous
computing, along with libraries, languages, and operating systems that try to
make that easier.

~~~
dgellow
> The late 90’s and early 2000’s were a weird time where the hardware was
> improving so fast and taking software along for a free ride that a lot of
> software was good enough.

Would you say that wasn’t the case during the past 20 years (2000-2019)? Or do
you consider all that period to be “early 2000’s”?

~~~
RcouF1uZ4gsC
I think Herb Sutter's 2005 article "The Free Lunch Is Over" was a signal that
this was changing.

[http://www.gotw.ca/publications/concurrency-ddj.htm](http://www.gotw.ca/publications/concurrency-ddj.htm)

------
JdeBP
M. Pike's followup, 4 years later, is at
[https://interviews.slashdot.org/story/04/10/18/1153211/rob-pike-responds](https://interviews.slashdot.org/story/04/10/18/1153211/rob-pike-responds).

------
w8rbt
_" Linux may fall into the Macintosh trap: smug isolation leading to (near)
obsolescence."_ Well, that did not happen.

------
apta
This is how you end up with a language like golang.

~~~
zzzcpan
Well, to be fair, there is almost no actual scientific programming language
research to begin with, so anything goes.

------
JdeBP
Would research into side-channel attacks count under M. Pike's criteria?

Granted, whilst it is system-_level_, it is not system _software_. And it has
not yielded demos that people have regarded as _cool_, rather ones that have
been received by some as horrifyingly worrying.

But it has definitely influenced industry.

~~~
tlb
Great work has been done discovering side-channel attacks, but on the other
hand most side channels have been created by sloppy microarchitectural design
since 2000. So I dunno if that's progress. If we see some CPUs in the next few
years that are both fast and not vulnerable, I'll count that as progress.

------
wayoutthere
This article predates it, but OS X (particularly after it mutated into iOS)
has probably been the biggest source of systems innovation in the two decades
since. Apple is very secretive, so their systems research often isn’t known
outside the company until it’s actually going into a product.

OS X was modern for its time, but where they’ve really pushed the envelope is
with iOS. They can simply move faster at scale than anyone else: they almost
entirely own the IP for both the software and all major hardware components,
and can pivot on a dime compared to market-based coordination.

~~~
kllrnohj
At launch, Android was way more innovative at a systems level, with per-
application UID sandboxing, a permission system, and system-integration
capabilities (broadcasts, services, intents, etc.).

iOS was innovative at a UI/UX level, definitely. But I can't really think of
anything they did at a systems level that was at all innovative?

~~~
naasking
Android didn't innovate those. Per-application sandboxing had been the default
in capability systems since the 60s, became more widely deployed in the OLPC
Bitfrost security model, and even had a deployment in HP Labs' Polaris, a
Windows NT-based environment for virus-safe computing. These two projects
informed the early Android security model, IIRC.

------
elchief
Hadoop? Spark? Are those not systems software?

~~~
syn0byte
Beowulf? MPI? Were these not already things 30 years ago?

[https://github.com/intel/spark-mpi-adapter](https://github.com/intel/spark-mpi-adapter)

Oh look, a paper: "For example, a recent case study found C with MPI is
4.6–10.2× faster than Spark on large matrix factorizations on an HPC cluster
with 100 compute nodes."

Does it sound like large-scale data analytics would have horribly stagnated
without Hadoop and Spark?

------
ChrisRus
Soon it will take less time and be more cost-effective to commission the
integration of an SoC for your application than to risk your business to
software basket weavers.

------
phtrivier
I genuinely don't know what the author is referring to.

I'm amazed at how many comments revolve around "But wait, of course systems
research has evolved, see XX and YYY", followed by responses along the lines
of "Nah, he was not talking about XX and YYY, rather ZZZ, etc..."

I hate being the "please define xxx" guy, but is there a consensus definition
of what "systems software" is?

~~~
yingw787
Rob defined systems software in the first part of his post as "Operating
systems, networking, languages; the things that connect programs together."

------
perfmode
RAMCloud isn’t popping, but it did give us Raft.

