
Preventing the Collapse of Civilization [video] - dmit
https://www.youtube.com/watch?v=pW-SOdj4Kkk
======
Illniyar
At what point in the past were our programs stable, robust and just worked?
Perhaps it was before my time, but DOS and Windows (3.11, 95) would crash
constantly. Blue screens of death, infinite loops that would just freeze my
computer, memory leaks that would cause my computer to stop working after a
day.

I now expect my computer to stay on for months without issues. I expect to be
able to put it to sleep and open it in the same state it was in. I expect that
if a website or a program errors, my OS simply shrugs and closes it. I expect
my OS to be robust enough that if I insert a USB drive or download a file, I'm
not playing Russian roulette with something that might contain a virus that
would destroy my computer.

In the past I would shut down my computer at the end of every day, because
otherwise it would simply crash sometime in the night. I would run
defragmentation at least once a month. Memory errors and disk errors were
common, and the OS had no idea how to recover from them. Crashes were so
common, you just shrugged and learned to save often.

~~~
sdfjkl
> At what point in the past were our programs stable, robust and just worked?

DOS was rock solid, at least around the era of DR-DOS. DESQview 386 was
absolutely stable too. The BBS software I ran on them in those days was a
wobbly piece of shit though.

I also recall Borland's Turbo Pascal compiler and the accompanying text-mode
IDE being ultra reliable.

After DOS I used OS/2, which was also extremely stable, although it suffered
from limited hardware and software availability.

Mac OS X used to be rock solid too, in the heyday of the PowerBooks and early
Intel MacBooks. Every now and then there were hardware design flaws though,
and now the quality of both software and hardware seems to have taken a tragic
turn for the worse.

You still do play Russian roulette whenever you plug a USB device into your
computer, see "USB Rubber Ducky".

~~~
messe
> DOS was rock solid

DOS was rock solid because it did nothing. Many programs, particularly games
and anything that did networking, didn't even use DOS interfaces—they bypassed
them entirely and worked with either the BIOS or hardware directly. There was
no memory protection, no multitasking, and, on a higher level, no permissions
nor sandboxing. So while maybe it "just worked", I wouldn't call it robust.
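
To make that concrete: a minimal sketch (hypothetical, in Borland/Turbo C
style, with MK_FP from <dos.h>) of the idiom a typical DOS program used
instead of going through any OS interface - writing characters straight into
VGA text memory:

    /* DOS-era direct hardware access: print a character by poking VGA text
       memory at segment 0xB800 instead of asking DOS or the BIOS to do it.
       Real-mode only; 'far' pointers are a Borland/Turbo C feature. */
    #include <dos.h>

    void put_char_direct(int row, int col, char ch, unsigned char attr)
    {
        /* 80x25 text mode: each cell is 2 bytes, character then attribute */
        unsigned char far *vram = (unsigned char far *) MK_FP(0xB800, 0);
        unsigned offset = (row * 80 + col) * 2;
        vram[offset]     = ch;
        vram[offset + 1] = attr;  /* e.g. 0x1F = bright white on blue */
    }

Nothing stops a pointer like that from landing on DOS's own data instead of
video memory, which is why "rock solid" held only for as long as every program
behaved.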

~~~
clord
You're complaining that DOS lacked features and protections. But zero-cost
abstractions like DOS can still be robust (in the sense of being reliable and
solid). Yes, it requires more trust, but there is no guarantee modern OSes can
safely run arbitrary zero-trust binaries.

~~~
ynnn
If you run a buggy program on a modern OS, it won't crash the system or impact
other processes. If you run a buggy program on DOS, it will write to random
physical addresses, probably clobbering the state of other processes and of
DOS.

Modern OSes can't run truly arbitrary binaries safely, but they can pretty
much run arbitrary non-adversarial binaries - problematic binaries have to be
intentionally written to exploit the system (as opposed to DOS, where
non-problematic binaries had to be intentionally written not to break the
system).

It's a dramatic improvement.
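
A tiny C program makes the contrast concrete (a hypothetical example; the
address is arbitrary):

    /* On a modern OS, this wild store hits an unmapped virtual page and the
       process dies with a segmentation fault; nothing else on the system is
       harmed. On DOS, the same store would land on a real physical address
       and could silently corrupt another program, or DOS itself. */
    #include <stdio.h>

    int main(void)
    {
        char *wild = (char *) 0xB8000; /* arbitrary physical-looking address */
        *wild = 'X';        /* modern OS: SIGSEGV; DOS: clobbers video RAM */
        printf("never reached on a modern OS\n");
        return 0;
    }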

~~~
mmphosis
It doesn't matter what OS I run. My "modern", and apparently buggy, CPU runs
arbitrary systems that I know very little about, and I have little to no
control over.

Since 2008, it's been a dramatic departure.

------
austincheney
As a JavaScript developer I strongly resonate with the quote at 14:50 into
the video. In summary: all of the silicon industry's chips at the time were
full of defects, often the same defects across vendors, and the industry was
completely aware of this. The problem was that the original generation of
chips had been designed by old guys who figured it out, while the current
generation of chips (at that time) was designed by youngsters working in the
shadow of the prior generation and not asking the right questions, because
they were not aware of what those questions were.

A decade ago, JavaScript developers had little or no trouble working
cross-browser, writing small applications, and churning out results that
worked reasonably well very quickly. It isn't that cross-browser compatibility
had been solved, far from it, but that you simply worked to the problem
directly, and this was part of regular testing.

That older generation did not have the benefit of helpful abstractions like
jQuery or React. They had to know what the APIs were and they worked to them
directly. The biggest problem with this is that there weren't many people who
could do this work well. Then, shortly after, the helpful abstractions
appeared, and suddenly there was an explosion of competent-enough developers;
but many of these good-enough developers could not and cannot work without
their favorite abstractions. These abstractions impose a performance penalty,
increase
product size, impose additional maintenance concerns, and complicate
requirements.

The ability to work below the abstractions is quickly becoming lost knowledge.
Many commercial websites load slower now than they did 20 years ago despite
radical increases in connection speeds. To the point of the video, this loss
of knowledge is not static; it results in quality degrading over time to a
level that is acceptable to later generations of developers who don't know the
proper questions to ask.

~~~
arendtio
Actually, I don't think that abstractions are the problem. I mean, the whole
OSI model is made of abstractions. Abstractions are at the core of software
development.

And I also don't think that jQuery is the problem. jQuery just made JS worth
learning. Before it, you had to spend an insane amount of time just working
out implementation specifics that changed every few months.

However, the point where I do agree with you is that we have a performance
issue with JS. And I am not talking about slow JS engines. I am talking about
developers who are not aware of how costly some operations are (e.g. loading a
bunch of libraries). Yes, that is an issue that naturally arises with
abstractions, but to conclude that abstractions themselves are the problem is
wrong.

I think the problem is more about being aware of what happens in the
background. You don't have to know every step for every browser and API, but
loading 500KB of dependencies before even starting your own scripts is not
going to be fast in any browser.

~~~
austincheney
The abstractions aren't causing this problem; the developers without a
willingness to work under the abstractions are.

I wrote the following tool in less than 90 minutes five years ago, because I
had some time left over before presentations at a company hack-a-thon. I
updated it recently with about another 90 minutes of work.

[https://github.com/prettydiff/semanticText](https://github.com/prettydiff/semanticText)

That tool was trivial to write and maintain. It has real utility value as an
accessibility tool. I could write that tool because I am familiar with the
DOM, the layer underneath. Many developers are not even aware of the problems
(SEO and accessibility) this tool identifies, much less how to write it. jQuery
won't get you the necessary functionality.

~~~
mercer
Why are so many of your comments of a gate-keeping, self-congratulatory
nature, or blanket criticisms of others?

I find it frustrating because I _do_ think you have plenty of valuable stuff
to say, but you're consistently presenting it in a rather unappealing package
and I don't really understand what you get out of that.

~~~
austincheney
Gate-keeping yes, self-congratulatory no. This subject is personally
sensitive for me because it has immediate real-world consequences that impact
my livelihood and choice of employment.

I am away from home on a military deployment at the moment, and I am
constantly thinking about what I should do when I return to the real world. I
am actively investigating career alternatives, including dumping software
development entirely, because I honestly believe my career is limited by the
general unwillingness of my development peers to address actual engineering
concerns, out of convenience and insecurity.

------
jmiskovic
Great talk. I agree that software is on the decline. You can see it in your
OS, on the web, everywhere. Robust products are replaced with 'modern' crappy
redesigns. We are surprised if a thing still works after 5 years.

I don't agree with his conclusions. The real source of the problem is that we
now have maybe 100,000x more software than we had in the 70s. That means many
times more programmers, so it's not just the 1% smartest greybeards as before.
We need more abstractions, and yes, they will run slower and have their
issues.

Also, not everybody is sitting at the top of the hierarchy of abstractions.
Some roll up their sleeves and work on JIT runtimes and breakthrough DB
algorithms.

All those blocks of software need to communicate with the platform and with
each other. IMO the way out is open source. Open platforms, open standards, open
policies. Every time I found a good piece of code in a company's huge
codebase, it was an open source library. Every time. You have to open up to
the external world to produce a well-engineered piece of software. The lack of
financial models for open source is the obstacle. We should work on making
simple and robust software profitable.

~~~
imiric
> The real source of the problem is that we now have maybe 100,000x more
> software than we had in the 70s. That means many times more programmers, so
> it's not just the 1% smartest greybeards as before.

The greybeards from the 70s weren't much smarter than today's programmers.
They were the same kind of curious hackers we have today, with the advantage
of being born in the right place at the right time, when the technology was
still developing, so they were forced to build their own tools and operating
systems.

> We need more abstractions, and yes, they will run slower and have their
> issues.

I disagree, and side with Jon Blow on this: abstractions (if done well) create
the illusion of simplicity and more often than not hide the complexity of the
lower levels. Sometimes this complexity is indeed too difficult to work with,
but often it's the problem itself that needs to be simplified, instead of
creating an abstraction layer on top.

I think that as an industry we've failed to make meaningful abstractions
while also educating new programmers on the lower-level functionality. A lot
of today's programmers learned on Python, PHP, Ruby, JavaScript, etc., which
are incredibly complex tools by themselves. And only a minority of those will
end up going back and really learning the fundamentals the way hackers in the
60s and 70s did.

> IMO the way out is open source. Open platforms, open standards, open
> policies.

Agreed. But education and simplification are also crucial.

~~~
jmiskovic
The argument wasn't that people were smarter back in the 70s. Instead,
computers were rare, and only the most motivated individuals could get access
to them. They were curious hackers, as you say, and today's curious hackers
are just as good. They form maybe 1% of the programmer population.

Regarding abstractions, I feel you are talking about leaky abstractions -
systems that offer a simplified interface but still manage to burden you with
all the implementation details. It's hard to identify such poorly engineered
building blocks until it's too late; thus layers get built on top of them, and
it becomes too costly to go back and rework the stack. This is a problem with
development processes favoring new features over paying down the accumulated
tech debt.

Still, good building blocks can exist. You can (and often must) have
complexity, if it's properly isolated and does not leak. I'd say the JVM is a
great example of an abstraction that is slower and internally more complex
than native software, but brings much to the table as a platform to build on.
Other examples: the BEAM VM, ZeroMQ, Lua (leaky, but at least very simple).
Browsers unfortunately are too burdened with legacy and security issues.

I feel my formal education has failed to teach me how to design proper
interfaces between systems. Instead we are taught pointer arithmetic and
mainstream OOP ("cat IS an animal").

------
js8
It seems to me that there is a cultural problem: deep expertise is not
valued, because it is difficult to understand, and getting somebody "flexible"
is easier.

I was just at a workshop about
[https://en.wikipedia.org/wiki/Design_thinking](https://en.wikipedia.org/wiki/Design_thinking).
The whole premise was that you don't actually need to hire an (expensive,
inflexible) expert who understands how something is done; rather, what you
need to do is "observe" an expert.

But imagine what happens when everybody does that! Everybody gets rid of their
experts, assuming that the client (who they are supposed to provide the
service for) has the actual expertise. And they are assuming the same about
their clients and so on. The end result is complete disregard for expertise.

So expertise is a positive externality, in an economic sense. Nobody is
incentivized to keep more of it than necessary. This leads to losses over time.

~~~
pixl97
This is very common in industries with boom/bust cycles. Lots of experts in
the boom; they leave when the bust comes; then when the next boom arrives
there are lots of problems expanding said processes quickly because of the
lack of expertise.

------
roenxi
The risk is that things 'just work' for extended periods of time and the
maintainers are optimised out of the system because they aren't needed in the
short term.

My personal guess at why civilisations can collapse so slowly (100s of years
for the Romans, for example) is that the people who maintain the political
systems do too good a job, and so the safeguards are forgotten.

For example, after WWII the Europeans learned some really scary lessons about
privacy. The Americans enjoyed greater peace and stability, so people with
privacy concerns are given less air time in places like Silicon Valley or
Washington. The two-step process at work here is that when things are working,
standards slip and the proper responses to problems are forgotten. Then, when
things don't work, people don't know what to do and the system degrades.

~~~
nabla9
Sean Carroll's podcast episode 37, "Edward Watts on the End of the Roman
Republic and Lessons for Democracy", has a good discussion of this (there is
also a transcript):
[https://www.preposterousuniverse.com/podcast/2019/03/11/epis...](https://www.preposterousuniverse.com/podcast/2019/03/11/episode-37-edward-watts-on-the-end-of-the-roman-republic-and-lessons-for-democracy/)

Basically, there are norms and unwritten understandings, deeply held ideas
about what is not acceptable. Rulers don't push their power to its full
extent. Then someone comes along and starts to push, and gradually what is
acceptable changes.

------
d_burfoot
I loved this talk, and in particular the point about programmers being forced
to learn trivia instead of deep knowledge. I just started a new job at a big
tech company, and I've spent a whole week so far trying to figure out how to
use the build tool. The frustrating part is that most of the software modules
my team is working on aren't very complicated. The complexity comes from
pulling in all sorts of 3rd party libraries and managing their transitive
dependencies.

------
glandium
He briefly mentions the Boeing 737 MAX issue, but he understates the problem.
Sure, there was a software problem, but the underlying issue was the whole
notion that everything can be "fixed" (worked around) by software. That it's
fine to make changes to the plane's aerodynamics and compensate with software
so that it would seemingly act like the previous model.

~~~
paulkon
I'm reminded of this explanation that was posted on HN a couple of months
back:
[https://twitter.com/trevorsumner/status/1106934362531155974](https://twitter.com/trevorsumner/status/1106934362531155974)

Seems like it was a physical engineering and business problem and software was
added to (inadequately) compensate.

------
magicbuzz
He obviously uses Windows for his anecdotal examples, but I don't think you
can point the finger at Windows specifically. I think it's consistent across
OSes. I see regressions in iOS since the earlier versions, as well as in
Linux and the applications I use.

My IDE stopped providing menus. It's open source, so I just shrug and track
the issue on GitHub.

------
reactspa
Fantastic talk.

Portable Apps on Windows are a hedge against some of the angst he describes.
(E.g. the part about updates changing a lot of things around or causing
failure-to-launch problems).

E.g., I still use Winamp to play MP3s. It's a portable version that doesn't
need installation (so I can use it on my locked-down work computer). The UI
hasn't changed in 20 years. Newer file formats can be played after adding
plug-ins.

I've put together a whole bunch of Portable Apps, and nowadays I first try to
find a portable version of an app I need before a non-portable version.

~~~
stallmanite
100% concur re: portable apps. It's a better way to live from an end-user
standpoint, in my experience. PortableApps.com and LiberKey are both
excellent.
------
arketyp
The end of Moore's law will be a blessing in disguise, I think. Not only will
programmers need to get clever in a traditional sense but, also, a new era of
specialized hardware will require a more intimate understanding of the bits
and pieces. I'm optimistic.

~~~
philwelch
I’m actually looking forward to the end of Moore’s Law. Instead of keeping up
with a rapidly shifting landscape of new and creative ways of wasting CPU
cycles, we can maybe build things that have a chance of lasting a hundred
years.

~~~
pixl97
Even after Moore's law ends, we are going to have a shifting landscape of
security changes in processors, an area that has been neglected for some
time.

------
std_throwaway
At my workplace, when I ask why some process parameters are set the way they
are, it usually leads to a dead end: the people who knew are long gone, and
those who should know don't know the essentials. Everything is kind of
interconnected, and errors show up months later, so you can't really change
anything on any machine until the machine as a whole breaks and needs to be
replaced. Then you try to get it to work somehow, and those parameters are
then set forever.

------
lalalandland
The general-purpose nature of computers is the reason for the complexity. We
use the same systems for highly secure, critical business transactions as for
high-performance simulation, gaming, and fun. The convenience of not having to
switch systems when doing different tasks adds a lot of complexity.
Special-purpose hardware, by its nature of not being general, can be much
simpler and can omit a lot of the security machinery and complexity. But it's
much less convenient and much less flexible.

------
cryptica
I think it's because companies always try to commoditize software developers
but it doesn't work. You can't replace a good software developer with 10
mediocre ones plus thousands of unit tests. The only way to become a good
software developer is with experience.

------
adverbly
I remember seeing the FoundationDB distributed systems testing video for the
first time and being blown away by what it takes to build robust software.
Worth a view if you haven't seen it:
[https://www.youtube.com/watch?v=4fFDFbi3toc](https://www.youtube.com/watch?v=4fFDFbi3toc)

Would love to see more things in this direction, but I agree that the market
doesn't want it. Most users will gladly accept an infrequent bug in exchange
for an earlier release or a lower-cost version of a product.

------
new4thaccount
Reducing all this complexity is partly why I'm hoping the Red Language project
can succeed where Rebol failed.

Of course you can't do everything, but a good full-stack language could cover
perhaps 80% of software needs using well-written DSLs. The simple fact that we
have so many languages targeting the same niche is wasteful, duplicated
effort: (Java, C#, Kotlin, Scala, Clojure, F#, etc.) for business apps,
(Python, MATLAB, Julia, R, and Fortran) for data science and scientific
programming, plus systems languages like (C, C++, Ada, Rust).

On one hand it is good to have purpose-built languages, but on the other it
puts up a big barrier to entry.

Note that I'm advocating for abstractions, but far fewer languages. Yes,
abstractions add complexity, but they actually make the code more readable. I
shudder to think of humanity having to maintain and support ever-increasing
levels of software.

~~~
d_burfoot
I absolutely agree that we actually need fewer languages. The languages we
have today really are good enough for the vast majority of programming work.
To the extent that they fall short, the solution is to either improve the
language or to build good libraries for it.

~~~
benc666
Many programming languages start out as 'experiments' by their creators, to
combine or extend the capabilities and attributes of some prior ones.

That seems like a very healthy evolutionary approach to me.

------
cheschire
This whole presentation feels like a great candidate for a meta-analysis of
the effects of recency bias on analysis.

------
earenndil
I challenge his assertion around 32:50 that something is lost. I've done
assembly programming. C programming. I might venture to say I'm pretty good at
it. I even dabbled a bit in bare-metal programming; I was going to make my own
OS, but lost interest. Wanna know why? Take a look at this[1] article. Yep.
If, on x86, you want to know what memory you're allowed to access, how much of
it there is, and where it is, there is literally no good or standard way to do
that. "Well," you (or Jon Blow) might say, "just use GRUB (or another
multiboot bootloader), it'll give you the memory map." But wait, wasn't that
what we were trying to avoid? If you do this, you'll say "I'm smart, I'm
sparing myself the effort," but really there is a loss of capability: you
don't really know where these BIOS calls are going, or what the inner workings
of this bootloader are, and something is lost there.
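
For flavor, here is roughly what "grub will give you the memory map" amounts
to - a hedged sketch in freestanding C, with the entry layout following the
Multiboot 1 spec (the field names are my own paraphrase of it):

    /* Walking the memory map a Multiboot bootloader (e.g. GRUB) hands to a
       kernel. Entries are variable-sized; 'size' excludes itself. */
    #include <stdint.h>

    struct mmap_entry {
        uint32_t size;       /* size of the rest of this entry */
        uint64_t base_addr;  /* start of the region */
        uint64_t length;     /* length of the region in bytes */
        uint32_t type;       /* 1 = usable RAM, anything else = reserved */
    } __attribute__((packed));

    void walk_memory_map(uint32_t mmap_addr, uint32_t mmap_length)
    {
        uint8_t *p   = (uint8_t *) (uintptr_t) mmap_addr;
        uint8_t *end = p + mmap_length;
        while (p < end) {
            struct mmap_entry *e = (struct mmap_entry *) p;
            if (e->type == 1) {
                /* hand [base_addr, base_addr + length) to your allocator */
            }
            p += e->size + sizeof(e->size);
        }
    }

And underneath, GRUB is just making the int 15h / E820 BIOS calls you were
trying to avoid - which is the point: the knowledge didn't go away, it just
moved somewhere you can no longer see it.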

This is a bit of a contrived and exaggerated example, but it serves to prove
my point, which is that these things really do scale linearly: you give up the
same amount you get back by going up a layer of abstraction (in
understanding/productivity; not talking about performance yet). Low-level
programming languages aren't more productive than high-level programming
languages. Low-level _programmers_ are more productive than high-level ones,
because it takes more discipline to get good at low-level programming, so the
ones that make it in low-level programming are likely to be more skilled or,
at least, to have acquired more skill. Think about the story of Mel[2]. Does
anyone honestly think, with any kind of conviction, that Mel would have been
less productive had he programmed in Python and not thought about how each
machine instruction would be loaded?

As I've mentioned, I have done, and gotten reasonably good at, low-level
programming, and yet my current favourite language is Perl 6. A language that
is about as far from the CPU as it gets, on a par with JavaScript or Haskell.
Why? Because _nothing is lost_. Nothing is lost, and quite a lot is gained.
There are things I can do with Perl 6 that I _cannot do_ with C; but, of
course, the reverse is also true. And I think that Jon Blow's perspective is
rather coloured by his profession (game development), where performance is
important and it really does pay, sometimes, to think about how your variables
are laid out in memory. He has had, I'm sure, negative interactions with
proponents of dynamic languages, because he sees their arguments as (maybe
that's what their arguments are, I don't know) "C is useless, JavaScript is
good enough for everything." Maybe the people who truly think that have lost
something, but I do not think that Mel, or Jon Blow, or I would lose much by
using Perl 6 instead of C where Perl 6 is sufficient.

1: [https://wiki.osdev.org/Detecting_Memory_(x86)](https://wiki.osdev.org/Detecting_Memory_\(x86\))

2: [http://www.catb.org/~esr/jargon/html/story-of-mel.html](http://www.catb.org/~esr/jargon/html/story-of-mel.html)

~~~
NeveHanter
About the first one: that's also another problem. The BIOS doesn't have a
standard protocol; if there were one, there would be a single standardized way
to detect the memory layout.

About the second one: performance should be crucial everywhere. If some
application eats all the resources, then I can't have other applications
working in the background doing their stuff. That's the problem with, e.g.,
"modern" communication apps (I'm talking about you, Slack), where my four-core
CPU is brought to its knees by simple things like switching the team or even
the channel, not to mention starting the app itself. Another one: when I'm in
a Google Meet chat, my browser eats 80% of the CPU and I can't do anything
reliably in that time; running anything makes the chat lose audio, lag a lot,
etc.

Going back some years, I was able to run Skype, AQQ, an IDE, the Chrome
browser, and Winamp at the same time on an archaic (by today's standards)
i3-350M with 4 GiB of RAM.

~~~
Ace17
> About the first one: that's also another problem. The BIOS doesn't have a
> standard protocol; if there were one, there would be a single standardized
> way to detect the memory layout.

It's the same for almost all hardware: graphics cards, sound cards, all use
their own register map (which, to make matters worse, is often kept secret).
Even USB stuff, which is supposed to be homogenized already (i.e. USB
classes), often requires sending vendor-specific "quirk" strings to get the
hardware working (e.g. the list of 'quirks' in the snd-usbmidi Linux kernel
module).

The hardware diversity isn't a detail here. It's the root of the "too many
abstractions" problem, and I don't think this is something you can avoid. This
is why device drivers exist. This is why operating systems try hard to impose
APIs on device drivers (ALSA, Direct3D, etc.) so Firefox, MS Word, Half-Life
... can run on future hardware.

This is one reason why the abstraction layers exist (I'm not even talking
about memory protection / safety here): because you don't want to re-release
your app every time some vendor makes a new sound card, graphics card, MIDI
interface, Wi-Fi chip, etc. And you don't want to code the support for all
this hardware yourself.
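
To illustrate (a hedged sketch, not any real kernel's API): "imposing an API
on device drivers" usually boils down to a table of function pointers that the
OS defines and every vendor fills in, so applications only ever see the table:

    /* Hypothetical driver interface, in the spirit of ALSA/Direct3D: the OS
       fixes the shape of the table, each vendor supplies the guts, and
       applications call through it without knowing the register map. */
    struct sound_driver {
        const char *name;
        int  (*open)(int sample_rate);
        int  (*write_samples)(const short *buf, int count);
        void (*close)(void);
    };

    /* An app written against this table keeps working when a new card ships
       with a new driver behind it -- no re-release needed. */
    int play(const struct sound_driver *drv, const short *buf, int count)
    {
        if (drv->open(44100) != 0)
            return -1;
        int written = drv->write_samples(buf, count);
        drv->close();
        return written;
    }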

------
potrarch
There is a relationship between quantity of functionality and bugginess. Even
with the most demanding testing, bugs will remain. The question is: as more
and more software permeates our lives, will the accumulation of unfixable bugs
ultimately overwhelm us? Can we build an AI that can clean enough of the bugs
out of all of our software, including its own, for our civilization to
survive?

------
mike00632
It seems like Jonathan Blow completely forgot about the blue screen of death
and how common it was.

------
kzrdude
Now, what would be amazing is if we had found the Antikythera mechanism so
intact that it could be reconstructed perfectly. Then we could check
everything it could do, and what kinds of drawbacks or errors the construction
had!

------
earenndil
People don't care about five 9s anymore? It's not _as_ important as it was (I
assume—I wasn't really around at the time), but cloud providers definitely
advertise their number of 9s.

~~~
throwaway2048
Good thing they have regular service outages that mysteriously don't qualify
for the SLA (or show up on dashboards).

------
PorterDuff
I had a few cocktails and thought about a few points made in the video.

It seems to me that there are two different notions here that are being
conflated:

. A rotting of knowledge over time.

. A variant of Moore's Law: in this case, the idea that technology in a
particular area has decreasing value on the margin.

It's kind of like the notions you see in cliodynamics, that there are a few
interacting sine waves (or some other function) in mass human behavior.

I suppose that the main concept of importance is how it all might mess with
your own personal situation. Personally, I think that the West is in decline,
but that doesn't have a whole lot to do with the quality of software on
internet websites.

------
alexashka
There is no preventing a collapse. Try and enjoy the ride :)

------
jstewartmobile
I was with him until the middle.

Lack of inter-generational knowledge transfer doesn't cut it. Most of the
people who rolled this stuff are still alive. And as for the whipper-snappers:
people don't get very far writing programming languages/video games/operating
systems without knowing their stuff.

The real boogeyman is feature combinatorics. When making a tightly-integrated
product (which people tend to expect these days), adding "just" one new
feature (when you already have 100 of them) means touching several (if not all
100) things.

Take OpenBSD for example: When you have a volunteer project by nerds for
nerds, prioritizing getting it right (over having the fastest benchmark or
feature-parity with X) is still manageable.

Bring that into a market scenario (where buyers have a vague to non-existent
understanding of what they're even buying), and we get what we get. Software
companies live and die by benchmark and feature parity, and as long as it
crashes and frustrates less than the other guy's product, the cash will keep
coming in.

~~~
TeMPOraL
> _When making a tightly-integrated product (which people tend to expect these
> days)_

Do they? It was my impression that the recent evolution of user-facing
software (i.e. the web, mostly) was about less integration, due to the reduced
scope and capabilities of any single piece of software.

> _adding "just" one new feature (when you already have 100 of them) means
> touching several (if not all 100) things._

This sounds true on first impression, but I'm not sure how true it really is.
Consider that I could start rewriting this as "adding 'just' one new program
when you already have 100 of them installed on your computer"... and it
doesn't make sense anymore. A feature is to a program as a program is to the
OS, and yet most software doesn't involve extensive use, or changes, of the
operating system.

The most complex and feature-packed software I've seen (e.g. 3D modelling
tools, Emacs, or hell, Windows or Linux) doesn't trigger combinatorial
explosion; every new feature is developed almost in isolation from all others,
and yet tight integration is achieved.

~~~
jstewartmobile
Pretty sure making emacs render smoothly in 2016 was not an isolated change--
even if the code change were only a single line.

[https://www.facebook.com/notes/daniel-colascione/buttery-smo...](https://www.facebook.com/notes/daniel-colascione/buttery-smooth-emacs/10155313440066102/)

Same story for speeding up WSL.

[https://devblogs.microsoft.com/commandline/announcing-wsl-2/](https://devblogs.microsoft.com/commandline/announcing-wsl-2/)

Or, just think about your phone. If I put my head to the speaker, a sensor
detects that, and the OS turns off the screen to save power. If I'm playing
music to my Bluetooth speaker, and a call comes in, it pauses the song. When
the call ends, the song automatically resumes.

KT's UNIX 0.1 didn't do audio or power management or high-level event
notification.

~~~
TeMPOraL
> _Pretty sure making emacs render smoothly in 2016 was not an isolated
> change -- even if the code change were only a single line._

This was a corner case. What I meant is the couple dozen packages I have in my
Emacs that are well-interoperating but otherwise independent, and can be
updated independently.

> _Or, just think about your phone. If I put my head to the speaker, a sensor
> detects that, and the OS turns off the screen to save power. If I'm playing
> music to my Bluetooth speaker, and a call comes in, it pauses the song. When
> the call ends, the song automatically resumes._

These each affect a small fraction of code that's running on your phone.
Neither of them is e.g. concerned with screen colors/color effects like night
mode, or with phone orientation, or with notifications, or countless other
things that run on your phone in near-complete independence.

~~~
jstewartmobile
Buttery smooth emacs is not a corner case. When working on API-level things
(for those who don't know -- emacs is practically an operating system), if we
care about not breaking things, we must be highly cognizant of all the ways
that API is consumed. Even if the final change ends up being very small
code-wise, the head-space required to make it is immense. That is why we have
(relatively) so few people who make operating systems and compilers that are
worth a damn.

Most emacs plug-ins aren't operating at that level of interdependence. This
one is working on a text buffer at t0, and that other one is working on a text
buffer at t1. Of course, the whole thing can be vastly simplified if we can
reduce interdependence, but that is not the way the world works. The typical
end user doesn't want emacs. The typical end user wants MS Word.

Even if I accepted the replacement of my word "feature" with your word
"program" (not that I do), one only needs to look at Docker's prevalence to
see the point still holds. Interdependence is hard, and sometimes unavoidable.

------
soup10
bravo, great talk Jon. but real talk can you make another Braid already

------
fallingfrog
Not to pick on Microsoft too much, but I remember seeing the hello world
program for Windows 3.1 for the first time and thinking, "this is not looking
good." And I was right.

------
lolc
Wow, he has a rosy picture of the past. I don't see where he gets the five
nines from; he doesn't even quote anybody on it. Most of the examples he gives
would have been zero nines back in the day, because they were not available at
all!

Wikipedia, for example, has one-nine availability in my life, because when I
sleep my phone is still on.

