This file seems to be encoded as Windows-1252 but is served as UTF-8; hence all the tofu. Browsers all used to have an option to override the encoding, but unfortunately those are gone now. Firefox still has a "Repair Text Encoding" item in the View menu (you have to press Alt to show the old-fashioned menu bar), which seems to do a decent job on this page.
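If you have a local copy, you can also repair it yourself. Here's a minimal sketch in Go (assuming the page was saved as page.html, a hypothetical filename), using the golang.org/x/text/encoding/charmap package to reinterpret the bytes as Windows-1252:

```go
package main

import (
	"fmt"
	"io"
	"os"

	"golang.org/x/text/encoding/charmap"
)

func main() {
	// Open the mis-served page as raw bytes (hypothetical filename).
	f, err := os.Open("page.html")
	if err != nil {
		panic(err)
	}
	defer f.Close()

	// Decode the stream as Windows-1252; the result is valid UTF-8.
	r := charmap.Windows1252.NewDecoder().Reader(f)
	out, err := io.ReadAll(r)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out))
}
```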
> It is the way I think. I am a very bottom-up thinker. If you give me the right kind of Tinker Toys, I can imagine the building. I can sit there and see primitives and recognize their power to build structures a half mile high, if only I had just one more to make it functionally complete. I can see those kinds of things.
I’m no Ken Thompson, but this resonates with me. I like to find the simplest, most powerful and general solution, then apply it to the problem at hand.
Super complex frameworks and systems are the bane of my existence, because I really can’t wrap my head around what they are doing, so I can’t trust them.
I feel exactly the same. The history of computing is full of really simple alternative models that never made it into the mainstream, so the easiest way to imagine what future tech could look like is to look at the past. Given the state we are in, I cannot imagine how one could come up with something truly elegant on top of foundations plagued with irreducible complexity.
It’s generally the case that complex frameworks are “the simplest, most powerful and general solution,” since achieving a powerful and general solution usually requires a great deal of irreducible complexity.
In fact, I think what Ken is talking about here is using those simple tools to build effective purpose-built solutions rather than general multi-tools.
>"I think that computer science in its middle age has become incestuous: people are trained by people who think one way. As a result, these so-called orthogonal thinkers are becoming rarer and rarer."
His prediction is in response to a question asking if Linux is "following in the tradition" of Unix, and immediately after, the conversation turns towards a discussion of his (at the time ongoing) work on Plan 9. "Changed his mind" seems like a kind of weird phrasing to me because Linux being more successful than Plan 9 in 2023 isn't a matter of opinion, it's a recognition of reality.
After having gotten familiar with Plan 9, I would have thought this at the time too, because in every single way imaginable it's a better system for the modern world than any Unix ever will be. Unfortunately, he did not account for the fact that adoption isn't based on rational processes. The market is not a meritocracy.
Maybe if Plan 9 had been released as FOSS in the 90s and somebody had made an aggressive push to give it gas in some hacker circles, history would have taken a different path. But that didn't happen, and so we live in a broken future. At least 9front is usable as a daily driver these days, I suppose.
Go is the proper successor to C, and Plan9/9front to Unix.
Most of the POSIX cruft does not exist in Plan9/9front; it has a much simpler API. There's NPE, OK, but only as an emergency measure for porting some monolithic software from Unix, such as Nethack/Slashem or Netsurf, and it's not that big.
I don't understand this part; can someone untangle it? What is the concept of remoteness here?
"Probably the glaring error in Unix was that it underevaluated the concept of remoteness. The open-close-read-write interface should have been encapsulated together as something for remoteness; something that brought a group of interfaces together as a single thing�a remote file system as opposed to a local file system."
My interpretation is that he's lamenting that certain Unix OS abstractions (e.g. open, close, read, and write) do not lend themselves well to building distributed systems, like a distributed filesystem. Plan 9, for example, designed its API with such possibilities in mind.
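To make that concrete, here's a rough sketch in Go of the "encapsulated together" idea. This is not Plan 9's actual 9P protocol, just an illustration with made-up types (File, FS, memFS): once open-close-read-write travel as one interface, client code can't tell a local filesystem from a remote one.

```go
package main

import (
	"fmt"
	"io"
	"os"
	"strings"
)

// File is the open-close-read-write bundle treated as one thing.
type File interface {
	io.ReadWriteCloser
}

// FS hands out Files. An implementation may read a local disk or
// speak a wire protocol to a remote server (as 9P does in Plan 9);
// callers cannot tell the difference.
type FS interface {
	Open(name string) (File, error)
}

// memFS is a toy in-memory stand-in for a local filesystem.
type memFS struct{ files map[string]string }

// memFile adapts a string reader to the File interface.
type memFile struct{ io.Reader }

func (memFile) Write(p []byte) (int, error) { return 0, fmt.Errorf("read-only") }
func (memFile) Close() error                { return nil }

func (m memFS) Open(name string) (File, error) {
	data, ok := m.files[name]
	if !ok {
		return nil, fmt.Errorf("%s: file not found", name)
	}
	return memFile{strings.NewReader(data)}, nil
}

// cat is written against FS/File only, so it would work unchanged
// over a remote implementation.
func cat(fs FS, name string) error {
	f, err := fs.Open(name)
	if err != nil {
		return err
	}
	defer f.Close()
	_, err = io.Copy(os.Stdout, f)
	return err
}

func main() {
	fs := memFS{files: map[string]string{"/env/user": "glenda\n"}}
	if err := cat(fs, "/env/user"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```

A remote implementation of FS would hide the network inside Open, Read, and Write; cat would not change at all.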
The thing is that Unix grew up without a network. Disk was slow, though, so perhaps it could have had async I/O from the beginning, but... everything was slow, so maybe not. Most importantly, when one is building something new, one often builds the easy stuff first, and synchronous metadata operations are definitely easier than async. That was a lot easier to forgive 50 years ago than it is now, because now we know that async matters a great deal, while 50 years ago it was a lot less clear.
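For a feel of why that matters over a network, here's a toy Go sketch (hypothetical 50 ms round trip, stand-in stat function): eight synchronous metadata calls pay eight round trips, while overlapping them pays roughly one.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// stat simulates a single synchronous metadata call over a network
// with a hypothetical 50 ms round-trip latency.
func stat(name string) {
	time.Sleep(50 * time.Millisecond)
}

func main() {
	names := []string{"a", "b", "c", "d", "e", "f", "g", "h"}

	// Synchronous: total time ~ latency * number of calls (~400 ms).
	start := time.Now()
	for _, n := range names {
		stat(n)
	}
	fmt.Println("sync: ", time.Since(start))

	// Overlapped: the calls run concurrently, so the total is
	// roughly one round trip (~50 ms).
	start = time.Now()
	var wg sync.WaitGroup
	for _, n := range names {
		wg.Add(1)
		go func(n string) {
			defer wg.Done()
			stat(n)
		}(n)
	}
	wg.Wait()
	fmt.Println("async:", time.Since(start))
}
```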