Unix: A History and a Memoir, by Brian Kernighan (princeton.edu)
502 points by f2f 25 days ago | 139 comments



I've always felt we don't appreciate history very much in our industry.

I've been lucky enough to work and hang out with some of the co-founders of very impactful projects, such as OpenStack and Cloud Foundry, and there are so many stories I've heard that I'm sure would be insightful and valuable lessons for whoever is embarking on new ideas. And yet we all move so fast that there is no time to stop and write them down.

I'm glad BK did. UNIX is foundational to essentially all software-driven technology today, in one way or another. The book he wrote with Dennis Ritchie on the C programming language made a huge impact on me as a CompSci student in the 80s, as did UNIX itself (Ultrix and DG/OS were my first UNIX variants).

I look forward to reading his book.


There's also the unavoidability of narratives, and how they influence what people look up to begin with. For example, there's a Unix history narrative which begins at Bell Labs, goes to Berkeley, and then out to the world. This is already extremely limited, in that it ignores Wollongong (where the first Unix port was done, to the Interdata 7/32, and where important work on TCP/IP networking was done), what AT&T did with Unix after they closed the sources, and what the Research Unix people were up to after Seventh Edition, but I think the biggest loss is that it completely sells Multics short: Unix began when Bell Labs left the Multics project, so Multics, in this narrative, is frozen in time as this unfinished thing that Our Heroes are already bailing out of, and that's what gets handed down, as if Multics never progressed an inch beyond 1969. Heck, you can even see this as Myth #1 on the multicians.org site:

https://multicians.org/myths.html

> 1. Myth: Multics failed in 1969. Bell Labs quit, Multics survived.

Now that we can use Multics about as easily as we can use Ancient Unix versions under emulation, you can spin up a perfectly functional 1980s-era Multics and see that, no, really, Multics evolved into something you can do stuff on.

That's the problem with narratives: They're both inevitable and inevitably limiting, narrowing the focus to what makes a comprehensible story as opposed to a day-by-day list of what happened. Humans create narratives as naturally, and as unavoidably, as breathing, but we have to be aware of what they do to our comprehension of history.


Especially annoying about that narrative, for those who care about computing history, is how C and UNIX are sold as the first of their kind, invented in a magic moment, hand-waving away what everyone else was doing.

Since history belongs to the winners, if it weren't for the accessibility of old conference papers and computer manuals, that would indeed be the only version we had to believe.


I always thought one of the most interesting insights into this was The UNIX-HATERS Handbook (https://web.mit.edu/~simsong/www/ugh.pdf), especially Dennis Ritchie's anti-foreword:

> The systems you remember so fondly (TOPS-20, ITS, Multics, Lisp Machine, Cedar/Mesa, the Dorado) are not just out to pasture, they are fertilizing it from below.

> Your judgments are not keen, they are intoxicated by metaphor. In the Preface you suffer first from heat, lice, and malnourishment, then become prisoners in a Gulag. In Chapter 1 you are in turn infected by a virus, racked by drug addiction, and addled by puffiness of the genome.

> Yet your prison without coherent design continues to imprison you. How can this be, if it has no strong places? The rational prisoner exploits the weak places, creates order from chaos: instead, collectives like the FSF vindicate their jailers by building cells almost compatible with the existing ones, albeit with more features. The journalist with three undergraduate degrees from MIT, the researcher at Microsoft, and the senior scientist at Apple might volunteer a few words about the regulations of the prisons to which they have been transferred.


FYI, if anyone visits Sydney and is into novel experiences due to an interest in Unix history, you can request a window seat on the right side of the plane; slightly after takeoff you can typically see Wollongong[0] down the distant coast (past the Royal National Park[1]).

[0] https://en.wikipedia.org/wiki/Wollongong [1] https://en.wikipedia.org/wiki/Royal_National_Park


Speaking of Multics, I am reading "IBM's 360 and Early 370 Systems". In passing, the narrative mentions an early public rejection of S/360 by MIT and Bell Labs because it didn't have the hardware-aided dynamic address translation used in time sharing. Instead, they went with a GE 600 series machine, which led to the development of Multics.


For what it's worth, this is one of the topics Kernighan's book covers. He goes over the breakup in pretty good detail and does talk about the further evolution of Multics.


The Computer History Museum in San Jose is a great place. I spent an entire day there.

They have an excellent YouTube channel with thousands of hours of interviews with important people in the industry. They posted an interview with Brian Kernighan last week.

https://www.youtube.com/user/ComputerHistory/videos

https://www.youtube.com/watch?v=bTWv-l0JhAc


Slight correction: The Computer History Museum is located in Mountain View.


Thanks! I am visiting San Francisco soon and will definitely go check the Computer History Museum.


These interviews are so good.


Absolutely agreed. It's a shame we talk past each other, when so many answers were already discovered and documented in the past.

I try and keep the following in mind (even when my ego fights it).

(A) I am very unlikely to be the smartest person who's ever thought about a problem.
(B) Even if their technology differed, some parts of their approach are useful.
(C) Clues and references pop up in the oddest places, so ears open and notepad ready.
(D) Accept references graciously, in the spirit of a gift.
(E) Name dropping historical references without providing actual context is always in poor taste.


[flagged]



I think the person in question is stating that the person they're responding to _shouldn't have to remind themselves of those things_.


I've always felt we don't appreciate history very much in our industry.

Why do you feel this way? ACM has had a History of Programming Languages conference since the late 70s, where the well-known History of Lisp paper was presented. All sorts of popular accounts are, well, popular. The Soul of a New Machine was published in '81, Hackers in '84, Accidental Empires in '92, just as a few examples off the top of my head.


I love the ACM and everyone trying to make the effort to preserve our history.

I see it from the perspective of someone working at a fast-moving, Silicon Valley-based software vendor. In that context, I don't see a lot of appreciation for history on a day-to-day basis.

But it's not because people intrinsically don't care about it. It's more about our daily lives. We not only have to do what we all have to do, but we also have to keep up with the latest if we want to survive here. That line of thinking constantly pushes us forward, and doesn't reward looking at the past.


I suspect you might be extrapolating too far from the fact that you and your colleagues are, perhaps, exceptionally busy people.

If anything, the great speed of the industry's development has made it particularly concerned with its own history - things become history quickly, the principals are typically still around, etc. The ACM HOPL conference I mentioned started when high level languages had a total history of barely over 20 years. When McCarthy (who was surely also a very busy person) wrote his History of Lisp paper, Lisp was younger than Java is today. And as I said in a sibling comment, the interest extends far beyond industry participants - there are many popular accounts aimed at general audiences and many new ones are written at a seemingly increasing rate. There's a Computer History Museum which prides itself on its working exhibits right here in Silicon Valley. Industry history-related articles are hugely popular on HN - there's a couple of them on the front page right now.


And best of all, reselling old stuff as if gunpowder had just been discovered.


I believe Natales is speaking more to the average everydev's state than to the existence of documented history.

If history exists, but nobody reads it, then are we not still lost?


I mentioned a bunch of popular books, not academic works. One of them had a multi-episode PBS TV show over 20 years ago! Since then, popular accounts of various famous and obscure aspects of industry history, biographies of key people, etc. have become even more common - I just picked some somewhat older ones.


Popular with the folks around the office?

I don't work in SV, but I like to think I work in a fairly high-skill coding environment. And I regularly have to elaborate on historical references or suggestions.


I'm at a bit of a loss as to what to tell you. I feel like we have just about overwhelming evidence that the claim that the computer industry is uninterested in its own history is inaccurate. Beyond the tiny smattering of it I've mentioned, that evidence pokes us in the eye every day from the front page of HN. The person who brought this up is apparently too busy to evaluate any evidence, and you are telling me the good people at your office know less about computer industry history than you do - a thing that might be just as true, and just as nondispositive, of anything ranging from Pokémon to Ming dynasty porcelain.


We're discussing apples and oranges.

You're saying computer science history objectively exists.

I'm lamenting a lack of knowledge of it.


No, I'm saying the supporting evidence for such lamentations is much weaker than the evidence for the opposite view.


Ah. In that case, the existence of books (even popular ones) and ACM conferences isn't strong support that most devs are well-versed in computing history, to me.

In the same way that the medical field has similar historical output, but the number of everyday doctors versed in medical history is quite low.

I'll dig around today and see if I can find representative polls related to computer history. Surely SO has asked a question at some point...


Your Pokemon remark seems to be in the right direction. I think disagreements here are perpetually going to revolve around what one's individual experiences have been, what other experiences one reads or hears most about, and how much one lets all that impact what one believes about the "average everydev". At least until the CS departments worldwide take a break from the applied math, walk over to the social science departments, and ask how to start conducting scientific studies about subjective things so that we can collect some stronger evidence one way or another.

In the meantime, how is the (Bayesian) evidence from one's experience to be properly measured next to the existence of books, papers, journals, conferences, and videos, some with easily accessible and fairly accurate sales or view numbers to prove a level of popularity? In my office I doubt more than 20 people, perhaps no more than 10, out of the thousand or so spread across the floors (which includes many non-devs) could name over 200 Pokemon. This doesn't negate the existence of millions of people who can do that and more elsewhere, or the millions of dollars in that market. But I think it is at least suggestive about the type of people my company, and companies like mine, tend to hire. Is it really a stretch to extrapolate and make a bet that "perhaps only 2% of salaried devs can name over 200 Pokemon"? How about something just as trivial, like the names of all 50 US states?

Pokemon is Pokemon; it doesn't matter much for our field. But where the widespread self-hating of one's profession comes in is that, depending on one's experience, the bets might not seem that different for things more programmers (or other roles, like managers) probably ought to know about, their field's own short history being just one of them. I recently quipped to someone that I doubt more than 5% of developers are even familiar with the SOLID principles, thus it's not very fruitful to ask about them in an interview unless knowing about them is a hard requirement. Even fewer have a good grasp of what each principle means, and even fewer know the trivia that the L is named after a woman (with even fewer still knowing the less trivial nature of what else she contributed to our industry). Maybe I'm over-extrapolating, or a bit pessimistic; maybe I'm letting my enterprise day job and my conversations with others at other places (both enterprise and not) jade me in my expectations for what I think the "average everydev" is like.

My underlying theory is that power laws are everywhere, which means if the distributions on facets like "knows Pokemon", "knows about SOLID", "frequents Hacker News", "knows important bits of computing's history", "writes/buys/reads technical books", et al. both trivial and non-trivial are power laws then the average level would be terribly low compared to the narrow peak on one side of the distribution. The mode, which is probably what influences a lot of personal-experience feelings about the average, is lower still. If your experience is mostly around the peak it may seem otherwise.

Who's to say how much it matters anyway; it's often a pointless discussion/complaint to me. Normal laws are also everywhere, and we know that at least IQ is normal. "Smart but ignorant" is acceptable for many things, and besides, ignorance need not be a permanent condition. One doesn't need to know that Lisp machines in 1990 could do full 3D rigging, editing, and effects (https://www.youtube.com/watch?v=6VmJVNYfxDc) in order to be effective for their Java shop employer.



There are a lot of good computer history books, but a book about the early days of Unix at Bell Labs has been missing.

Glad it has finally arrived!


There was a UNIX oral history project at Princeton: https://www.princeton.edu/~hos/Mahoney/unixhistory


> I've always felt we don't appreciate history very much in our industry.

FWIW Chris Harrison (https://www.researchgate.net/profile/Chris_Harrison2) taught a compulsory history of computing class at UMIST in the UK when I was there in 1999-2003. This seems pretty unusual, especially now that I live in the US and understand more about how universities work here.


> I've always felt we don't appreciate history very much in our industry.

I agree. At the same time, unless you immerse yourself in oral tradition and try to assemble a picture of the diversity of what was going on from it, it's hard to get a sense of what has happened on a scale larger than a particular community, and sometimes not even then.

I occasionally think about trying to write a one volume history of computing.

> UNIX is foundational to essentially all software-driven technology today, in one way or another.

Except Windows, which comes from VMS, which predates Unix. And SQL, which comes from IBM mainframe land. I could probably come up with a few more, but it's late and I'm tired.


Largely agreed, but VMS by no means predates Unix.

Unix was in development in 1969, had a manual released inside AT&T in 1971 and was announced publicly in 1973. The first source license was sold in 1975.

VMS was released in 1977 as VAX/VMS on the VAX series of computers. Before VMS the DEC hardware ran various other operating systems such as RSX-11, TOPS-10, and optionally AT&T Unix (which was developed on the PDP series, first the PDP-7).


> Unix (...) first source license was sold in 1975.

> VMS was released in 1977 as VAX/VMS on the VAX series of computers. Before VMS the DEC hardware ran various other operating systems such as RTX-11, TOPS-10, and optionally AT&T Unix

Before VMS there was RSX-11 (note RSX not RTX):

https://en.wikipedia.org/wiki/RSX-11

"From 1971[5] to 1976 the RSX-11M project was spearheaded by noted operating system designer Dave Cutler, then at his first project.[5] Principles first tried in RSX-11M appear also in later designs led by Cutler, DEC's VMS and Microsoft's Windows NT.[6][7][8]"

In Dave's words:

https://tech-insider.org/windows/research/1992/11.html

"RSX-11M was introduced in 1973, 18 months after we started building"

Note Dave Cutler and note the years. I'd say the history is comparable.


Thanks for pointing out the typo. I'll fix that.

Note "heavily influenced" doesn't make RSX-11 any more VMS than it makes Multics into Unix, or CP/M into MS-DOS, or the NeXT into a Mac, or an Alto into a Mac or Windows.

Also note that 1973 doesn't significantly predate 1973. Don't confuse the first source license sale with the announcement of availability, which I also noted. And Unix was in use internally at AT&T the same year DEC started on RSX-11.


Dave Cutler was the developer of RSX-11, VMS, and Windows NT.

I don't claim that his entire life's work predates the work of somebody else, just that the products he delivered are definitely comparable in their commercial availability with Unix.

I would also not be surprised if his products had more users in those first years than Unix did. To do the history right, one should not project the results visible today (or which happened much later) back onto the history.


The claim I was correcting was that VMS itself predated Unix itself. This claim is false on its face. Yes, multiple people in the history of the field have worked on more than one project.


Also from IBM: hypervisors/VMs (ancestors of today's containers) and SGML (the foundation of HTML and XML).


I guess you could make the case that Windows is written in C++, which is a superset of C, which was designed while writing UNIX. Sure, the OS is different, but it ended up being built from the same blocks. If we didn't have C, would we have Windows as it is today?


Windows is not written in C++, even if Microsoft did embrace C++ quite a bit with the infamous MFC and so on. Windows was originally written using C and assembly language, using the Pascal calling convention for efficiency. It was originally a layer on top of DOS, which was written in assembly language. Being a "layer" or not was the subject of a very complex federal antitrust case back in the 90s. You can read "Undocumented DOS" and "Undocumented Windows" for some weird kind of technical investigation thriller, which I found weirdly fascinating back in the day.


Since Windows Vista, Windows has been transitioning to C++, hence why they don't care much about ISO C beyond what is required for ISO C++ standard compliance.

https://herbsutter.com/2012/05/03/reader-qa-what-about-vc-an...

https://www.reddit.com/r/cpp/comments/4oruo1/windows_10_code...


Looking back at what Xerox PARC and ETHZ were doing with their workstations, maybe that wouldn't have been much of an issue.


Yes, the high-level language on the first Macintoshes was Pascal. The lowest level was assembly.

Pascal was not bad in that role, and it would have been possible to develop in Pascal all the code that was eventually written in C. At least in the dialects that weren't made just for education.


Apple's version of Pascal had a lot of non-standard extensions. If you squinted, it was essentially C.


So any system programming language with extensions for hardware access was essentially C? Even those 10 years older than C?


Pascal used var parameters rather than pointers into the stack and supported nested subroutines with lexical scope, limiting them to being passed as downward funargs to ensure safety, and it barely missed requiring tag checks on variant record access (an omission Wirth bitterly lamented). Pascal had pointers, but they were strictly for heap-allocated objects, not var parameters or array iteration. In standard Pascal, array size was part of the array type, so you couldn't write functions that operated on arrays of unknown size, because bounds checking of array accesses would have otherwise required passing an additional length word. That's why the string type in TeX is an index into a humongous string pool array.

I think this illustrates how the philosophy of Pascal differed from that of C.


These are indeed the places where Apple extended their version of Pascal. C-style calling convention support, pointer arithmetic, much better I/O, and a host of things I can't remember. Suffice to say that when I started working at Apple (towards the end of the Pascal era) I was pretty comfortable, even though I'd been doing mostly C for a long time. This was definitely not the language we had to use in college.

(No tagged unions, though. Systems guys are allergic to 'em, apparently).

Apple also added object-oriented features, but I don't count those as addressing anything broken or awkward in the base language.


Well, in the base language you couldn't store function pointers in variables or record fields, since they might refer to nested subroutines that included a context pointer to their parent subroutine stack frame, and invoking them after it had returned would be catastrophic. This “downward funargs restriction” made OO-style programming in standard Pascal very difficult in a way that it isn't in standard C. So I think the OO features, too, aimed at a weakness of the base language—but I admit I never looked at Apple's version.


Apple's Pascal had function pointers, and I'm pretty sure you could still do callbacks to nested procedures with a little trickery.

Ahh, I was right (PDF file here): http://mirror.informatimago.com/next/developer.apple.com/tec...

These are not closures, they're still strictly nested and on the native stack, so the usual crashes happen if you execute a callback to something that has gone out of scope.

Remind me to tell you about "Fork Queues", a solution we had on the Newton for dealing with multiple entities that all wanted to have the main event loop (of which you could really only ever have one, but we cheated).


Oooh, this sounds fantastic!


I don't think there's anything in the nature of C per se that's required for Windows. Any reasonably low-level compiled language would have done the same job; Pascal is potentially the nearest substitute. Windows had to construct its own executable and library formats anyway.


Lisp comes to mind.


I know a bunch of people in the long-term UNIX community and I've always been impressed that most of them have kept 20+ years of email. This has been helpful in figuring out various historical details.


> I've always felt we don't appreciate history very much in our industry.

That is British-grade understatement right there. :)


Do you mean DG/UX? That was an awesome OS.


Indeed, that is what we used on campus.


Yes! sorry for the memory lapse.


In our industry's defense, our industry is relatively young. In other words, the industry is too busy making history to appreciate it. I'm sure as time passes, we'll start to appreciate the history of the industry more and more.


> In our industry's defense, our industry is relatively young

If we take ENIAC [0] as the first computer (1945), then our industry is 74 years old. Let’s compare this with another tech industry – the aviation industry. The Wright brothers flew in 1903 [1]. 1903 + 74 = 1977.

I was a kid in the 1970s, fascinated by aerospace and aviation. There was a massive amount of aviation history around (I still have some of the aviation history books I bought then). So I think there has been plenty of time for the history of computing to appear. Even if we added another 15 years to shift the start of the history of computing (software engineering) to the development of COBOL [2] (approximately the same time as FORTRAN or LISP), there has still been a lot of time.

So I don’t think it’s industry youth / age alone.

[0] https://en.wikipedia.org/wiki/ENIAC

[1] https://en.wikipedia.org/wiki/Wright_brothers

[2] https://en.wikipedia.org/wiki/COBOL


We like to think that, but we do a lot of rediscovery of techniques and approaches that were invented in the 60s/70s.


When it comes to Unix, history and the present are one and the same. We have been locked into the same model, the same OS, for decades now. Unix is being developed through small, incremental changes on top of a base that is massively out of date in 2019.

We appreciate history far too much. Unix is holding computing back, and it needs to go. It served its purpose, but it's done. It represents very few of the things that are important to computers in 2019, and it actively hinders many of them.

Unix needs to be respectfully allowed to retire, and new things finally given a chance to replace it.


Let's have this new, wonderful thing or set of things up and running before we retire what we've got, eh?


We can't, because nobody will even consider giving anything new a chance if it is not Unix.


So you propose what, that the industry as a whole voluntarily re-enters the dark ages by throwing away its current crop of systems with no replacement in sight?!

It looks to me like our current crop of Unix derivatives and Unix-likes is catering for a lot of needs very well. And we have Windows, which is non-Unix.

I think the onus is on you to show that something else could be better, and better enough that the pain of relearning would be worth it.


We are in a dark age at the moment, is what I am saying, and I think we should be working at getting out of it.


That's interesting, because I also feel people don't appreciate computing history appropriately.

If you'd care to learn some real history, I suggest you read The UNIX-HATERS Handbook: http://web.mit.edu/%7Esimsong/www/ugh.pdf

With this book, you'll learn that UNIX and C are nothing admirable and have actually been responsible for successfully destroying much better systems and languages in the popular eye; languages including Lisp, APL, Simula, ALGOL, Smalltalk, and Forth all existed before C; systems such as ITS and Multics addressed concerns UNIX users still suffer under today.

Make no mistake, for all of RMS' admirable qualities, he's basically responsible for UNIX proliferating by copying it for GNU. You also shouldn't look at your modern BSD or GNU system and think this is what UNIX users used decades ago, because for all of their faults, GNU and BSD actually try to produce programs which work correctly and GNU goes much farther than several of the BSDs in this respect. The UNIX attitude is one of getting half the job done and leaving it at that.

In closing, UNIX has no philosophy. The UNIX philosophy is simply brand-named simplicity. The ideas of modularity and simplicity predate automatic computing and recorded history, and yet people will claim you're following UNIX if you write a program which adheres to these basic ideals. Further, those other qualities of this philosophy result in programs that aren't modular, simple, nor beautiful.


The Handbook is a fine document and any computing history fan worth their salt should read it, but it's hardly the best analysis of Unix in a historical perspective.

Blaming (or crediting, however you want to think about it) GNU for proliferation of Unix is anachronistic. RMS has repeatedly said he doesn't care for the design of Unix, but he chose it for the ease of implementation. GNU wasn't even bootable as a stand-alone OS before 1990s and certainly not production ready until Linux was. Using GNU utilities in proprietary Unices was popular, at least since the 90s, but I never heard anyone consider them a "killer app" for Unix.


I used to have a couple of PC World articles about Windows winning the workstation market, until GNU/Linux started to be mature enough to allow easy porting of commercial UNIX stuff to it.

Had Microsoft and IBM been serious about their POSIX compatibility subsystems, instead of treating them as a bullet point to win government contracts, history would have played out much differently.


It's certainly an interesting critique, but it also massively oversells itself and is totally lacking in self-awareness if you take it seriously.

Computing is absolutely full of widely deployed working technologies that enable people to get work done all day but that have rough edges. For every one of these, there is somebody saying that you should use <pet technology> instead. Usually with missionary levels of zeal. And yet at no point do they seriously address why people might have good reasons for adopting the allegedly inferior solution.


> And yet at no point do they seriously address why people might have good reasons for adopting the allegedly inferior solution.

With an emphasis on "allegedly" sometimes. For example, C is hugely superior to Python in terms of machine efficiency. Does C support faster development cycles? That's right, you can write code that's practically as fast as tightly-optimized machine code! Does C prevent potentially catastrophic errors? That's right, you can write code that's practically as fast as tightly-optimized machine code! A lot of the True Zealots aren't quite as monomaniacal on a single narrow point, but the lack of ability to see a total solution is diagnostic.

So was ITS better than Unix? Not if you prioritize usability, support for application software, or ability to run on more than a single family of very expensive mainframe computers the world had begun to abandon by the time Unix hit its big growth phase. You can say similar things about LispMs, although they were more usable.


The UNIX-HATERS Handbook talks about some weird stuff that naturally fell out of use.

RMS worked on EmacsLisp.


Everyone interested in the history of computing should read The Dream Machine by M. Mitchell Waldrop. The book pretends to be a biography of a little-known but highly influential guy named Licklider, but is in fact maybe the best general history of computing. It covers Turing, von Neumann, ARPA, Multics, DARPA (the internet), and Xerox PARC. Alan Kay recommends it as the best history of PARC.


I'm always amazed how well Brian Kernighan can explain things. I love the episodes on the Computerphile youtube channel with him.

Recently I discovered the AT&T history channel, with this gem: https://www.youtube.com/watch?v=tc4ROCJYbm0

There is a massive difference in appearance and clarity between him and the other people appearing in that video, even the "presenter"...


> Recently I discovered the AT&T history channel, with this gem: https://www.youtube.com/watch?v=tc4ROCJYbm0

I go back to this video every once in a while ever since I discovered it a few years ago. I just think it is super relaxing. When I first watched it, I was beginning to use Linux and, when I opened my terminal emulator, I was like: "It's a Unix system, I know this!" The pipelines explanation was incredibly clear.

I also love the Computerphile episodes with professor Kernighan.


Regarding availability of an ebook version, I just wrote to bwk and got back a reply within a few minutes:

-----Original Message-----
From: Brian Kernighan <bwk@cs.princeton.edu>
To: Mike Russo <mike@papersolve.com>
Subject: Re: please publish ebook of Unix memoir!!
Date: Mon, 28 Oct 2019 11:12:48 -0400

Mike --

I have just uploaded a Kindle version, but it has to go through Amazon's approval, which could take a day. I also can't see a preview on a physical device, so I have no idea whether it will actually look right. If it doesn't, I'll have to pull it and figure out an alternative.

Brian K

On Mon, 28 Oct 2019, Mike Russo wrote:

if you get an ebook of it out there it will sell even more!! and thanks for writing it!

--

Michael Russo, Systems Engineer
PaperSolve, Inc.
268 Watchogue Road
Staten Island, NY 10314

Your random quote for today:
..you could spend all day customizing the title bar. Believe me. I speak from experience. -- Matt Welsh


It's now available. Big thanks to BWK for doing this.

Australia: A$11.99 [my local site] https://www.amazon.com.au/dp/B07ZQHX3R1/

or US$8.20 https://www.amazon.com/dp/B07ZQHX3R1/

vs printed book US$18 https://www.amazon.com/dp/1695978552


I look forward to reading this

I own and have read the following books of his, and they were all superb!

   The Go Programming Language
   The Practice of Programming
   The C Programming Language
   The AWK Programming Language


also 'the Unix programming environment'


And The Elements of Programming Style.


Just out of curiosity, why awk? I've only ever used it for simple text splitting and didn't really know people did more with it. Is it a tool worth learning?


To really appreciate the need for awk, imagine writing one-liners and scripts in the late 80s, when Perl and Python weren't around. The associative arrays in awk were a game changer. Of course, today there is no need to use awk for multi-line, complex scripts because Python or Perl does the job better (and both languages are more scalable). However, awk is still quite useful for one-liners. But for those developers who never use the one-liner paradigm of pipelines on the command line, this is something they don't realize they're missing.

Brian Kernighan mentions in the book that awk provides "the most bang for the programming buck of any language--one can learn much of it in 5 or 10 minutes, and typical programs are only a few lines long" [p. 116, UNIX: A History and Memoir]. Also keep in mind Larry Wall's (inventor of Perl) famous quote/signature line: "I still say awk '{print $1}' a lot."
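To make that concrete, here are a couple of one-liners in the style the quotes describe (a rough sketch; the file name access.log is just a hypothetical stand-in for whatever you happen to be slicing up):

    # The one-liner from the Larry Wall quote above: print the first
    # whitespace-separated field of every input line.
    awk '{ print $1 }' access.log

    # The associative-array idiom mentioned above: tally how many times each
    # value of the first field appears, then print the totals at the end.
    awk '{ hits[$1]++ } END { for (h in hits) print h, hits[h] }' access.log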

More background on awk from Brian Kernighan in a 2015 talk on language design: https://youtu.be/Sg4U4r_AgJU?t=19m45s


Personally, I don't think that it's worth learning if you don't already know it, but the awk book is really an enjoyable read, not particularly long, and will show what you can do with the language so you can make a more informed decision whether to learn it.

In a way, awk, like sed, cut(1), paste(1), and comm(1), is closer to the original Unix design philosophy of pipeline composition of simple, single-function tools. cut, paste, and comm are simple and single-function. sed is programmable, and awk even more so, but they're still optimized for being used in pipelines. Perl still has lots of options designed for pipeline usage (-p, -n, -a, -0, -F), but was always capable of doing general purpose programming, and Python is primarily general purpose programming.
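As a rough sketch of that pipeline style (the file name and field layout here are hypothetical, purely for illustration):

    # Pull the second colon-separated field out of a file, count how often
    # each value appears, and show the ten most common.
    cut -d: -f2 records.txt | sort | uniq -c | sort -rn | head -10

    # Roughly the same job done with awk's associative arrays instead of
    # cut/sort/uniq, still written to sit comfortably in a pipeline.
    awk -F: '{ n[$2]++ } END { for (k in n) print n[k], k }' records.txt | sort -rn | head -10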

Personally, I've never learned sed, cut, paste, or comm; the only traditional text processing tools I use are sort & uniq. I know awk, but the one time I wanted to use it in the last 5 years in an official script, I was shouted down by confused youngsters. I use perl for one liners, and python for any nontrivial scripting.


It's a tool. If you ever need to slice and dice text, AWK is great. Sure, perl is great too, but I can never remember perl and stopped writing perl. I rarely use AWK by itself, it's usually with a mixture of other unix tools.

I once wrote code with curl, awk, sed, jq, and xargs, and in less than 30 lines implemented something that would have taken about 200 lines of Go code. I did it in less than an hour, too, when I was told it would take a day. It was urgent and I needed the result fast.

There are different kinds of hammers, and different kinds of nails and screws. Know your tools, and use them when it's called for. You can use a wrench as a hammer, but that doesn't make it a great one.


Awk is great for data pipelines, not just oneliners.

For a long while I maintained a program that took the compiler output (intermediate form) from one toolchain, and converted it to another (intermediate form).

Why awk? It was faster, and more readable, than any of the alternatives, whilst being much much shorter.


Besides its obvious utility and generality (many tasks in computer science can be effectively solved by "simple text splitting" when the data is a few GB, and your data is often a few GB), it is worth learning awk just because the awk book is a masterpiece.


It's a question you have to answer yourself. I struggle to see how awk would fit into my workflow, personally, and it's not feasible to learn every cool tool. And you'd have to use it somewhat frequently or you start forgetting the syntax, unless you make Anki cards.


awk is interesting because it was an early programming language (not just text processing) which made a number of convenient data structures available to people who could not program C.



https://disqus.com/home/discussion/tomwoods/ep_1186_lightnin...

Youtube-dl and awk to grep out of a YouTube channel.

If I were doing it again I would use youtube-dl -o to just print the base64, instead of "manually" renaming it to that with ls>awk:mv>sh.


I read this one on Saturday (bought it from Amazon after I saw it posted here earlier in the week). It's very good at detailing how UNIX was developed in the early days and how it exploded after 1979 with V7 - and in a way that isn't difficult to read whatsoever. There are some sections about the inner workings of UNIX I already knew - but skimming through those allowed me to catch a few historical gems I didn't know.


The ‘Customers who bought this item also bought’ on Amazon is pretty telling: the Snowden book, a Yubikey, an electronics testing tool, coolers for Raspberry Pi, retractable Ethernet cable, wire-type soldering iron tip cleaner (what even is that), and sandalwood shaving cream.


What I saw on the first two pages of Customers who bought.... list was:

Database Internals, Snowden book, Algorithms book, BPF Performance tools, Your Linux Toolbox, The Go Programming Language, The Pragmatic Programmer, Quantum Computing, A Programmer's Introduction to Mathematics, An Elegant Puzzle

I had to go up to page 14 of the items list to find all of the items you listed and to find the wire-type soldering iron tip cleaner.

Edit: Took out unneeded snark. It seems I fell prey to Amazon algorithms.


> Isn't this a bit of a cherry picked list?

I only see these items plus another cooler and a Macbook stand.


We've been analyzed and found wanting. Dang marketing algorithms.


> wire-type soldering iron tip cleaner (what even is that),

It's basically a fancy version of those metal pot scourers for cleaning the burnt bits off, but without damaging the metal plating on the tip.


Ah. Alas the only thing that I needed to clean off the iron is solder that doesn't ever want to leave the tip and go on the contacts.


The trick to soldering is that you're not trying to put solder on things from the iron. You heat up the joint with the iron, and then feed solder into the joint.

You put a bit of solder on the tip, but that's just to help apply heat to the joint.


Yeah, I've once watched a short clip of someone properly applying solder and realized I've been doing it wrong the whole time. I keep meaning to watch some educational material on the matter, but for now I've conceded that it's just some kind of higher magic.


Solder flows like water, it "wants" to stick to absorbent material via capillary action. The only difference is the material needs to be clean and free from oxidization so it has a direct metal contact (that's why solder has a flux core), and it needs to be hot. If you take care of that, the solder will pretty much do what you want it to.


The thing you want solder ON has to be hot, also.


Anyone able to comment on his recent non-programming-language books? For example, how's "Understanding The Digital World"?

http://kernighan.com/udw.html


Was just about to comment that in lieu of buying an ebook version of the currently discussed Unix book, I discovered he had written "Understanding the Digital World" and bought it on impulse. Currently going through the first chapter now. Like all of his books, extremely well-written. I was surprised at how much the introduction focused on data sharing and the Snowden-revelations, which he segues into after talking about how he and his wife weren't using Airbnb, Uber, and Whatsapp, on a recent vacation trip.


They are great. I gifted them to my non-tech friends and they have thoroughly enjoyed them and learned something.


I bought the book, and read it, enjoying it greatly.

Amusingly, Alexa notified me when it arrived, and the notification was "Your purchase UNIX has arrived."


However, you had also ordered a book about castrati singers, so the announcement was ambiguous.


Is there a kindle/digital/ebook version of this book? I can't seem to find it.


This is very frustrating. From what I understand of the publishing industry, the Kindle sales are not counted towards the main NYT best-seller list, so all the focus on the initial launch is around selling the physical copy. This is said to be done to prevent gamification and manipulation, but this still goes on with the hard copies also, where would-be best-selling authors often pay third-party companies to buy a lot of copies of their books from certain bookstores.

Source: A friend of mine who has been trying to get onto the NYT best seller list.


Ebook sales are counted as part of the NYT bestsellers list. In 2010, they initially did separate ebook listings, but it’s all combined into the general categories: https://www.vox.com/culture/2017/9/13/16257084/bestseller-li...

I can’t remember a single mainstream title that hasn’t had an ebook version for preorder and first day sales. This includes “Bad Blood”, “Super Pumped”, and “Catch and Kill”.


Not downplaying your valid point, but I doubt a guy like bwk cares about the NYT best-seller list at all. It's the history of Unix, which to us folks is basically the Old Testament; props to him for writing it. I look forward to having a physical copy of it and praying to it occasionally.


I don’t see one either, which puzzles me. I only read on paper if there is no other alternative. This seems like exactly the kind of book I would hope to read in digital form, but I'm not finding it.


The irony of using Kindle Direct publishing but not putting out a Kindle version is palpable :-)

I have stopped buying paper books over the past several years as they simply accumulate moisture and mould in the climate I live in and make rooms smell musty. It is incomprehensible to me why primarily textual books would be published without an ebook version these days.


Oh it's vanity press? That explains the...cover. You'd think Kernighan would have no shortage of technical presses willing to put something out on whatever terms he dictated.


The joy of Kindle publishing is that you keep all the money ;-)

This is already #243 out of all Amazon, so I'd say he was correct in this ...


Nice! But what's also holding me back is the lack of a digital version; otherwise I'd already have bought it.

Edit: he replied to me to say he's uploaded a version today that has to be approved.


It's better to call it self-publishing, not vanity press (a gatekeeping term).


I haven't received my copy just yet, but I am keen to get reading it. Although there is some fun conversation to be had with people from a non-tech background when you're excited for your copy of Eunuchs: A History and a Memoir to arrive.


Watching Brian Kernighan and Professor Brailsford on the YT Computerfile channel is a pure pleasure; it's great to hear and see how the world looks from their perspective, and quite often the same principles can still be applied today. We believe so much has changed in this industry, but some foundations are the same as they were 50 years ago :)


Correction: isn't it Computerphile?


I've seen a lot of Brian Kernighan on the YouTube Computerphile channel and he's very well spoken and interesting to listen to. Might have to pick this up.


Looking forward to reading this! Had the pleasure of seeing Brian Kernighan and Ken Thompson speak at VCF East earlier this year, totally worth a watch: https://www.youtube.com/watch?v=EY6q5dv_B-o

I highly recommend hitting up a Vintage Computer Festival if you've got the opportunity (http://vcfed.org/wp/festivals/). Not only did Kernighan and Thompson speak, but also Joe Decuir of Atari & Amiga fame.


I agree with you that we don't tend to appreciate history or attempt to be aware of how long things have been around. I'm looking forward to reading this too!

On that note, this post reminds me of another great book (eye-opener of sorts) that I read a while ago that takes the reader through the history of some important milestones in hardware and software https://www.amazon.com/Computer-Book-Artificial-Intelligence...

After I read this, I was surprised to learn that concepts and ideas which seem very recent because of the interest and research around them have actually been around for a long time. For instance, I learnt that "Secure Multi-Party Computation" has been around since 1982, Verilog since 1984, and AI medical diagnosis since 1975.


The link says “Published by Kindle direct publishing” but I can’t find the kindle version on amazon. Or indeed any ebook version. Do Kindle books not show up for pre order usually?


CreateSpace (paperbacks) is now part of Kindle, so you can release a paperback with no ebook on Kindle.


There doesn't appear to be a kindle version. There's a link to request one from the publisher on the Amazon page though.


I find a lot of books on subjects I'm interested in are quite dry. Is this a departure from that? I'd like something I'd enjoy, and something I can buy as a gift for some family members who enjoy nonfiction on topics they aren't necessarily versed in.


As others have noticed, there is no Kindle version. Look Inside is enabled so you can start reading in-browser: https://www.amazon.com/gp/reader/1695978552/ref=as_li_ss_tl?...

While the story/writing is interesting, like most self-published titles this book needs some professional layout/editing help. The second and sixth pages of chapter 1 are full page Google Maps of central NJ and Google Satellite view of Bell Labs. Eeek.

This may also be why there is no Kindle version yet. Many pages have full-color images and would need significant changes for a decent Kindle reading experience.


For those interested in the subject I can recommend the very readable "A Quarter Century of Unix" by Peter Salus.


Amazon Best Sellers Rank: #243 in Books

like, that's all books on Amazon US


Purchased right away!


You'd think he would have found someone to design a nicer cover for him.


You have no idea. I received my copy a few days ago. The whole cover is a low-resolution bitmap for both the image and the text.

The book is a great read otherwise!


Seriously, I thought I received a knock-off when I first opened the package.


A utilitarian hack seems apt for the subject material.


Probably not true. But it's fun to imagine that even Brian Kernighan was not able to find a publisher and needed to resort to self-publishing.


I can report that this is not true in this case - Brian wrote his book over the summer and, I think, wanted to get it out for the Unix 50 event at Bell Labs this past week.


Currently #1 in "Best Sellers in Computing Industry History" on amazon, which is somewhat of a weird category :)


It's almost as if the more "best seller" lists there are, the more authors will make efforts to get onto them :)


Any book is a bestseller in some category.


"Old is gold" fits here.


The book is brand-new.



