
Donald Knuth was framed - akalin
https://buttondown.email/hillelwayne/archive/donald-knuth-was-framed/
======
svat
I'm glad to see this here, for two reasons: (1) In general it's nice when
people return to the primary sources rather than second-hand accounts, and (2)
this particular topic is of interest to me; here are a couple of previous
comments that were somewhat well-received on HN:

[https://news.ycombinator.com/item?id=22221592](https://news.ycombinator.com/item?id=22221592)

[https://news.ycombinator.com/item?id=18699718](https://news.ycombinator.com/item?id=18699718)

For further context you can look at past and future issues of Bentley's column
(and its spinoff); a list of them I collected here:
[https://shreevatsa.net/post/programming-pearls/](https://shreevatsa.net/post/programming-pearls/)

I guess it's a long-standing tradition in literary reviews for reviewers to
push their own ideas, rather than confining themselves solely to reviewing the
work in question. That is what happened here. Knuth had written a program that
he had been asked to write, to demonstrate the programming discipline. But
McIlroy, as the inventor of Unix pipes and a representative of the Unix
philosophy (at that time not well-known outside the few Unix strongholds: Bell
Labs, Berkeley, etc), decided to point out (in addition to a good review of
the program itself) the Unix idea that such special-purpose programs shouldn't
be written in the first place; instead one must first accumulate a bunch of
useful programs (such as those provided by Unix), with ways of composing them
(such as Unix pipes). A while later, John Gilbert described this episode this
way:

> _Architecture may be a better metaphor than writing for an endeavor that
> closely mixes art, science, craft, and engineering. “Put up a house on a
> creek near a waterfall,” we say, and look at what each artisan does: The
> artist, Frank Lloyd Wright (or Don Knuth), designs Fallingwater, a building
> beautiful within its setting, comfortable as a dwelling, audacious in
> technique, and masterful in execution. Doug McIlroy, consummate engineer,
> disdains to practice architecture at all on such a pedestrian task; he hauls
> in the pieces of a prefabricated house and has the roof up that afternoon.
> (After all, his firm makes the best prefabs in the business.)_

There are other points (not mentioned in this article), e.g. the fact that
_someone_ had to have written those Unix programs in the first place and
writing them with literate programming can lead to better results, and the
fact that Knuth's idea of using a trie (though not a packed/hash trie; that's
no longer needed) _still_ seems fastest:
[https://codegolf.stackexchange.com/questions/188133/bentleys...](https://codegolf.stackexchange.com/questions/188133/bentleys-coding-challenge-k-most-frequent-words)
(please someone prove me wrong; I'd love to learn!)

Knuth gladly included McIlroy's review verbatim when he reprinted this paper
in his collection _Literate Programming_. BTW here's a 1989 interview of
McIlroy
[https://www.princeton.edu/~hos/mike/transcripts/mcilroy.htm](https://www.princeton.edu/~hos/mike/transcripts/mcilroy.htm)
where he looks back and calls Knuth's WEB “a beautiful idea” and “Really
elegant”, and his review “a little unfair”, though of course he reiterates
_his_ main point.

~~~
svat
Counterpoint[0]:

Knuth really _is_ a fan of writing monolithic (rather than "modular") programs
from scratch, in a way that goes against all the experience of software
engineering accumulated over decades, so that criticism is well-deserved.

For example, his big programs TeX (1982) and METAFONT (1984) are each book-
length and the source code of each is in a single large file amounting to
about 20000+ lines of Pascal code. His programs do not contain much in the way
of standard software-engineering practices like abstraction, modules (hiding
implementation behind an interface), unit tests, libraries, etc. In fact, he
has spoken out against unit tests and code reuse! [1]

> _the idea of immediate compilation and "unit tests" appeals to me only
> rarely, when I’m feeling my way in a totally unknown environment and need
> feedback about what works and what doesn’t. Otherwise, lots of time is
> wasted on activities that I simply never need to perform or even think
> about. Nothing needs to be "mocked up."_ ...

> _With the caveat that there’s no reason anybody should care about the
> opinions of a computer scientist /mathematician like me regarding software
> development, [...] I also must confess to a strong bias against the fashion
> for reusable code. To me, "re-editable code" is much, much better than an
> untouchable black box or toolkit. I could go on and on about this. If you’re
> totally convinced that reusable code is wonderful, I probably won’t be able
> to sway you anyway, but you’ll never convince me that reusable code isn’t
> mostly a menace._

Moreover, his sympathies always lay with the "other" side of the "structured
programming" revolution (he still liberally uses GOTOs, etc -- still coding
like a 1950s/1960s machine code programmer), and in his 1974 paper "Structured
Programming With Go To Statements", he approvingly quotes something that might
horrify many software engineers today:

> In this regard I would like to quote some observations made recently by
> Pierre-Arnoul de Marneffe:

> _In civil engineering design, it is presently a mandatory concept known as
> the "Shanley Design Criterion" to collect several functions into one part .
> . . If you make a cross-section of, for instance, the German V-2, you find
> external skin, structural rods, tank wall, etc. If you cut across the
> Saturn-B moon rocket, you find only an external skin which is at the same
> time a structural component and the tank wall. Rocketry engineers have used
> the "Shanley Principle" thoroughly when they use the fuel pressure inside
> the tank to improve the rigidity of the external skin! . . . People can
> argue that structured programs, even if they work correctly, will look like
> laboratory prototypes where you can discern all the individual components,
> but which are not daily usable. Building "integrated" products is an
> engineering principle as valuable as structuring the design process._

> ... Engineering has two phases, structuring and integration: we ought not to
> forget either one...

(This comment is slightly tongue-in-cheek, but hopefully provocative enough.)

[0]: Hey it's been a couple of hours and there's no reply attacking my
comment, guess I better do it myself. :-)

[1]:
[http://www.informit.com/articles/article.aspx?p=1193856](http://www.informit.com/articles/article.aspx?p=1193856)

~~~
hyperpallium
Modularity incurs a complexity tax. If you are smart enough to keep it all in
your head, like a Motie Engineer, you can simplify by omitting it. But if
you're _not_...

~~~
mcv
That's the thing here. Knuth can easily juggle far more complexity in his head
than the average programmer, and that's fine for software that only he has to
maintain. But when something needs to be maintainable by average programmers,
you need to write for them and avoid that complexity.

~~~
squiggleblaz
Who's writing for the average programmers? It's the average programmers. How
can average programmers write code that is sufficiently abstract that it hides
the complexity they can't handle? How can they change that code when it
doesn't work? Does every company of 5 devs need to have a PhD?

~~~
hyperpallium
> How can average programmers write code that is sufficiently abstract that it
> hides the complexity they can't handle?

Well said.

Hiding complexity is harder than handling it. Designing an effective way to
expose it for use amounts to (part of) a "programming system" [Brooks].

So it's library writers who hide the complexity for average programmers, with
a "programming product": stdlib, open source project or (rarer these days)
commercial "engine".

When there is no such library, we get the current state of our art: many
projects failing.

------
btilly
There are many, many ways to tell this story. However it resonates because it
fits well with an ongoing dynamic. Which is that programmers naturally want to
build perfect tools while the business need tends to be for a quick and dirty
solution that can be assembled out of prebuilt pieces.

Except that this one is extreme. It is hard to find any programmer who builds
a more perfect tool than Knuth. There isn't a better known problem where the
discrepancy between the perfect and the good is this dramatic. Therefore this
became the poster child for this ongoing type of conflict retold by those who
see themselves as on the side of the quick and dirty solution built out of
preassembled pieces. Who, of course, slant the story to further fit their
prejudices.

This kind of slanting is common. Take a famous WW II example where they were
looking to improve bomber durability by putting more armor plate where the
bombers were coming back hit. A wise statistician advised them that those were
the spots that didn't need armor, they should put it on where the bombers were
hit and _didn't_ come back. Which, assuming that bombers were hit evenly
everywhere, was all the places that they didn't find holes.

The reasoning is perfect, and illustrates a key point of statistical wisdom.
But the retelling of the story almost never admits that the advice didn't
really make a difference. The actual solution to the rate at which bombers
were being destroyed was to drop chaff - strips of aluminum cut to half the
length of the German radar - so that the radar systems got overwhelmed and the
Germans couldn't find the bombers.

~~~
Zimahl
Another example people use for the KISS principle is of the 'space pen', where
the US spent thousands/millions developing a pen that works in space whereas
the Soviets used a pencil. However, almost none of the story is true.
Regardless, it gets repeated over and over.

~~~
avip
We'll be living in a pretty dull place if only true stories were allowed.

~~~
Jon_Lowtek
Bashir: "Of all the stories you told me, which ones were true and which ones
weren't?"

Garak: "My doctor, they're all true."

Bashir: "Even the lies?"

Garak: "Especially the lies."

------
bonyt
Site is down, but it looks like archive.org got it.[1] But the entire page is
blank, because javascript. It has a fallback, which has what looks like the
article in a noscript tag.

Hastily converted to markdown with pandoc, and put up as a github gist:
[https://gist.github.com/tonyb486/eec2f16b06eef4692dac6e56362...](https://gist.github.com/tonyb486/eec2f16b06eef4692dac6e56362cac50)

[1]:
[https://web.archive.org/web/*/https://buttondown.email/hille...](https://web.archive.org/web/*/https://buttondown.email/hillelwayne/archive/donald-knuth-was-framed/)

~~~
segfaultbuserr
> _It looks like archive.org got it. But the entire page is blank, because
> JavaScript._

The AJAXification of the web, even for trivial web pages, is defeating
archive.org on multiple occasions; it's a huge threat to our history
preservation. We cannot stop people from overusing AJAX, so we need to
develop better archival tools.

~~~
geofft
This is an equivalent problem to the web search problem, right? I believe
Google indexes webpages by visiting them in headless Chrome and dumping the
resulting DOM - it seems like archive.org could do the same.

[https://searchengineland.com/google-will-ensure-googlebot-ru...](https://searchengineland.com/google-will-ensure-googlebot-runs-the-latest-version-of-chromium-316534)

(Also archive.today solves this somehow, no?)

~~~
userbinator
Google has orders of magnitude more computing power and humans to throw at the
problem.

------
geofft
Something related that's been on my mind recently: a lot of the advantage in
building well-designed programs (documented, modular, built to be tested, free
of smells, etc.) is less about getting the program right than about getting
_future changes_ right, either by someone else, or even yourself when you've
forgotten how to do it. But future changes only matter _if you want to use the
existing program as a starting point_.

McIlroy's pipeline is a little bit hard to read, but I would bet that most
people with moderate experience in building shell pipelines could rebuild
something equivalent from scratch, even if they'd have trouble explaining how
the current pipeline works. (Or people with experience in Python, Perl, etc.
could throw together an equivalent script from scratch quickly.)
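For reference, McIlroy's published solution was essentially six stages, one job per stage. A runnable sketch (the sample input and the choice of k=3 are mine, for illustration):

```shell
# McIlroy's six-stage pipeline, essentially as published in Bentley's
# column; sample input and k=3 are just for illustration.
printf 'the quick fox and the lazy dog and the cat\n' |
  tr -cs 'A-Za-z' '\n' |  # split: one word per line
  tr 'A-Z' 'a-z' |        # fold to lowercase
  sort |                  # bring identical words together
  uniq -c |               # count each run of identical words
  sort -rn |              # most frequent first
  sed '3q'                # keep the top k=3 lines
```

Each stage is trivially inspectable on its own, which is part of why rebuilding the whole thing from scratch is easier than it looks.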

An implication is that, if you're in a language where you can write a
productive program to do a task from scratch within (say) 30 minutes, there's
a lot less of a reason to think about good programming design than if you're
in a language where doing that same task from scratch is likely to take you a
day. In the second language, most of the value of writing documented and well-
structured code is so that it takes you 30 minutes when someone asks you to
modify it. But in the first language, you can throw away your code when you're
done with it and still solve the modified problem within 30 minutes.

Another possible implication: it's better to build reusable components
(libraries and utilities) than easily-modifiable code. Part of why McIlroy's
pipeline works so well is that tools like "tr" and "uniq" exist - and most of
us will never have reason to modify the source code of those tools. We need to
know what their interfaces are, but we don't need to know their internals.

~~~
jackweirdy
This comment reminds me of something of a thought-experiment-come-hacky-side-
project that I can’t seem to find.

The premise was, what if you could never edit a function? Instead, you had to
delete it and recreate it entirely whenever changes were needed.

The incentives are twofold - keep functions small, so you don’t lose too much
investment if you have to delete code, and think before you start writing,
else you waste time.

As I remember, this project came with an AST manipulator that would helpfully
delete your function if its unit tests failed.

~~~
fasquoika
[https://github.com/munificent/vigil](https://github.com/munificent/vigil)

~~~
jackweirdy
Yes! This is it

------
ColinWright
It seems to have the "HN Hug of Death" ... here's a snapshot summary from
memory:

Knuth and McIlroy gave examples of a word count program. Knuth used Literate
Programming and did it in 8 pages, McIlroy used shell utilities and did it in
8 lines.

This post goes back to the original paper and discovers that Knuth has been
grossly misrepresented as to what he was trying to do, and what he achieved.

 _Edit_

bonyt's[0] comment[1] has links to copies of the content.

[0]
[https://news.ycombinator.com/user?id=bonyt](https://news.ycombinator.com/user?id=bonyt)

[1]
[https://news.ycombinator.com/item?id=22406365](https://news.ycombinator.com/item?id=22406365)

~~~
hprotagonist
so a rebuttal/companion to [http://www.leancrew.com/all-this/2011/12/more-shell-less-egg...](http://www.leancrew.com/all-this/2011/12/more-shell-less-egg/)

~~~
pdpi
Pretty much, yes.

The argument, in short, is that the problem Knuth is solving isn't "find K
most common words", but rather "Use the K most common words problem as the
basis to demonstrate how you use Literate Programming". Knuth actually tackles
the latter, McIlroy's rebuttal is just the former, so doesn't actually serve
as much of an argument about anything.

~~~
lonelappde
There's nothing surprising or underhanded here, and never was. It was just two
complementary perspectives.

McIlroy's paper shows the value of good hackery over engineering.

Knuth was set back by the blogger's dilemma of trying to show a technique that
may be useful in general but isn't very useful in a very short program.

~~~
pdpi
I didn't mean to imply there was anything underhanded about McIlroy's
critique. Just that, because McIlroy's code isn't trying to solve the
same problem, it shouldn't be taken as a rebuttal to Knuth's point, which is
what, apparently, many people believe it to be.

~~~
projektfu
There are lots of interesting facets here. Doug's solution requires a working
Unix system. You might not have that in the problem space.

On the other hand, his solution shows that by assuming some nice things about
your data, things which Pascal does not allow, you can do some useful things
with terse programming. This is reminiscent of the array programming model of
APL or J. I haven't written a solution but I'd be surprised if it is more than
about 15 characters.

In the shell case, the terse program is like a Chinese classic. Written in its
own jargon requiring basically a prior understanding of the program to read
it.

~~~
acqq
> In the shell case, the terse program is like a Chinese classic. Written in
> its own jargon requiring basically a prior understanding of the program to
> read it.

However, if Knuth wanted to be terse, he could have been: he could have written
a Pascal program which uses his library routines without showing them! In
effect, using Doug's Unix solution is equivalent to saying "not only must you
use already pre-written libraries in order to be terse, they are not even plain
routines but are instead packaged as whole executables, and to use these
executables you have to run them on exactly this operating system, using
exactly this specific shell, all of it property of AT&T (at that time Linux
didn't exist, and who had the rights to use what wasn't clear, unless you'd
bought something expensive), just to be able to call these library routines."

So it can't be considered a serious "critique." At that time, under these
circumstances, it was just an unfair ad.

------
rstuart4133
I was around when that encounter happened. I still recall my feelings when I
read it.

I don't recall feeling Knuth was framed.

Although McIlroy's solution didn't really address the question Knuth was
asked, it was cute and demonstrated the power of this newfangled thing called
"Unix Philosophy", which at the time needed some oxygen. However the words
McIlroy accompanied it with were simply unnecessary, almost childish. I recall
cringing when I read them.

I do remember wondering if Bentley had done the right thing in publishing
McIlroy's raw comments, but decided the main attraction of his column was a
refreshing honesty. He presented his pearls without any of the usual breathless
hype or artificial conflict a journalist might add to spice up interest.
Having set up this experiment, it would not have been "Programming Pearls" if
he didn't report exactly what happened, as it happened.

Bentley's style wasn't to constrain or direct for a particular outcome he
wanted. That meant McIlroy had lots of rope and McIlroy used it to hang
himself, that was McIlroy's problem.

------
smsm42
I'm not sure how it's a fair comparison between ready-made Unix tools and
written-from-scratch solution. I mean it's like somebody asked me to implement
a C compiler, and when I produced a huge pile of source code would tell me
"Idiot, I can do all that by just typing "gcc"! Muah-hah-ha!"

~~~
dllthomas
In my second compiler design course, we were told we could use any language. I
(jokingly) asked whether we could choose shell, and consider a C compiler a
feature of the language. I was (in the same spirit) informed that it would be
cheating.

~~~
smsm42
Some languages do include self-compilers as part of the runtime... So probably
"any" wasn't meant really "any".

~~~
dllthomas
The target language was (a subset of) C, so access to a lisp compiler wasn't
going to help in quite the same way.

------
persona
Title is a little (?) clickbait... Promoting a YOW talk:
[https://www.youtube.com/watch?v=ATobswwFwQA](https://www.youtube.com/watch?v=ATobswwFwQA)

\-- The other day I was talking with a friend about structured editing and
literate programming came up. LP was one of Donald Knuth's ideas, to structure
programs as readable documents instead of just machine docs. He was interested
in it, I was cautiously skeptical. We both knew the famous story about it:
[https://en.wikipedia.org/wiki/Literate_programming](https://en.wikipedia.org/wiki/Literate_programming)

"In 1986, Jon Bentley asked Knuth to demonstrate the concept of literate
programming by writing a program in WEB. Knuth came up with an 8-pages long
monolithic listing that was published together with a critique by Douglas
McIlroy of Bell Labs. McIlroy praised intricacy of Knuth's solution, his
choice of a data structure (Frank M. Liang's hash trie), but noted that more
practical, much faster to implement, debug and modify solution of the problem
takes only six lines of shell script by reusing standard Unix utilities.
McIlroy concluded: >>Knuth has shown us here how to program intelligibly, but
not wisely. I buy the discipline. I do not buy the result. He has fashioned a
sort of industrial-strength Faberge egg—intricate, wonderfully worked, refined
beyond all ordinary desires, a museum piece from the start."

The program was to print out the top K most-used words in a text. (and so it
goes on...) \---

~~~
hwayne
To clarify, it was an email newsletter. The YOW! thing had just gotten
published so I got that out of the way before diving into the meat of the
newsletter post, which was about LP.

------
paulrpotts
Good analysis. Didn't Knuth famously say that his job was to get to the bottom
of things, not to stay on top of things?

If I were writing a one-off program to do this once for a paid project, the
shell script is absolutely the way I would go about it.

If I were writing it as a computer scientist, accustomed to teaching students
how to find optimal solutions, something like the Knuth program is absolutely
the way I would go about it (although in 2020, I would likely use C, not
Pascal). I also would likely roll my own approach if I was writing it for the
kind of target machine I work on now - a very small one (an embedded
microcontroller). And Knuth made his bones when computers were (physically)
huge but (in memory and speed) tiny.

------
kazinator
The shell script uses utilities that are written in C, probably totaling way
more lines of system programming language code than Knuth's Pascal solution.

It's a pretty unfair comparison, since the literate programming solution was
not presented in order to show a code-golfed word histogram, but how to
annotate code.

The hash trie structure was included in it for that purpose, to show how you
use literate programming to annotate such a thing.

> _First of all, we found that Literate Programming as conceived by Knuth is
> not just “text, code, text, code”._

Unfortunately, it's something worse! It's a system in which you chop up a
program into arbitrary sections, and give these macro-like names. The sections
are presented in some arbitrary order and hierarchically combined together
using those macro-like names.

Like, say:

_The program's overall structure is to accept input, perform some processing,
and produce output:_

    accept_input
    do_processing
    produce_output


The divisions don't follow function boundaries. A specific loop inside a
specific function might have its own section, and the variable declarations
above their own.
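For a concrete (hypothetical) flavor of those macro-like named sections, a CWEB-style fragment might look like the following; the section names and code are invented for illustration:

```
@ The program's overall structure: three named sections,
presented and refined in whatever order suits the exposition.
@c
int main(void)
{
  @<Accept input@>@;
  @<Do processing@>@;
  @<Produce output@>@;
}

@ Input is gathered a line at a time.
@<Accept input@>=
while (fgets(buf, sizeof buf, stdin)) {
  @<Add the line's words to the table@>@;
}
```

The `@<...@>=` definitions can appear anywhere in the file, in any order; `tangle` stitches them together for the compiler, while `weave` typesets them as prose.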

It's basically horrible. You can't see the code all in one piece at all when
editing it. It won't play with your existing tools. The generated result won't
even have proper indentation.

Web and CWeb are programs geared toward someone who is an academic, mainly
interested in writing a paper that revolves around a small amount of code.

What you're really writing is a paper, such that both the typeset paper (with
nicely typeset code), and accompanying code pops out from the same source. The
raison d'etre is that paper though.

You would be suicidal to use this to write an actual application that doesn't
need to be detailed in an academic paper.

Knuth did somewhat take it to those extremes, but working alone.

If we look at TeX, the bulk of it consists of a single tex.web file which is
simultaneously not such a large volume of work (less than 25,000 lines of
code, documentation, and data; less than a megabyte) ... yet too large to be
all in a single file.

~~~
svat
I love this comment, and probably wouldn't even disagree with "It's basically
horrible" :), but just to point out a few things:

\- "The generated result won't even have proper indentation." Actually, what
you see in the typeset output generated by weave/cweave is what Knuth
considers proper indentation, and he has written paeans multiple times to
Myrtle Kellington (executive editor for ACM publications) who developed that
style, etc. There are a lot of lines in WEAVE devoted to getting the
indentation exactly so. I personally find it hard to read as well (as I
imagine do most programmers), and both McIlroy in his review (_"Second, small
assignment statements are grouped several to the line with no particularly
clear rationale. This convention saves space; but the groupings impose a false
and distracting phrasing..."_) and Harold Thimbleby in his Cweb article
mention these departures from what C (etc.) programmers are used to.

\- _"Web and CWeb are programs geared toward someone who is an academic,
mainly interested in writing a paper that revolves around a small amount of
code."_ I would disagree: yes they are geared towards someone who is an
academic -- specifically Knuth -- but from everything he's said, his love of
LP is about the programs themselves, not papers about them. (Look at
[https://cs.stanford.edu/~knuth/programs.html](https://cs.stanford.edu/~knuth/programs.html)
for some of his programs; he says he writes several programs a week and keeps
most of them to himself; the ones published online before Sep 2017 I had
typeset here: [https://github.com/shreevatsa/knuth-literate-programs](https://github.com/shreevatsa/knuth-literate-programs) \-- I
ought to clean up and refresh that stuff.)

\- Finally, WEB arose out of certain specific constraints. After he had
written the original version of TeX in SAIL for his personal use, it turned
out there was widespread demand for it, and people at other places had started
porting it into their local systems/languages (with risk of
incompatible/irreproducible implementations). For this he decided to embark on
a two-year rewrite into the language that was available at the most number of
university computer systems: Pascal. This language had been designed primarily
for teaching, and at this time there wasn't even a Pascal standard by that
name -- every compiler did its own thing. So he was targeting the "common
denominator" of Pascal compilers, which meant no separate compilation units to
be linked in; everything _had_ to be in a single file at least as seen by the
compiler. (In fact his TeX78 in SAIL had been written as several separate
files in a more conventional (to us) style.) And yeah, the fact that he had
been requested to eventually publish the source code of TeX (which he did,
Volume B of _Computers and Typesetting_ ) played a part.

BTW the author/maintainer of Axiom
([https://en.wikipedia.org/wiki/Axiom_(computer_algebra_system...](https://en.wikipedia.org/wiki/Axiom_\(computer_algebra_system\)))
has been writing it for years in a literate programming style, and whether
that is insane/suicidal is beyond my ability to judge at the present. :-)

------
brisky
I like the idea that code should look more and more like human language. I
think that programming languages have gotten much more human-friendly nowadays
and this trend will continue. I have created my own human-like programming
language prototype to explore these ideas and have implemented TodoMVC with
it:
[https://github.com/tautvilas/lingu/blob/master/examples/todo...](https://github.com/tautvilas/lingu/blob/master/examples/todomvc/prog.lgu)

------
FabHK
Does anyone have any insight in the computational complexity of the two
solutions?

The shell script sorts all the words (and then all the counts), so has
complexity at least O(n log n) in the number of words.

It seems to me an optimal solution could be faster, and I wonder what Knuth
implemented.

~~~
terminaljunkid
If you consider hash table insertion as O(1) then it is O(n) I guess.
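A sketch of that counting/selection split (mine, not from the thread): the hash-table count is O(n) expected, and a bounded-heap selection of the top k is O(n log k), so the pipeline's full O(n log n) sort isn't strictly required:

```python
import heapq
from collections import Counter

def top_k_words(text: str, k: int):
    """Return the k most frequent words with their counts."""
    counts = Counter(text.lower().split())    # O(n) expected: one hash update per word
    return heapq.nlargest(k, counts.items(),  # O(n log k): bounded heap over the counts
                          key=lambda kv: kv[1])

print(top_k_words("the quick fox and the lazy dog and the cat", 2))
# → [('the', 3), ('and', 2)]
```

For k much smaller than the number of distinct words, this is the usual way to beat the sort-based approach asymptotically.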

------
pkrumins

        KNUTH vs MCILROY
    

[https://comic.browserling.com/knuth-vs-mcilroy.png](https://comic.browserling.com/knuth-vs-mcilroy.png)

    
    
         WHO WILL WIN?!

------
aortega
You cannot compare Knuth's software with a script that uses precompiled tools
that have thousands of lines.

If you want to compare both solutions, use the Kolmogorov complexity: compress
both programs together with any tool/compiler it uses and just then, compare
both sizes. I bet Knuth's solution has an order of magnitude less complexity.

------
ben509
These days, though, you can have it both ways with a Jupyter notebook or
something similar. It's literate, in that you can keep thoughts and discussion
in markdown cells, and you can neatly combine computations in computation
cells.

------
lalalandland
Unix shell scripting is an arcane practice that is really hard to discover.
That short shell script is built on hours of hard-learned Unix experience that
would probably be at least 8 pages long, if explained thoroughly.

------
shp0ngle
I will probably be downvoted for questioning the legends, but... I don't think
the poster child of Literate Programming - TeX itself - really has that
readable source code.

[https://mirror-hk.koddos.net/CTAN/systems/knuth/dist/tex/tex...](https://mirror-hk.koddos.net/CTAN/systems/knuth/dist/tex/tex.web)

~~~
svat
The .web file is not supposed to be read directly. You're supposed to read the
"woven" version: [https://texdoc.net/texmf-dist/doc/generic/knuth/tex/tex.pdf](https://texdoc.net/texmf-dist/doc/generic/knuth/tex/tex.pdf)

~~~
guitarbill
If you have to make changes to it, you'd have to read it directly, right?

~~~
svat
I think the idea is that if you want to make changes to an existing program,
you get up from your desk, go look at your shelves and pull out the sheaf of
paper for that program, read/study it, think about what changes are needed,
write down the code changes, then go to the computer and type it up -- only in
the last step are you looking at the .web file, and only for mechanical
transcription from paper to computer.

(This is not a joke. Knuth still writes not just his papers/books but even his
programs by hand on paper first. And this is how TeX was written, according to
the person other than Knuth who should know best:
[https://news.ycombinator.com/item?id=10172924](https://news.ycombinator.com/item?id=10172924))

~~~
db48x
Actually you would go to your bookshelf and get the beautifully typeset book,
rather than a sheaf of paper.

------
rswail
Maybe people should take into account that Knuth is a mathematician and
computer _scientist_ , not a software _engineer_.

His constraint is _correctness_ under all circumstances; an engineering
constraint is about _efficiency_ , which restricts correctness to a subset of
the potential uses.

So from his perspective, the problem was something to be solved, not something
to be implemented. As such, describing the problem and the solution is much
more a literate requirement, being clear about it to the reader, than a
computing requirement, being clear to the computer.

Personally, I think most software "engineering" is a crock, built on false
assumptions and invalid data, subject to fads like "Agile" (or CMMI or SPICE
or RUP or...).

The software industry would be better focussed on architecture, not the naive
patterns of the GoF, but on fundamentals like types and data flows, Nouns vs
Verbs and taking the science and making it practical for use in day to day
development by incorporation into the tools used.

------
giancarlostoro
If anyone managed to catch it when it wasn't down, care to share a screenshot
of the article? I seem to be getting some Heroku error now.

> An error occurred in the application and your page could not be served. If
> you are the application owner, check your logs for details. You can do this
> from the Heroku CLI with the command

~~~
aWidebrant
Lightly edited: [https://rentry.co/zq4r6](https://rentry.co/zq4r6)

------
dllthomas
I often recommend reading this essay. The conception of LP described (and
embodied) is only tenuously related to modern systems claiming to be LP, and
what's presented is really interesting.

That said, I still find myself unsure how it translates into practice. Does
anyone have experience working on a system described this way? In particular,
I wonder what refactors feel like, and how general problems around stale
documentation apply (or fail to).

------
lincpa
Markdown Literary Programming that doesn't break the syntax of any programming
language:

[https://github.com/linpengcheng/PurefunctionPipelineDataflow...](https://github.com/linpengcheng/PurefunctionPipelineDataflow/blob/master/doc/markdown_literary_programming.md)

------
dracodoc
I program mainly in R and I always use RMarkdown. I write extensively about
question definition, notes, reference, exploration, different directions in
RMarkdown. In the end if I ever need to have a script version I have some
utility function to pick code chunks and output as script.

This serves as very good documentation and is much better than code comments.
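As a rough sketch of that last extraction step (dracodoc's actual utility is presumably written in R, perhaps around something like `knitr::purl`; this hypothetical shell equivalent does the same job with awk, with `analysis.Rmd` and `analysis.R` as made-up file names):

```shell
# Pull the bodies of ```{r} code chunks out of an RMarkdown file,
# dropping the fences themselves, to recover a plain R script.
# File names here are illustrative only.
awk '/^```[{]r/ {in_chunk=1; next}   # chunk opener: start copying
     /^```/     {in_chunk=0; next}   # closing fence: stop copying
     in_chunk'                       # print lines inside a chunk
     analysis.Rmd > analysis.R
```

The same pattern works for any fenced-chunk format, which is much of why this style of literate programming is easy to retrofit onto existing tooling.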

------
mcv
Comparing Knuth's LP demo to a 6 line shell script is comparing apples to
oranges. There's no language that's going to be able to do that job in as few
lines as that shell script, but that doesn't mean all languages are useless.
They're both interesting demonstrations, but it's a useless comparison.
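For reference, the six-line script being compared against is McIlroy's famous word-frequency pipeline from the 1986 review (the original reads a file from stdin and takes k as its argument; the tiny inline input and k=3 here are added just so the sketch runs on its own):

```shell
# McIlroy's word-frequency pipeline: print the k most common words.
# The original six stages, fed a small sample with k=3 for illustration.
echo "the the the and and cat" |
tr -cs A-Za-z '\n' |   # one word per line (squeeze non-letters to newlines)
tr A-Z a-z |           # fold everything to lower case
sort |                 # bring identical words together
uniq -c |              # count each distinct word
sort -rn |             # most frequent first
sed 3q                 # keep only the top 3 (the original uses ${1}q)
```

Each stage does one generic thing; the composition, not any single tool, solves the problem -- which was exactly McIlroy's point, and also why the line-count comparison is lopsided.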

------
DSpinellis
The proposed example problem that would be difficult to solve on the Unix
command line can be written as a pipeline of just nine commands. See
[https://www.spinellis.gr/blog/20200225/](https://www.spinellis.gr/blog/20200225/)

------
jesse_m
I haven't heard of the Leo editor. I have used org mode in Emacs for little
things. I also came across a reimplementation of the tangling functionality:
[https://github.com/thblt/org-babel-tangle.py](https://github.com/thblt/org-babel-tangle.py)

~~~
dannyobrien
It's worth playing with: it's primarily designed for writing literate Python,
and I greatly enjoyed working with it on a project as an experiment. I shuttle
between neovim, emacs and leo, and dream of an editor that would somehow bring
together all of their benefits. Oh, and maybe some smalltalk too!

------
hzhou321
> Writing the code out-of-order is the whole point.

Yes!

MyDef, [https://github.com/hzhou/MyDef](https://github.com/hzhou/MyDef), is
such a system.
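To make "out-of-order" concrete: in a literate program you reference named chunks before (or after) defining them, and a "tangle" step reassembles the compilable order. Here is a toy one-level tangle; the `<<name>>=` / `@` syntax is a WEB-ish invention for illustration and is not necessarily anything like MyDef's actual notation:

```shell
# Toy one-level "tangle": chunk bodies are collected between <<name>>=
# and @, then <<name>> references inside the "main" chunk are expanded
# in place. The sample input defines <<middle>> after it is used.
printf '<<main>>=\necho start\n<<middle>>\necho end\n@\n<<middle>>=\necho middle\n@\n' |
awk '
  /^<<.*>>=$/ { name = substr($0, 3, length($0) - 5); collecting = 1; next }
  /^@$/       { collecting = 0; next }
  collecting  { body[name] = body[name] $0 "\n" }
  END {
    n = split(body["main"], lines, "\n")
    for (i = 1; i <= n; i++) {
      line = lines[i]
      if (line ~ /^<<.*>>$/)          # reference: splice in that chunk
        printf "%s", body[substr(line, 3, length(line) - 4)]
      else if (line != "")
        print line
    }
  }'
```

A real tangle recurses through nested references, but even this sketch shows the point: the author orders chunks for the reader, and the tool recovers the order the machine needs.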

------
gameswithgo
The Unix command solution is likely orders of magnitude slower than what
Knuth wrote, which, given the computing power of the time, should have seemed
fairly relevant.

------
adonese
It seems to be down.

~~~
mark-r
For me too. It's a shame given the inflammatory title - I really want to know
what it means.

~~~
Ari_Rahikkala
My guess is it's either about a picture of Donald Knuth, or Knuth's reward
checks, being put in picture frames.

~~~
ColinWright
Neither.

~~~
Ari_Rahikkala
Aw. Well, it was worth a shot; that kind of mild homonym abuse is the kind of
direction these sorts of titles usually take. Fortunately the real story is
more interesting.

------
kick
Site's back up.

------
draw_down
I've seen similar defenses of Knuth over the years, but I'm still inclined to
view this McIlroy's way. I think focusing on the shell script is a canard, the
point is he picked what seemed like the right tool for the job. Isn't that
what commenters are saying here all the time?

Anyway, perhaps I just suffer from a failure of imagination, but I can't see
why the "Blah"s interspersed with `foo`s and `bar`s is meant to be revelatory.

~~~
CrazyStat
The job Knuth was asked to do was "write a piece illustrating literate
programming." A six line shell script is not the right tool for that job.

------
Myrmornis
> The actual paper paints LP in a much better light than we thought it
> did....The actual content is effectively his stream-of-consciousness notes
> about what he’s doing. He discusses why he makes the choices he does and
> what he’s thinking about as primary concerns. It’s easy to skim or deep-dive
> the code.

Now imagine if these stream-of-consciousness comments getting in the way of
the code were written by your colleagues, rather than Donald Knuth.

~~~
Jtsummers
Like all documentation, in real systems it would be edited over time. You do
develop documentation for your systems, right?

~~~
Myrmornis
My point is that people should be encouraged to keep code comments to a
minimum, and always strive to first rewrite the code so that it is clear
(well-chosen variable names, etc). LP does the opposite: it encourages
verbosity, and tangential discussion.

~~~
Jtsummers
My point is that literate programming isn't about _comments_. Comments are
asides; they aren't the core focus of what's being read, the code is. Literate
programming is about _documentation_, which should be kept _concise_, but
not necessarily _minimal_.

As to whether this leads to tangents or not, that's up to the authors to
determine how to organize. Appendices or rationale sections are a great way to
separate the core focus from what others may see as extraneous.

~~~
Myrmornis
Yes, sorry, I did understand that that was your point. I just feel very
strongly about the tendency for people (especially new programmers) to be
told that it's important to comment code heavily.

Yes, I've never seen a good solution for documentation in the teams I've worked
in on larger systems/codebases. I'm not really a believer in documentation
that evolves in a separate git repo (or god forbid, in Confluence) for the
obvious reason that it gets even more stale than documentation that evolves in
the same git repo. So that would put me in favour of your position here.

But, how can documentation (with appendices or rationale sections) be
interleaved with code without destroying the ability to read the code? I've
tried RWeave in R a long time ago, and I've even collaborated and published on
a literate programming tool, and I have always come back to the conclusion
that I want to read code with the minimum of intervening prose.

So my position is documentation should absolutely be written and maintained,
it should be in the same git repo as the code, it should be in a separate file
(not in comments or docstrings or automagically weaved sections using some
special syntax), it should be written tersely and without much personality,
and code reviewers should request documentation updates if a PR renders some
documentation stale or requires new documentation.

So I'm pro documentation, anti LP, and pro minimal code commenting.

