This version shouldn't be linked, even though it's the canonical one everyone links, because it's bad. If you check it against the original paper (https://gwern.net/doc/cs/algorithm/1982-perlis.pdf) you can see it has transcription errors and omits a bunch of epigrams - it just drops the last 10! Perlis goes to 130, not 120!
Ok, I changed the link to that from https://cpsc.yale.edu/epigrams-programming, on the grounds that accuracy and completeness are more important than (somewhat) degraded readability. Thanks!
Maybe that’s somewhat reasonable (?) given the last nine are epigram epigrams, not directly about programming…
121. If there are epigrams, there must be meta-epigrams.
122. Epigrams are interfaces across which appreciation and insight flow.
123. Epigrams parametrize auras.
124. Epigrams are macros, since they are executed at read time.
125. Epigrams crystallize incongruities.
126. Epigrams retrieve deep semantics from a data base that is all procedure.
127. Epigrams scorn detail and make a point: They are a superb high-level documentation.
128. Epigrams are more like vitamins than protein.
129. Epigrams have extremely low entropy.
130. The last epigram? Neither eat nor drink them, snuff epigrams.
> 10. Get into a rut early: Do the same process the same way. Accumulate idioms. Standardize. The only difference(!) between Shakespeare and you was the size of his idiom list - not the size of his vocabulary.
A beautiful thought, and I would add that Shakespeare only wrote in English, developing his ability in that beautiful if messy language further and further, rather than trying all sorts of different languages to gain different perspectives on writing.
> 70. Over the centuries the Indians developed sign language for communicating phenomena of interest. Programmers from different tribes (FORTRAN, LISP, ALGOL, SNOBOL, etc.) could use one that doesn’t require them to carry a blackboard on their ponies.
You know, I dismissed these too easily in a different comment. Thinking of them as koans - containing contradictions, not holding truth, but revealing insights via the reader’s debate and struggle and resistance - makes me like them a lot more!
> 39. Re graphics: A picture is worth 10K words - but only those to describe the picture. Hardly any sets of 10K words can be adequately described with pictures.
How does this need to be rewritten in light of Dall-E, Midjourney, and friends?
(Ok, these days one could make a long, thin screenshot, but a non-trivial picture that doesn't factor through a textual representation? Even representing the 8 strings given here via graphics sounds iffy to me...)
That epigram is referring to a fundamental asymmetry of information between words and pictures. Having a machine that can come up with many different pictures from a few words not only says nothing about how to make a picture from any 10K words; it somewhat reinforces the epigram's point.
Those models may be more adequately described by #63:
> 63. When we write programs that “learn”, it turns out that we do and they don’t.
It doesn't need to be rewritten at all. I can ask Dall-E for a picture that conveys the meaning of the Sermon on the Mount, say, or of Einstein's 1905 paper on special relativity. Whatever I get will not convey the content that the sermon or the paper convey - not even close.
Please elaborate, I’m curious why. I was thinking of 1) the iPad (not to mention cell phones), 2) my kids spent a majority of their childhoods inside playing video games, unlike mine spent outside with kids from the neighborhood, 3) Netflix, AppleTV, Amazon, Paramount, etc., 4) Internet shopping and home delivery, and 5) email.
Oh I tried to kick my kids out to play, it just didn’t work very well. The computer options I had as a kid were few and far between, and it was absolutely nothing like the vast sea of high quality entertainment they have on their multiple computer devices today. I got so bored with the sparse choices, I had to learn how to program my own games. My kids, on the other hand, had so many insanely good games and TV shows, they didn’t have the patience to learn to write code (until very recently) despite a lot of interest.
Nobody writes handwritten letters anymore (and even though it was fun, I’m not sure we should.) Movie theaters are dying since everyone can stream movies at home on big screen TVs (that are stuffed with computers now). Many many brick-and-mortar stores are closing their physical shops and going increasingly online. It seems like the computer has already changed the household immensely and irreversibly, not to mention the whole world we know.
Mostly really good, but I have a problem with this one:
> 67. Think of all the psychic energy expended in seeking a fundamental distinction between “algorithm” and “program”.
I mean... it's semantics but I feel like taking the bait here. I have a weird concept of fun.
I've always understood it as: a program takes external input and/or gives external output (keyboard presses, display to screen, writes to disk, etc.), whereas an algorithm is self-contained and can run on any Turing machine regardless of its capabilities.
Basically "Side Effects", but I avoided using that phrase because input isn't clearly a side effect, but if your semantics don't require no side effects for an algorithm, then yeah a program is functionally equivalent to an algorithm.
Mostly, semantics don't capital-M Matter, but they can be important for clarifying meanings. Depending on which project/team I'm working on, I can have completely incompatible definitions of the same word in the same day.
I always thought it was obvious: a program is/contains a concrete implementation of an algorithm.
A recipe book contains an algorithm for baking cookies, but how you do it vs. how someone else does it is going to have slight differences (accuracy of measurements, size of cookie, etc.).
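As a quick sketch of that reading (my own example, not from the thread): one abstract algorithm, binary search, realized as two concrete programs that follow the same recipe but differ in implementation detail.

```python
# One abstract algorithm (binary search), two concrete realizations -
# like two cooks following the same recipe with slightly different technique.

def binary_search_iterative(xs, target):
    """Return the index of target in sorted xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == target:
            return mid
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def binary_search_recursive(xs, target, lo=0, hi=None):
    """Same algorithm, expressed recursively."""
    if hi is None:
        hi = len(xs) - 1
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if xs[mid] == target:
        return mid
    if xs[mid] < target:
        return binary_search_recursive(xs, target, mid + 1, hi)
    return binary_search_recursive(xs, target, lo, mid - 1)

data = [2, 3, 5, 7, 11, 13]
assert binary_search_iterative(data, 7) == binary_search_recursive(data, 7) == 3
```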