The Humble Programmer (1972) (utexas.edu)
150 points by tareqak on July 27, 2013 | 59 comments



Has anyone here actually learnt what Dijkstra advocates in almost all EWDs and used it to any benefit? Basically almost all his essays in one way or another end up being about the need to prove programs correct using formal logic and specifically a particular approach to correctness proofs he devised. In fact he even advocates starting from a formal logical specification of a problem and then deriving the program using purely syntactical transformations.

I tried to read "A Discipline of Programming", where he explains his approach, but it was barely understandable, and it takes 300 pages for him to get to the point of developing simple algorithms of the type you meet in the first chapter of an algorithm textbook. It could have been the translation (I didn't read the English original), but I doubt it, because I have never read anything technical from him that was actually interesting. I am afraid his essays are liked because of the general sentiment for "more rigour" in programming, whatever that would mean, and not because of any understanding of what precisely he advocates or the merits of his techniques. The living proof is a comment in this thread about how Dijkstra sheds insight into the value of TDD...

So, if you upvote his articles, what precisely have you learned from Dijkstra?


You need to put it in historical perspective. Essays like this one were written at the dawn of the "structured programming" debate. If you're programming in a language and codebase where goto is rare or non-existent, then Dijkstra's ideas have had an impact on your environment.

More precisely, he was always a champion of having programs express things in a direct and understandable way, which makes comprehension straightforward, and a champion of getting there by removing unneeded features from languages.

This was a necessary overcorrection to now-forgotten excesses. But today it is clear that he went too far. As an example, I would consider Go a language which is heavily influenced by Dijkstra's ideas on the value of simplicity in language design. However, the internal loop exit implied by break exists. Multiple return values are used to separate data (the result of a calculation) from metadata (the presence and details of errors). We use features beyond what Dijkstra considered wise.


According to Daylight's book The Dawn of Software Engineering: From Turing to Dijkstra, Dijkstra actually advocated adding recursion to Algol 60 despite not having good reasons to use it himself. He felt that it made the language more general, while his detractors at the time felt that anything unnecessary (especially anything potentially having a negative performance impact) amounted to waste. Dijkstra was a vocal opponent of putting machine-specific stuff in the language. History seems to have borne out both.

I think it's a bit of a dilution of his message to say he was a champion of comprehensibility. He was more interested in correctness, and his dream was that formal methods would be foundational to software development. That clearly didn't happen.


There are hundreds of EWDs, and I haven't seen a single one where Dijkstra would talk about making programming languages simple to make comprehension straightforward; this is something that modern programming "gurus" are more likely to say. The reason Dijkstra wants simple programming languages, and most EWDs are more or less about this, is that he wants to prove theorems about programs, which is possible only with a simple language. Not only that: as I said, he wants to reduce program development to well-defined transformations on a specification of a problem in the language of mathematical logic. His stance is very nicely summarized here:

http://en.wikipedia.org/wiki/The_Cruelty_of_Really_Teaching_...

You have to understand some mathematics and mathematical logic to really understand Dijkstra, however; you cannot look at him from a pure software engineering background, because you will only take away platitudes from what he says without actually understanding him. If you have some background, you can read about his real technical ideas:

http://en.wikipedia.org/wiki/Program_derivation

http://en.wikipedia.org/wiki/Guarded_Command_Language

http://en.wikipedia.org/wiki/Predicate_transformer_semantics

This is the area Dijkstra worked on most during his research career. Those ideas actually go back centuries before anyone ever thought about structured programming. For hundreds of years there has been a fundamental debate about whether all kinds of reasoning can be reduced to some kind of calculus with well-specified rules, which would be equivalent to being able to construct an (at least theoretical) machine for automating it. This is where computers come from; this goes back to Charles Babbage, to the works of Leibniz, Boole, Turing, etc.

Dijkstra is in the same historical tradition. As I understand it, Dijkstra's program for Computer Science is similar to the program Hilbert had for mathematics with what is now called formalism, from which the work of Goedel and Turing sprang. When programming was still mainly done by mathematicians this was a lively research area: various systems for proving correctness were invented, along with various ways of deriving programs, and there was a very heated debate on when proofs are needed and to what extent ordinary programmers have to master them. You can find loads of books and monographs written in the 70s and 80s about this. Now, as far as research goes, program derivation seems to be an almost dead topic, but there are still people today championing it in some alternative form. Richard Bird wrote a book called "Pearls of Functional Algorithm Design" where he derives algorithms using algebraic properties:

http://www.cs.ox.ac.uk/people/richard.bird/

Alexander Stepanov has been advocating something similar and wrote a book called "Elements of Programming":

http://www.elementsofprogramming.com/

https://www.youtube.com/watch?v=Ih9gpJga4Vc
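For a flavor of the Guarded Command Language linked above, here is a rough sketch (my own illustration, not Dijkstra's notation or an example from any of these books) of Euclid's gcd written with nondeterministic guards and simulated in Python:

```python
# Euclid's gcd in the style of Dijkstra's guarded commands:
#   do a > b -> a := a - b
#    | b > a -> b := b - a
#   od
# The loop repeats while any guard holds; when several guards hold,
# any one may be chosen (this sketch simply takes the first).

def gcd_guarded(a: int, b: int) -> int:
    assert a > 0 and b > 0          # precondition
    while True:
        guards = [
            (a > b, lambda: (a - b, b)),
            (b > a, lambda: (a, b - a)),
        ]
        enabled = [act for (cond, act) in guards if cond]
        if not enabled:             # all guards false: the loop terminates
            break
        a, b = enabled[0]()
    return a                        # here a == b == gcd of the inputs

print(gcd_guarded(12, 18))  # 6
```

The point of the notation is that the termination and correctness arguments fall out of the guards: the loop ends exactly when no guard holds, which here forces a == b.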


There are hundreds of EWDs and I haven't seen a single one where Dijkstra would talk about making programming languages simple to make comprehension straightforward...

If that's the case then you need to read more closely. This discussion is sparked by http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW.... I'll focus on just two statements from this one which demonstrate the theme that I said was there:

Finally, although the subject is not a pleasant one, I must mention PL/1, a programming language for which the defining documentation is of a frightening size and complexity. Using PL/1 must be like flying a plane with 7000 buttons, switches and handles to manipulate in the cockpit. I absolutely fail to see how we can keep our growing programs firmly within our intellectual grip when by its sheer baroqueness the programming language —our basic tool, mind you!— already escapes our intellectual control.

It would be hard to find a clearer advocacy of simplicity in programming languages.

But a clearer statement of his priorities - and why they are priorities - can be found here:

A study of program structure had revealed that programs —even alternative programs for the same task and with the same mathematical content— can differ tremendously in their intellectual manageability. A number of rules have been discovered, violation of which will either seriously impair or totally destroy the intellectual manageability of the program. These rules are of two kinds. Those of the first kind are easily imposed mechanically, viz. by a suitably chosen programming language. Examples are the exclusion of goto-statements and of procedures with more than one output parameter. For those of the second kind I at least —but that may be due to lack of competence on my side— see no way of imposing them mechanically, as it seems to need some sort of automatic theorem prover for which I have no existence proof. Therefore, for the time being and perhaps forever, the rules of the second kind present themselves as elements of discipline required from the programmer...

Your complaint is essentially that I focused on rules of the first kind described, when Dijkstra did a lot of work on rules of the second kind. If I were truly doing that then I'd be failing to understand him exactly as badly as you are in saying he was only interested in rules of the second kind while ignoring the fact that his largest concrete impact came from http://www.u.arizona.edu/~rubinson/copyright_violations/Go_T....

I recognize and value his work in both areas. However the question that was asked was about his impact on current programming practice. And there is no question that his ideas on structured programming have had more impact than his ideas on provably correct software.


I try to read this paper from the perspective of his overall work and writing, and I have read through a lot of the EWDs. There is an abundance of evidence that his #1 concern was making programming more mathematical. Hence I interpret everything you are quoting as proposals motivated by the need he perceived to prove programs correct. That the industry trivialized his ideas to "don't use goto" does not mean this was Dijkstra's great point; this is in fact excellently summarized in the first link I posted. Meanwhile you make it sound like he was interested in contributing to software engineering as it is done today, which is not the case: he seemed to really despise a lot of the things that are now software engineering best practices, like unit tests.

What I actually hoped for is someone who really learned, to some extent, the way of writing programs Dijkstra advocated, and their experience with it. I can't say I really understand what those derivations look like in practice.


I try to read this paper from the perspective of his overall work and writing, and I have read through a lot of the EWDs.

In other words you're missing the plain meaning of the sections that I quoted because you try to consider it in the light of http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EW... which was written a decade and a half later? Pardon me, but I won't be emulating your example.

The focus of Dijkstra's research career was how to make correct software. As he himself would claim, there are two halves to this process. The first is to limit ourselves to forms of writing code which are easy to reason about. The second is to actually perform that reasoning.

The part of his proposal that actually had an impact is the part which says that we need to focus on methods of expressing ourselves that are easy to reason about. The part that did not have a direct impact is the part which says that we need to perform that formal reasoning. The fundamental reason why not is that Dijkstra's reasoning assumed the existence of a consistent, unchanging specification. The real world does not work that way - computers exist to do what humans ask. And humans do not always ask for things that make sense.

This is not to say that this is the impact that he wanted to have - it is clearly not - but it is the impact that he did have.

However some of his other ideas have indeed found their way into practice, albeit in a muted form that he would have objected to. Take unit tests, since you brought them up. He was against making testing and QA an integral part of the programming process: if the programmer performed properly, they should not be needed. (Nice theory; fails in practice.) However today, well-designed unit tests do serve as a limited form of specifying exactly what a given piece of code is supposed to do, and of verifying that it in fact does it. (I'm sure he would say limited and inadequate. But it is better than nothing.)


In other words you're missing the plain meaning of the sections that I quoted because you try to consider it in the light of http://www.cs.utexas.edu/users/EWD/transcriptions/EWD10xx/EW.... which was written a decade and a half later?

This clearly shows you haven't actually studied his writings, because he had been raising the same points for many years in almost every EWD. The OP is EWD340, so here is a sample from before that period:

http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...

http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...

http://www.cs.utexas.edu/users/EWD/transcriptions/EWD03xx/EW...

For example from EWD317:

If we take the existence of the impressive body of Mathematics as the experimental evidence for the opinion that for the human mind the mathematical method is indeed the most effective way to come to grips with complexity, we have no choice any longer: we should reshape our field of programming in such a way that the mathematician's methods become equally applicable to our programming problems, for there are no other means. It is my personal hope and expectation that in the years to come programming will become more and more an activity of mathematical nature.

For a programming language to be simple in the sense of being amenable to proof is a completely different thing from being simple in the sense of being easy to understand informally. Yes, bastardized versions of his ideas did make it into the mainstream; I doubt it was always even due to his direct influence, and that's what the wiki article I posted three comments earlier is about.


it takes 300 pages for him to get to the point of developing simple algorithms of the type you meet in the first chapter of an algorithm textbook

That's impossible; A Discipline of Programming has only 217 pages of text. Surely you were so enthralled by your reading that you didn't notice skipping through the covers to the book lying below!


I (re)learned programming this way. I studied CS at Eindhoven University of Technology, where Dijkstra ran a research group for years.

You need to understand that Dijkstra's primary research focus was algorithms. Mathematically deriving algorithms from formal specifications isn't a weird idea at all. Sure, it feels cumbersome when the algorithm you're deriving is a commonly known one, like a mergesort or a binary search. The idea is that you can derive other, more specific algorithms in the same way, and you'll have proven their correctness.
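As a small taste of what such a derivation yields (my own sketch, not an example taken from Dijkstra's book), here is the integer square root developed invariant-first: fix the postcondition a*a <= n < (a+1)*(a+1), choose the invariant a*a <= n, and let the proof obligations dictate the guard and the loop body:

```python
def isqrt(n: int) -> int:
    """Largest a with a*a <= n, developed from the invariant a*a <= n."""
    assert n >= 0                    # precondition
    a = 0                            # establishes the invariant: 0*0 <= n
    while (a + 1) * (a + 1) <= n:    # guard: postcondition does not yet hold
        a = a + 1                    # preserves the invariant
    # invariant plus negated guard give: a*a <= n < (a+1)*(a+1)
    return a

print(isqrt(17))  # 4
```

The loop is not invented and then verified; the guard and body are read off from the requirement that the invariant be established, preserved, and combine with the exit condition to give the postcondition.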

These days, algorithms are not a major part of most computer systems. Trying to apply Dijkstra's approach to a user interface or a JSON parser makes little sense. But this does not invalidate the method for its intended domain. It also does not make Dijkstra's ramblings misguided nonsense.


"These days, algorithms are not a major part of most computer systems."

I disagree... While GUI developers don't worry about O(n^2) versus O(n log n) performance, sites like Google wouldn't respond in real time without a focus on the best underlying algorithms.

I agree with the rest of what you say.


How to get a rat out of a maze. http://www.cse.usf.edu/~rtindell/courses/DataStructures/Proj...

On a serious note, you do have a good point. I haven't read much of Dijkstra's work myself.


Well, he didn't derive his algorithm using the approach he advocated, here is what he himself has to say on it:

One morning I was shopping in Amsterdam with my young fiancée, and tired, we sat down on the café terrace to drink a cup of coffee and I was just thinking about whether I could do this, and I then designed the algorithm for the shortest path. As I said, it was a twenty-minute invention. In fact, it was published in ’59, three years later. The publication is still readable, it is, in fact, quite nice. One of the reasons that it is so nice was that I designed it without pencil and paper. I learned later that one of the advantages of designing without pencil and paper is that you are almost forced to avoid all avoidable complexities. Eventually that algorithm became, to my great amazement, one of the cornerstones of my fame. I found it in the early ’60s in a German book on management science -- “Das Dijkstra’sche Verfahren.” Suddenly, there was a method named after me.

Source: http://conservancy.umn.edu/bitstream/107247/1/oh330ewd.pdf


tl;dr: Programming has been ignored because hardware has a much more visible payoff. The impact of computers and the innovation in hardware will "be but a ripple on the surface of our culture, compared with the much more profound influence they will have in their capacity of intellectual challenge without precedent in the cultural history of mankind." That is, the intellectual impact of programming is more significant than the impact made by innovation on the hardware side. At least, that's what I think Dijkstra is saying.

Some gems:

Test driven development, 1972.

Today a usual technique is to make a program and then to test it. But: program testing can be a very effective way to show the presence of bugs, but is hopelessly inadequate for showing their absence. The only effective way to raise the confidence level of a program significantly is to give a convincing proof of its correctness. But one should not first make the program and then prove its correctness, because then the requirement of providing the proof would only increase the poor programmer’s burden. On the contrary: the programmer should let correctness proof and program grow hand in hand.

For loops have brain damaged us.

Another lesson we should have learned from the recent past is that the development of “richer” or “more powerful” programming languages was a mistake in the sense that these baroque monstrosities, these conglomerations of idiosyncrasies, are really unmanageable, both mechanically and mentally. I see a great future for very systematic and very modest programming languages. When I say “modest”, I mean that, for instance, not only ALGOL 60’s “for clause”, but even FORTRAN’s “DO loop” may find themselves thrown out as being too baroque. I have run a little programming experiment with really experienced volunteers, but something quite unintended and quite unexpected turned up. None of my volunteers found the obvious and most elegant solution. Upon closer analysis this turned out to have a common source: their notion of repetition was so tightly connected to the idea of an associated controlled variable to be stepped up, that they were mentally blocked from seeing the obvious. Their solutions were less efficient, needlessly hard to understand, and it took them a very long time to find them.


>I have run a little programming experiment

Does anyone know what these test problems and solutions were? Or have similar examples?


I came away from that paper with notes to track down exactly the same information. Alas, there does not seem to be a definitive answer. Googling a unique phrase from the relevant text, this seems to be the main discussion:

http://www.techques.com/question/13-113751/What-task-did-Dij...

The "answer" is the classic Dining Philosophers problem, which Dijkstra himself invented to illustrate the issues of concurrent programming. It isn't clear that this is actually what he meant, but it's probably the best we are going to get now that he is gone.


The thing that comes to mind is the C idiom for copying strings:

while (*dest++ = *src++);

"...their notion of repetition was so tightly connected to the idea of an associated controlled variable to be stepped up, that they were mentally blocked from seeing the obvious."


I'm not sure I agree; aren't dest and src still variables that are being stepped up? Maybe they are not "controlled variables"; I don't know what that means. This is a task that is eminently suited to repetition; any loop makes sense. Plus, the article mentions FORTRAN's DO loop (not just for loops incrementing a counter).

I was expecting an example where a loop worked, but a simpler mathematical solution exists without a loop.

Anyone else?
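One candidate of the kind being asked for (my guess, not necessarily what Dijkstra's experiment used): summing 1..n. The habitual solution steps a controlled variable, while the "obvious" solution needs no loop at all:

```python
def sum_loop(n: int) -> int:
    # The habitual version: a controlled variable stepped up each time.
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_closed(n: int) -> int:
    # The loop-free version: Gauss's closed form n(n+1)/2.
    return n * (n + 1) // 2

print(sum_loop(100), sum_closed(100))  # 5050 5050
```

A programmer whose notion of repetition is welded to a stepped index writes the first version reflexively and may never look for the second.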


Is this example really relevant today? Who cares if you write while (*dest++ = *src++); or for (int i = 0; i < n; i++) { dest[i] = src[i]; }? Was he just worried that the number of keywords and programming concepts was too large and made the language too complex? He would loathe C++ then... Still, we seem to get by.


That point is really helpful. It reinforces the point that TDD is not just about testing; it forces you to think differently.

Conceiving simple tests to prove that a simple piece of programming works keeps it simple. Simple is powerful.


I think you're misinterpreting what Dijkstra said and getting it almost backward. His point is that testing is inadequate: only a convincing proof of correctness is sufficient to conclude that software works without error. In his words:

> program testing can be a very effective way to show the presence of bugs, but is hopelessly inadequate for showing their absence

But proofs are hard. Therefore he advised that the program and its proofs (he didn't say tests) be created together so that the program could be constructed in such a way as to make the proofs easier.
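A crude illustration of the difference (mine, not Dijkstra's): a test checks a handful of points, whereas the proof obligations (precondition, invariant, postcondition) can at least be recorded alongside the code as it is written:

```python
def max_of(xs: list) -> int:
    """Maximum of a non-empty list, with its proof sketch carried as assertions."""
    assert len(xs) > 0                       # precondition
    m = xs[0]
    for k in range(1, len(xs)):
        assert m == max(xs[:k])              # loop invariant: m is max of the prefix
        if xs[k] > m:
            m = xs[k]
    assert m == max(xs)                      # postcondition
    return m

print(max_of([3, 1, 4, 1, 5]))  # 5
```

The assertions here are run-time checks rather than a static proof, so they fall well short of what Dijkstra had in mind; but they show the shape of "proof and program growing hand in hand" as opposed to testing after the fact.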


I find his point about the "economic need" for programming to become more efficient interesting: at the time, software was about as expensive as hardware, and hardware was about to get drastically less expensive, and so

  If software development were to continue to be the same clumsy and
  expensive process as it is now, things would get completely out of
  balance. You cannot expect society to accept this, and therefore we
  must learn to program an order of magnitude more effectively.
And yet, society has accepted it. It's now a truism that programmers cost more than hardware. Then again, it doesn't seem like his hoped-for revolution has occurred, either, so I guess he hasn't really been disproven and is merely guilty of being too optimistic.


These are just engineering tradeoffs. Sure you need to weigh them carefully, but the fact that we spend more on software than hardware does not indicate a horrible failure in the industry.

For example, processing power, communication speed, and storage have improved by orders of magnitude, but battery capacity is perhaps better by one factor of ten. Much of the advance in battery life comes from reduced power consumption by other components. Much of the advance in software development has come by creating inefficient abstractions that let us create things faster but squander CPU, network, and battery life.

Most companies today are willing to burn up some hardware performance in order to reduce software development time. If they're burning too much, they'll fix it or they'll go out of business. Premature optimization is still not a good idea.


I guess my post sounds more pessimistic than I meant it to be. I don't think it indicates any horrible failure, I just find funny the exact way that Dijkstra's vision failed to come to pass. Honestly, as a programmer, I kind of like it that hardware is cheap and programming is expensive. ;)


This statement dates back to a time when people would buy a machine and hire some programmers to write a program for it; that is, it's about the enormous expense of one-off, in-house, custom programs. IBM's System/360 was around but portable operating systems hardly existed yet (In 1972, Unix was just getting started).

Since then, programs are shared and run on many more machines; the cost is amortized through portability and writing general-purpose tools. Dijkstra was right to say that something had to change, but he didn't anticipate the efficiency gains from sharing portable software, at least not in this essay. Instead he seems to have thought that programmers should learn to write programs faster. We've made some progress on that, but the great efficiency gains happened at a higher level.


It's sort of swinging back around with the scale of some of the systems in place today. Relatively small increases in program efficiency can save a lot of money in power consumption, physical server space needed, and other related data center costs.


An old friend, so many postings, so little discussion. Given the usual high standard of discussion here on HN, it's a shame that there's been so little about this popular submission. FWIW, I've upvoted this, because I really do want to see a balanced discussion about it from an up-to-date viewpoint.

It was the Turing Award Lecture in 1972 - it's over 40 years old. Printed in "Classics in Software Engineering" by Yourdon Press, 1979, ISBN 0917072146.

HTML: http://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340....

PDF: http://www.cs.utexas.edu/~EWD/ewd03xx/EWD340.PDF

Here are some of the previous submissions here on HN:

https://news.ycombinator.com/item?id=86288

https://news.ycombinator.com/item?id=109724

https://news.ycombinator.com/item?id=126638

https://news.ycombinator.com/item?id=135111

https://news.ycombinator.com/item?id=156505

https://news.ycombinator.com/item?id=449806

https://news.ycombinator.com/item?id=1179277

https://news.ycombinator.com/item?id=1649246

https://news.ycombinator.com/item?id=1672262

https://news.ycombinator.com/item?id=1799296 <- 3 comments

https://news.ycombinator.com/item?id=1894784 <- 8 comments

https://news.ycombinator.com/item?id=2011732

https://news.ycombinator.com/item?id=3060828

https://news.ycombinator.com/item?id=5035560

https://news.ycombinator.com/item?id=5266220

https://news.ycombinator.com/item?id=6112467 (This item)


I dunno about others, but I am a slow reader, and this will probably take me about 20 minutes to finish reading. I have a general rule on sites like HN and Reddit: if something takes more than 10 minutes to read, I am probably not going to read it (unless it's extremely interesting, important, or relevant to the present time, based on the comments).

I spend about an hour on HN a day, in small bouts. Most of the time I will skim over titles to see if there is anything interesting; if there is and it's a long read (like this one), I will save it to "Pocket" to read later. I will probably get around to reading this a few weeks or even a month from now.

I suspect very few people will read it on the spot, which is why even interesting submissions like this don't get a lot of discussion.


Ditto. This sounds like a great article, but it looks immensely dense. It's most definitely on my "to-read" list, but that list rarely actually gets read through...

Also, although the author carries immense weight, and I'm sure the ideas put forth in the article are fantastic, what benefit will I get out of the article other than "hmmm, well that was interesting"? It's not a tutorial on a tool I can put into practice, or necessarily any sort of technique I can use in my day-to-day life (at least before reading, it seems this way). It looks like a philosophical article about the art of programming and being a programmer.

I love these kinds of articles, but their return on the large time investment required is ultimately not very large. This is in my relatively limited experience, of course.


These are two of the most depressing comments I've ever read, not just on Hacker News but on the whole internet. So much so that I wonder if they were written with a sarcasm that escapes me.

They represent a philosophy that is referenced often, but always before as an unintentional consequence of our internal laziness -- a laziness that should be fought against. Here, however, it is being presented as a conscious choice. Indeed, a right, proper, and good choice.

The commenters seem to be arguing that reading is only worth the time if the content has been distilled to its basic facts, and further that the facts need to be immediately actionable. Have we no room for soul? Do we lack the energy to take general concepts and apply them to new areas in new ways?

When we break a longer piece of writing down and extract just its main theses, we make it easier and quicker to understand, but we also neuter and even change the meaning. Sometimes what we learn or experience is subtle. Sometimes writing doesn't give us a todo list; instead it ever-so-gently shades and nudges all our todo lists.


> The commenters seem to be arguing that reading is only worth the time if the content has been distilled to its basic facts, and further that the facts need to be immediately actionable. Have we no room for soul? Do we lack the energy to take general concepts and apply them to new areas in new ways?

You are reading too much into it. I (and probably the person you replied to) have nothing against long-form articles, as a matter of fact I prefer them.

On a typical day, I will spend 1-2 hours on a book, I will read many smaller articles here on HN and on Reddit, I will check out my RSS reader, I will read work-related email, I will do actual work, I will train for my marathon (alternating days of running and weights), which takes about 2 hours, spend time with my family, socialize, and hopefully get some sleep too. It's all about how you manage your time, not about distilling long-form pieces into bite-size chunks.

If given a choice between reading a long-form article online or reading a book, I will read a book.

There is only so much time in a day, and so much to do. I save long articles like this to read on my lazy or slow days.


Fair enough. Obviously I don't know your situation and I directed my rant too much at your specific comment and not the general issue that I really wanted to speak to.


The first comment suggests that the reason there aren't a lot of comments is because people save the article to read later. That really has nothing to do with the philosophy you're talking about.

As for the second, there's nothing wrong with coming to HN for a certain kind of article and overlooking other kinds. Do I have to read every article, on the chance that it affects me in a positive way, in order to avoid criticism?


> I suspect very few people will read it on the spot, which is why even interesting submissions like this don't get a lot of discussion.

Unfortunate, but true. I prefer to batch up my online reading time so that I can get through the denser articles and still have time to read the five minute blog posts.


It's indeed a classic. I think one reason why there's relatively little discussion is that in the intervening years Dijkstra has been proven so correct that there's relatively little to say.

But I suspect that the main reason why it's hard for most people to talk about his work is that they lack the historical context to understand what he's talking about. For someone who has never studied or used FORTRAN, ALGOL 60, or any of the macro languages he alludes to, his criticisms seem very abstract. For someone who has seen those languages, they're concrete and visceral.

Looking back twenty years, I now understand that one of the highest-value courses in my CS degree was the "Survey of Programming Languages" class. We spent two to three weeks studying and working with each of LISP, FORTRAN, ALGOL-60, Smalltalk, and a couple of others I don't remember. I enjoyed the class, but at the time I wasn't developed enough to think more than "man, people sure have come up with some strange ways of doing things; OK, back to C, I love my filesystems class."

Now I recognize how precious that exposure was. It's kind of like traveling; exposure to different cultures teaches you as much about your own culture as about theirs. You don't always realize what assumptions you're making until you see other people making different assumptions.

This is the value of formal CS education, i.e. Dijkstra's life work. Some people ask whether a CS degree is worthwhile when the Internet makes it so easy to learn how to program. It's the difference between university and vocational training; if you only want to fix cars, you just need some skills classes and time spent apprenticing in a mechanic's shop. If you want to be an automotive engineer, you need an engineering education. If you just want a job, you don't need college; if you want to participate in the core of what our civilization has to offer, you need an education (self-study or formal, doesn't matter).

(Tip for the kids in school today: If you're at university and you find yourself frequently saying "Why do I have to learn this crap? I'll never use it!" then you might just be wasting your time and money. Do yourself a favor and drop out, unless someone else is paying for your play time. But if you learn for the sake of learning, if you recognize that learning how to learn, to prepare for a lifetime of learning, is the point of education, then stay in school, since you'll reap the rewards many times over.)


A recent version of a programming languages survey class available to anyone is the Coursera class on Programming Languages from Dan Grossman:

https://www.coursera.org/course/proglang

It will be taught again in the fall. He covers three languages (SML, Racket, and Ruby) in ten weeks, hitting three of the four quadrants on statically typed/dynamically typed and functional/object-oriented.


> Printed in "Classics in Software Engineering" by Yourdon Press, 1979

It's interesting how terminology has changed. In the 1970s you might have been able to say that Dijkstra was part of a field called "software engineering", in that he was writing about methodology for developing software. But in the 1980s it became a fairly different field, focused more on how to develop effective processes for large-scale development (modeling, testing, code review, team structure, tooling, etc.). By 1988 Dijkstra was vehemently against that version of software engineering.

Here's his article on that subject: http://www.cs.utexas.edu/~EWD/transcriptions/EWD10xx/EWD1036...

A number of these phenomena have been bundled under the name "Software Engineering". As economics is known as "The Miserable Science", software engineering should be known as "The Doomed Discipline", doomed because it cannot even approach its goal since its goal is self-contradictory. Software engineering, of course, presents itself as another worthy cause, but that is eyewash: if you carefully read its literature and analyse what its devotees actually do, you will discover that software engineering has accepted as its charter "How to program if you cannot.".


I reread this piece every few years and agree that there's no reason this discussion can't occur every few months/years on HN (so I'm upvoting this too).

And since there's a lack of comments on many of the threads, I'll add my opinion to this one. This was written by a man who's got a long-term love affair with programming. I recognize this because I am the same way, and it's actually rarer than you might think.

Most of the people who were my colleagues have gone on to management, or more likely switched to other careers. I've been in management repeatedly (even C-level) and have had to talk my way back into software architecture and development. I wrote my first program 41 years ago and it's still my favorite past-time.


Maybe it is so because this is a very wordy article with Dijkstra having just his usual axes to grind? The debate about this article should be a debate about the merits of proving program correctness, even though people nowadays read in Dijkstra many other things he probably did not even mean; anyway, this debate has been done to death in the period when there was still any controversy about using it as a technique for day-to-day programming.

I think the problem with Dijkstra is that A) he views every program as a huge algorithm, whereas 90% of programming now is doing IO and user interfaces, and B) he doesn't take into consideration that people can have different brain wiring than he has and be effective with other techniques than the ones he advocates.


Any content aggregator where stories float to the top but are increasingly depressed according to the passage of time suffers from this. Articles requiring a few minutes are ideal, while longform essays are usually marked for later reading. By the time someone has a comment, the thread is buried under mountains of short, easily-digestible "Reasons I Like Go" and "New MySQL Vulnerability"-type posts.


maybe HN is not what you are taking it for?


Enlightening, thanks for posting.

"[LISP] has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.", that's pretty profound ;-)


"[The speaker] managed to ask for the addition of about fifty new “features”, little supposing that the main source of his problems could very well be that it contained already far too many “features”. The speaker displayed all the depressing symptoms of addiction, reduced as he was to the state of mental stagnation in which he could only ask for more, more, more... "

LOL


Has anyone here studied under Dijkstra?

I only know of one programmer who did - one of the best hackers I've ever met, who eventually dropped out of UT to program full time. I never asked about his opinions of the professor, and anything I could write would be second hand.

I had a professor out of UT for my programming languages class. When asked, "How do you go about debugging interpreters?" he responded, "I don't debug. I prove every line of code correct before I type it."

Was Dijkstra inspiring? Or just a pedantic fuddy-duddy?


This article feels like it's from a completely different era -- an early age of computers with wild discoveries and unexplored frontiers.

But in reality it wasn't that long ago and a lot of the early computer pioneers are still alive. Computing is still a really young field!


One of my professors, Prof. Vineeth K Paleri, at NIT Calicut (http://cse.nitc.ac.in) asks the students to read this article by Dijkstra in their first class with him. Was an interesting read indeed.


What's interesting to me is that Dijkstra (and Wirth for that matter) didn't particularly like Lisp, or at least were very late to appreciate it.


>The third project I would not like to leave unmentioned is LISP, a fascinating enterprise of a completely different nature. With a few very basic principles at its foundation, it has shown a remarkable stability. Besides that, LISP has been the carrier for a considerable number of in a sense our most sophisticated computer applications. LISP has jokingly been described as “the most intelligent way to misuse a computer”. I think that description a great compliment because it transmits the full flavour of liberation: it has assisted a number of our most gifted fellow humans in thinking previously impossible thoughts.


A really interesting read! I am glad I read this very long article. Worth the time!


A tldr version please?


Sorry, this ain't Reddit.

If it is too long for you, then it is only good you don't waste your precious little time on this planet reading it.


Sorry, but tl;dr has been part of this site for years, and there is nothing wrong with asking for a summary of a complex essay. Indeed a good summary often sparks interesting conversations.


Thank you for being considerate.


A humble man speaks about programming.

Take some time, this one is worth stepping back from the Internet firehose of information and sipping the lemonade of prose.


That's a joke right? "Dijkstra" and "humble" just don't go together: http://www.youtube.com/watch?v=9KivesLMncs


One would need much more than the OP to know that--hence much is lost by a tl;dr abridgment ;).


Since "tldr" means "too long didn't read", maybe you need some of that humbleness yourself?


It has nothing to do with humbleness.


I'd appreciate a tldr version too.



