How do we tell truths that might hurt? (1975) (utexas.edu)
60 points by giocampa on Oct 14, 2020 | 73 comments



Personally, I've always thought this letter was the plainest proof that Dijkstra's pithy quotes (and those who parrot them) should not be taken seriously, no matter how entertaining they are. That list covers most of the major languages of the day; what he would have to say about $(YOUR_FAVORITE_LANGUAGE) would surely be equally unkind were he alive today.

That isn't to say that he did not have valid reasoning for his dislikes as a language expert and a mathematician, but those reasons are not ones the bulk of practitioners would likely agree are valid.


Everything Dijkstra says makes sense if you believe that computing cannot scale without correctness. He was wrong about that one thing, and to be fair to him, it's still a little amazing in retrospect, even when we can look back over the decades of history through which we've created a society dependent on vast, unimaginably complex, world-spanning computing systems built out of pieces that are virtually all broken.

And it's not like he was a lone crank railing against the 99% who were moving forward with a well-founded idea of how computing could scale indefinitely by composing incorrect programs. Just like today, 99% of programming, including programming for government programs and vital infrastructure, was done by people who were only hoping to make their own projects succeed well enough for the next six months to three years. I'm sure there were brilliant people who made intelligent arguments against the need for correctness, but their arguments didn't carry the day. Complacency and short-term thinking did.

In that context, Dijkstra's pessimism and his use of harsh, attention-getting language make a lot of sense. How many people at the time really understood that in 2020 every part of our civilization would depend on code compiled by compilers with bugs, linked with libraries with bugs, in virtual machines with bugs, on operating systems with bugs, on CPUs with bugs in their microcode, and yet it would still all mostly work?


> if you believe that computing cannot scale without correctness. He was wrong about that one thing, and to be fair to him, it's still a little amazing in retrospect,

That is clearly wrong if one is willing to take a moment to stop gazing at the wonder of pure mathematics and look at the outside world. There is no notion of "correctness" for the pyramids of Egypt, the dykes of the Netherlands, Milan Cathedral, or the world economy and yet those huge-scale systems all function.

> Dijkstra's pessimism and his use of harsh, attention-getting language make a lot of sense.

It makes sense in terms of Dijkstra's personality, but it makes no logical sense. If you believe mathematical correctness is such a valuable principle, then you should be able to leverage that same principle in your own arguments. The fact that Dijkstra couldn't dispassionately prove that correctness was vital and had to resort to emotionally loaded weasel words undermines his own claim.


> That is clearly wrong if one is willing to take a moment to stop gazing at the wonder of pure mathematics and look at the outside world. There is no notion of "correctness" for the pyramids of Egypt, the dykes of the Netherlands, Milan Cathedral, or the world economy and yet those huge-scale systems all function.

Wow, couldn't disagree with this more, at least on your examples of civil engineering (buildings, dikes). There are testable, comprehensive physical principles that govern whether any of these engineered products function in their most fundamentally-intended ways. This is the reason most buildings are resilient and don't collapse under load, or that dams keep water from flowing uncontrolled. You can debate "correctness" in the sense of the purpose the product serves, but there is the principle of correctness of construction which your civil engineering examples (and anything truly engineered) satisfy. Correctness in construction is not subjective.


> Correctness in construction is not subjective.

Perhaps, but it's never certain. So things that are "correct" can still be wrong.

Here's how Einstein expressed his disagreement with what you are saying:

> As far as the laws of mathematics are certain, they do not refer to reality, and as far as they refer to reality, they are not certain.

Nothing in physics (or any other science) is certain. All one can do in science (other than mathematics) is disprove. One cannot prove.

To prove you would have to be all knowing. You would have to have taken into account all relevant aspects of all physical characteristics. For example, all known aspects of quantum mechanics, including the uncertainty principle, AND all relevant unknown aspects of quantum mechanics, which is guaranteed to be hit and miss, and can rationally be expected to contain an unknown number of misses that isn't zero.

It's the human propensity to ignore these fundamentals that leads to things like the Tacoma Narrows Bridge collapse.[0]

[0] https://en.wikipedia.org/wiki/Tacoma_Narrows_Bridge_(1940)


> There are testable, comprehensive physical principles that govern whether any of these engineered products function in their most fundamentally-intended ways.

These are statistical engineering tests of the probability of failure under certain conditions. That is not at all what Dijkstra would consider to be "correctness". Dijkstra is talking about mathematical proof. In mathematics, one does not say "1 + 2 = 3 plus or minus 0.1 with a safety factor of 2".


When we determine whether an aircraft trims, or a boat floats, it is not correct to say it is a statistical engineering test of the probability of failure. You formalize the properties the system must have in order not to fail, and you use conservation equations to ultimately compute whether or not the system satisfies those properties. Margins are used to account for parameters that have uncertainty attached. All of these elements are subject to symbolic mathematical formalism. One can quite clearly state the inequalities that must be satisfied to be, e.g., controllable.
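A toy sketch of that kind of margin-checked inequality, written in Python (the numbers and the 1.5 safety factor are purely illustrative, not real engineering values):

    RHO_WATER = 1000.0   # kg/m^3, density of fresh water
    G = 9.81             # m/s^2

    def floats(hull_displacement_m3, total_mass_kg, safety_factor=1.5):
        # Archimedes: available buoyant force must exceed weight by the chosen margin.
        buoyant_force = RHO_WATER * G * hull_displacement_m3
        weight = total_mass_kg * G
        return buoyant_force >= safety_factor * weight

    print(floats(hull_displacement_m3=12.0, total_mass_kg=6000.0))  # True

The point is only that the check is a symbolic inequality with an explicit margin, not a coin flip.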

It's unclear how this would be any different than mathematically formalizing a distributed system, identifying the properties that constitute correctness of operation of that distributed system, and symbolically proving that subject to certain assumptions, the distributed system model does or does not satisfy those properties. This would presumably be consistent with the Dijkstra view of mathematical correctness.


> There is no notion of "correctness" for the pyramids of Egypt, the dykes of the Netherlands, Milan Cathedral, or the world economy and yet those huge-scale systems all function.

I think it's impossible to come up with a rational judgment of whether the world economy "functions." I mean, those of us who are alive are alive, but there are a lot of things that have happened that haven't been ideal.

Besides the world economy, nothing you mention has the functional complexity of a large piece of software, and none of it was designed and built with much confidence, other than confidence that came from similarity to previous designs, and from an optimistic outlook that issues encountered during construction could be coped with. This wall is cracking, so we tear it down and rebuild it a little thicker. We have a few centuries, after all. I think you're right that modern software projects are analogous to ancient pyramids and medieval cathedrals. That's pretty much where we are. Yet somehow, working at that low level of sophistication, we've built working software that is orders of magnitude more complex than cathedrals.

I'm not saying I know how to improve on the situation, or even that we can; I'm just pointing out how different it is from, say, contemporary structural engineering where we can analyze the design of a novel structure and have a pretty good idea of whether it will be safe based on its geometry and known characteristics of materials and how they're joined. It's amazing that we've reached the scale we have with nothing comparable.

> If you believe mathematical correctness is such a valuable principle, then you should be able to leverage that same principle in your own arguments.

The idea that mathematical reasoning is on some spectrum with dispassionate discourse has motivated hundreds of years of attempts to improve natural language by making it more mathematical. Result: some interesting contributions to philosophy but very little change in how people communicate. People are still people.


Imo, the real issue is that people seem to think that smart mathematicians or scientists don't argue like normal people, don't get emotional when arguing and always talk from divine truth.

The issue has nothing to do with mathematical correctness, so you can't apply the same principles. This is not math. And Dijkstra's not being logical in his arguments isn't because he is a dick; likely this style of arguing simply worked for him in his life.


Good points. I think a way of looking at it is that much of the software development we do today is about communication, not engineering. Think of web sites: they are all about communicating something. And in a sense, when you write a program you do so to communicate with the computer.

Of course there are critical software engineering projects as well, Boeing MAX comes to mind.


The context of this EWD is somewhat lost to history, because Dijkstra's point of view in many cases won so thoroughly that we cannot conceive of what he was describing.

The BASIC he is describing had no call stack, nothing that would resemble a function in today's languages. FORTRAN at the time handed masses of state around in global variables and the latest version had just added subroutines and functions. This is after ALGOL 60 had established the model that we all take for granted today.

I think the bulk of practitioners today would absolutely agree with him.


Back in grad school, I had the pleasure of working with some very old FORTRAN. Written conservatively using a few bits of FORTRAN 77, but mainly in a FORTRAN 66 style. The control structures I found in there would make you go crosseyed. At one point, there was a jump that took you from outside a loop to the middle of the body of the loop. Today's no-longer-all-caps Fortran manages to elude EWD's criticism by not being that language any longer. You have to use a lot of obscure compiler flags even to make the old stuff compile.


Having written code in the early microcomputer BASICs, I'm reasonably familiar with the context but I think it's a bit overblown. Sure, BASIC was limited in the syntactic sugar that was available to manage program complexity but, then again, the microcomputers of the era were so limited that complexity wasn't really possible without dropping to machine language anyway. And machine language definitely didn't have any of the syntactic sugar of ALGOL 60 or other high-level languages.


As a massive fan of Dijkstra myself, true but up to a point.

About programming languages I don't think there's any question he was right. Sometimes we agree with him even without realizing it; for example, the original argument of "GOTO considered harmful" was that GOTO statements decreased "linearity" by having the control flow jump to random places. The recent trend in mainstream PLs to adopt functional-style control structures (JS array methods, Java streams, etc.) is predicated on the exact same rationale.
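A tiny Python illustration of my own (not Dijkstra's) of what that "linearity" buys you:

    orders = [("widget", 3), ("gadget", 0), ("gizmo", 7)]

    # Imperative version: the reader simulates the loop, the condition and the
    # accumulator to know what `shipped` ends up holding.
    shipped = []
    for name, qty in orders:
        if qty == 0:
            continue
        shipped.append(name)

    # Declarative version: the same result reads top to bottom as one statement.
    shipped2 = [name for name, qty in orders if qty > 0]

    assert shipped == shipped2 == ["widget", "gizmo"]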

I'm also in broad agreement on his stance about natural language being just flat out wrong in the context of computing.

But the man made an exorbitant number of claims, many of which are just provably wrong. The bits about BASIC and COBOL are particularly egregious examples of his at-times cavalier attitude. After all, arrogance is measured in nano-Dijkstras.


The goto thing is one of my pet peeves; somehow it caught on, maybe because it made people feel good to look down on others. There are basically two arguments against it: Knuth's article arguing for goto (read it if you haven't), and the observation that gotos and state machines (which are considered best practices) are basically equivalent (just convert the goto labels to your states).
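To make the labels-to-states conversion concrete, here's a rough Python sketch (the state names and the toy token collector are made up; in a language with goto, each state would simply be a label):

    def collect(tokens):
        state, out, i = "START", [], 0
        while state != "DONE":
            if state == "START":
                state = "READ" if i < len(tokens) else "DONE"
            elif state == "READ":        # would be a `read:` label with goto
                out.append(tokens[i])
                i += 1
                state = "START"          # would be `goto start`
        return out

    assert collect(["a", "b"]) == ["a", "b"]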


I'm familiar with Knuth's argument. That argument was primarily based on the fact that it's hard to replicate the exact control flow of a goto statement with structured programming. While the argument certainly has merit (a relatively more recent example is Duff's device), 2020 compilers are more often than not able to find the optimal structure anyway, making it largely redundant. It's like saying programs should be allowed to rewrite their own machine code just because a von Neumann architecture technically permits it: there is a point, but it's not in contradiction with goto statements being mostly the inferior way to solve the problem. From a 2020 perspective I'd argue that if you really need that kind of performance you are probably already inlining assembly, so it's kind of a moot point.

> gotos and state machines (which are considered best practices) are basically equivalent

WTF? State machines can be implemented in a lot of ways: mutually recursive calls, pattern matching, function pointers... In what sense are they "equivalent" to gotos? Also what does it mean that state machines "are considered best practices"?


Yeah. Much of this list simply comes off as arrogant and narrow-minded. Whether or not it was tongue-in-cheek, I don't know. This one in particular:

> It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.

is at best a hyperbolic joke. I dearly hope he wasn't being serious.


Dijkstra provides glimpses of his reasoning scattered across his missives. For example, for FORTRAN in https://www.cs.utexas.edu/~EWD/transcriptions/EWD03xx/EWD340...

"The second major development on the software scene that I would like to mention is the birth of FORTRAN. At that time this was a project of great temerity and the people responsible for it deserve our great admiration. It would be absolutely unfair to blame them for shortcomings that only became apparent after a decade or so of extensive usage: groups with a successful look-ahead of ten years are quite rare! In retrospect we must rate FORTRAN as a successful coding technique, but with very few effective aids to conception, aids which are now so urgently needed that time has come to consider it out of date. The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use. FORTRAN’s tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes. I pray daily that more of my fellow-programmers may find the means of freeing themselves from the curse of compatibility."


I don't think any language that could ever be invented could be described as leaving programming students "mentally mutilated beyond hope of regeneration", no matter how bad it is. So either he was being hyperbolic (and the sarcasm was lost on me), or he was being shockingly dogmatic.

It's the kind of ranting you do with your colleagues in a pub after-hours, not something you publish in a professional context and frame as "unpleasant truths", as if they've been proven through scientific measurement.


Tenured professors, particularly of EWD's stature, get a lot of license to say what they want. I can certainly recall a few of my instructors who were rather colorful in a variety of ways.


I've no doubt they have license to do so without damaging their careers. That doesn't make it a good take.


Because it makes you uncomfortable? Or because you have reason to believe otherwise?


Because any subjective assertion this narrow is going to be some amount of wrong.


Dijkstra was obsessed with mathematical correctness and certainly knew it was pointless to pursue in human language speaking about complex topics. He wanted people to pay attention, and they did. I think it's a testament to the effectiveness of his writing, and proof that it was not merely inflammatory, that people still read him and debate his ideas today, when his ideas have as little mainstream acceptance as they ever had.


If you're used to exclusively programming with line numbers and you're a kid, sure, it might take a little bit of time and instruction to comprehend that line numbers aren't necessary. But from what he wrote it just sounds like this guy is whining about (when faced with students like this) having to actually do his job.


Fortran and BASIC at the time were, in many ways, the antithesis to his ideas of what a programming language should be.

They did not permit recursion (direct or indirect). Almost by design they encouraged spaghetti code (the thing Structured Programming was meant to work against). They encouraged (or required) global state. They discouraged modularity. They were effective languages (in the sense that people got stuff done in them), but they were poor languages when compared to the capabilities of their contemporaries.


I didn't have the impression that line numbers were important. Most of the time they did nothing and were unwieldy to maintain; they were more of a nuisance than anything else.


I disagree and I believe most developers would agree that it would have been a far better use of time to learn 6502 assembly than Apple BASIC, at least if one's goal was to actually learn how to program computers.

To this day if I saw 6502 assembly on a resume I'd be intrigued and if I saw BASIC I'd view it as a yellow flag.


Then you say "There are better options than BASIC for teaching new programmers." That's not what was written.

I've read countless anecdotes from perfectly capable programmers who not only learned, but first discovered their interest in the field, via BASIC. No language "mentally mutilates programmers beyond hope of regeneration". And BASIC probably isn't even the worst one for learning purposes.


He calls out BASIC specifically because it was ubiquitous and often the only option on the machines novices would have available at the time. It logically follows that most programmers of that era, capable or otherwise, started on BASIC. Since he’s not here to tell us, I can only assume Dijkstra met the occasional exceptional individual who began learning to program in a more mathematically rigorous language during that time and was left with a vastly more favorable impression.

Remember, Dijkstra was a professor and he was an expert on teaching programming. He would have far more experience with the effect he claims than I do. I hazard to guess that is true with respect to you too.

And yes he’s obviously engaging in rhetorical hyperbole in the text you quote.


> "He calls out BASIC specifically because it was ubiquitous and often the only option on the machines novices would have available at the time."

EWD's quote is from 1975, which is just barely the start of the microcomputer era (the Altair 8800 came out in '75 and the Apple II in '77), so he's not referring to those. Most people writing programs would have had access to mainframes and therefore had access to multiple language interpreters and compilers, I would think.



The Altair 8800 came out in January 1975; Gates and Allen's Altair BASIC, the first microcomputer BASIC, came out "shortly thereafter" and EWD498 was written in June 1975. That's hardly long enough for an explosion. Again, Dijkstra probably had never even heard of either when he penned his missive and it's clear that his opinion of BASIC was well formed even before 1975.


People had to create specialized languages for learning purposes to compete with the built-in graphics features of BASIC.


Some of these have aged so perfectly that I only need substitute a few letters:

> Python —"the infantile disorder"—, by now nearly 30 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.

> It is practically impossible to teach good programming to students that have had a prior exposure to JS: as potential programmers they are mentally mutilated beyond hope of regeneration.

> The use of Java cripples the mind; its teaching should, therefore, be regarded as a criminal offence.

The quotes about languages were always controversial, weren't they? But it's clear now in retrospect what Dijkstra was complaining about. He found FORTRAN to trick people into thinking that programming was merely about specifying arithmetic operations in a certain order, considered BASIC to require mental models which rendered folks memetically blind to actual machine behaviors, and thought COBOL tried to be legible to management but ended up being confusing to everybody.

> Many companies that have made themselves dependent on AWS-equipment (and in doing so have sold their soul to the devil) will collapse under the sheer weight of the unmastered complexity of their data processing systems.

Yep.

> In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to Python, so that they can share each other's programs, bugs included.

Reproducibility is a real problem, and sharing code is just the first step. It's an embarrassment to physics and mathematics that we don't have a single holistic repository of algorithms, but have to rebuild everything from scratch every time. (Perlis would tell Dijkstra that this is an inevitable facet of computing, and Dijkstra would reply that Perlis is too accepting of humanity's tendency to avoid effort.)

> You would rather that I had not disturbed you by sending you this.

Heh, yeah, let's see what the comment section is like.


What a great comment. I'm sure it'll get flagged into oblivion, though - thus directly proving Dijkstra's point.


What's with the hate for Python? I've noticed some of this hate IRL and it's strange. Python is one of the best languages out there right now in terms of ease of use and future maintainability (with types).


First let me state that I do not hate Python, but I would suppose two items are at the top of the list for people who do. The first, and the one that makes me dislike Python (not hate it), is that Python enshrined into the language syntax that whitespace and structure count towards correctness. If it is not properly spaced according to the spec, it does not work. Some people see this as an arbitrary constraint put on the developer and do not like it. Now, I will grant you that others see it as a way to keep code readable; both arguments have their merits. Pretty much anything beyond noting how the two camps see it devolves into a holy war. Personally, being someone who sees the beauty of LISP-derived languages' lack of syntax, I fall into the first camp and don't see the value of it for me.
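To make the whitespace point concrete, here's a toy Python example of my own where only the indentation of one line differs, and the meaning changes:

    def total_a(xs):
        total = 0
        for x in xs:
            total += x
            total += 1        # inside the loop: adds 1 per element
        return total

    def total_b(xs):
        total = 0
        for x in xs:
            total += x
        total += 1            # outside the loop: adds 1 once
        return total

    assert total_a([1, 2, 3]) == 9
    assert total_b([1, 2, 3]) == 7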

I think the second issue is that it is fairly safe to say the path from Python 2 to 3 was not well thought out and has been a disaster. A lot of people were burned by it, and it left a bad taste in a lot of people's mouths.

That being said, Python enjoys a huge userbase so I would not worry about the hate, it's just not the language of choice for some people and that is fine.


I disagree with a lot of this. I feel these are minor minor complaints. You hear this kind of stuff for lisp too. For example, keeping track of parens is just as annoying as keeping track of indentation.

Also, the migration from Python 2 to 3 is actually the best migration ever: breaking changes in a language that has a massive library ecosystem around it, with all of that code migrated along with the libraries. I haven't heard of any migration done as successfully as Python's. It required movement of the language and the entire community to make it happen.

Most migrations refrain from breaking changes and instead they take the less risky route and make the language backwards compatible at the cost of being more bloated. See C++ 17 and C++ 20.

But despite this, yeah these are complaints that I've heard, but this is totally unrelated to the hate I see. Like parentheses in lisp is something to complain about but not something to hate lisp for.

The hate I believe is more egotistical than anything. One interviewer told me I could code in any language other than python because python makes things too simple. The hate is because they believe python programmers are stupider than normal programmers.

That being said my favorite language is not python. It's my least favorite language out of all the languages I know well but not a language I hate.


I've written Python for over a decade. There are two problems with Python.

First, the excellent readability leads directly into hard-to-read code structures. This might seem paradoxical but Dijkstra insisted that the same thing happened in FORTRAN, and I'm willing to defer to his instinct that there's something about the "shut up and calculate" approach that physicists have which causes a predilection for both FORTRAN and Python.

Second, Python 2 to Python 3 was horrible, and created political problems which hadn't existed before. Now, at the end of the transition, we can see how badly it was managed; Python 2 could have been retrofitted with nearly every breaking change and it would have been lower-friction. Instead, there's now millions of lines of rotting Python 2 code which will never be updated again. Curiously, this happened in the FORTRAN world too; I wasn't around for it, but FORTRAN 77 was so popular compared to future revisions and standardizations that it fractured the FORTRAN community.
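To be concrete about "retrofitted": the __future__ mechanism already let a Python 2.7 module opt into several of the Python 3 changes one file at a time, so a minimal sketch of that path looks like this:

    from __future__ import print_function, division, unicode_literals

    print("half:", 1 / 2)   # 0.5 under true division (plain Python 2 gives 0)
    text = "caf\u00e9"       # a unicode string literal on both 2.7 and 3
    print(text)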


>Second, Python 2 to Python 3 was horrible, and created political problems which hadn't existed before. Now, at the end of the transition, we can see how badly it was managed; Python 2 could have been retrofitted with nearly every breaking change and it would have been lower-friction.

This doesn't make any logical sense. You're saying take the breaking changes in Python 3 and put them into Python 2? That's just a version number. You can call it version 2.999999.8 and do all the changes in there and the outcome is identical.

No. Every breaking change must have a downstream change in every library that uses that breaking change. That's the reality of breaking changes. No way around it.

Tell me of such a migration as huge as python 2->3 that was as successful. For sure there were huge problems along the way and it took forever. However I have heard of very very few migrations in the open source world that ended up with an outcome as successful as python.

>First, the excellent readability leads directly into hard-to-read code structures.

I don't agree with this either. You refer to fortran but most programmers here haven't used it so you'll have to provide an example for readers to see your point.


I'm not going to argue Python politics with you, but suffice it to say that only a few communities have had such a bad major version upgrade experience. Here are some off the top of my head for comparison, from roughest to smoothest:

* Perl 5 to Perl 6: So disastrous that they rolled back and Perl 6 is now known as Raku

* PHP5 to PHP7: Burn my eyes out, please! But of course PHP has unique user pressures, and a monoculture helps a lot

* Python 2.4 to Python 2.7: Done in several stages, including deprecation of syntax, rolling out of new keywords, introduction of backwards-compatible objects and classes, and improvements to various semantic corner cases

* Haskell 98 to Haskell 2010: GHC dominated the ecosystem and now Haskell 98 is only known for being associated with Hugs, which knows nothing newer

* C++03 and earlier to C++11: Failed to deprecate enough stuff, but did successfully establish a permanent 3yr release cadence

* C99 to C11: Aside from the whole Microsoft deal, this was perfect; unfortunately Microsoft's platforms are common in the wild

Now consider how many Python 3 features ended up backported to Python 2 [0] and how divisive the upgrade needed to be in the end.

On readability, you'll just have to trust me that when Python gets to millions of lines of code per application, the organization of modules into packages becomes obligatory; the module-to-module barrier isn't expressive enough to support all of the readable syntax that people want to use for composing objects. If you want a FORTRAN example, look at Cephes [1], a C library partially ported from FORTRAN. The readability is terrible, the factoring is terrible, and it cannot be improved because FORTRAN lacked the abstractive power necessary for higher-order factoring, and so does C. Compare and contrast with Numpy [2], a popular numeric library for Python which is implemented in (punchline!) FORTRAN and C.

[0] https://docs.python.org/2/whatsnew/2.7.html#python-3-1-featu...

[1] https://github.com/jeremybarnes/cephes

[2] https://github.com/numpy/numpy


"Failed to deprecate enough stuff"

Did you by any chance observe any of the folks involved officially saying that deprecating things was one of the goals? I thought keeping working code working has always been one of C++'s official goals.


I love this quote: "There are only two kinds of languages: the ones people complain about and the ones nobody uses". Bjarne Stroustrup.


There's a lot to complain about for python. But I see genuine hate. People, groups and companies who literally refuse to use it.

I had an interviewer tell me that I couldn't code up the solution in python. I think it might be because python is so easy that people look down on it.


There's a certain cultural subset which tries to bolster its self-assessed superiority by rejecting things which are popular. This is especially common among people of a certain academic bent who are constantly playing one-upmanship games, desperately trying to be the smartest person in the room.

Python annoys those people because it's both relatively easy to get started with and far, far more successful than whatever their current favorite language is, and this is portrayed as people not getting it rather than having a more insightful discussion about whether other people might reasonably make decisions based on different needs, background, and resources rather than stupidity.


Your comment rings true as I'm one of those people who reject things which are popular. Thanks for pointing it out!


What modern language would Dijkstra approve of?


This is a very difficult question. We know somewhat his preferences, because he worked on implementing ALGOL 60 [0][1], but unfortunately we are blocked by a bit of incommensurability; in that time, garbage collection was not something that could be taken for granted. As a result, what he might have built in our era is hard to imagine.

That said, he did have relatively nice things to say about Haskell [2] and preferred Haskell to Java:

> Finally, in the specific comparison of Haskell versus Java, Haskell, though not perfect, is of a quality that is several orders of magnitude higher than Java, which is a mess (and needed an extensive advertizing campaign and aggressive salesmanship for its commercial acceptance).

I imagine that he would have liked something structured, equational, declarative, and modular; he would have wanted to treat programs as mathematical objects, as he says in [2]. Beyond that, though, we'll never know. He left some predictions like [3] but they are vague.

[0] https://en.wikipedia.org/wiki/ALGOL_60

[1] https://www.cs.utexas.edu/users/EWD/MCReps/MR35.PDF

[2] https://www.cs.utexas.edu/users/EWD/transcriptions/OtherDocs...

[3] https://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/E...


He would probably form a completely different opinion; the world is nothing like what could be anticipated 30 years ago.

The first languages and their compilers were strongly driven by hardware constraints, a kilobyte of memory costing an arm and a leg.

Imagine storing function names in memory to compile the program: it doesn't fit in 1 kB. Imagine storing the whole source code in memory for processing: it doesn't work when there is less than 1 MB of memory available.

It seems ridiculous today, but these are the real reasons why things were made global back then, or why C/Pascal split code between a header and a source file.


There are so many languages available today that I'm sure there are plenty he would have approved of. For example, I think he might have appreciated Zig. If you read his work it's pretty easy to see his top priority is managing complexity and limiting surprise.


> his top priority is managing complexity and limiting surprise

Zig doesn't do either of those things. There are a fair number of criticisms of the author's mental model that I've seen voiced, some involving security.

What's worse, the community surrounding Zig (in particular, the Discord community) operates more like a cult - any negative questioning gets you shunned.

I was personally a huge fan of Zig until a number of questionable design decisions and dismissed bug reports led me to believe it will forever remain a toy language. I can't imagine Dijkstra approving.


That’s too bad. I was judging based on the overview of Zig that was recently posted here. I gladly defer to your more informed opinion on the subject, but I’ll maintain Dijkstra would have liked the design goal of executing the procedure as written absent fancy obfuscated control structures.

The cult-like attitude that many programmers have about languages certainly supports Dijkstra’s claims about the immaturity of the field.


I suspect, none of them. If he did approve of one, it would probably be Haskell.

And, for all his complaining, I don't know of any language that he authored. He's sure good at telling everyone that they're doing it wrong, though...


His rant-y EWDs seem to be the only ones people know. But his others go into much more detail about what he thinks and why. And while he didn't (to the best of my knowledge) directly create any language, he did implement an Algol 60 compiler and was involved in language design efforts throughout his life.

He helped create the ideas of Structured Programming, which most of us now take for granted, since pretty much every language in popular use these days is based on them.


Dijkstra co-created a rather influential language called Algol 60[1]. It and its immediate descendants are still used in scholarly CS work because it's so good for clearly describing algorithms. Past that, the influence of Algol 60 on virtually all modern programming languages is hard to overstate.

[1] https://en.wikipedia.org/wiki/ALGOL_60


Well, he co-authored a compiler. From what your link says, he wasn't part of the committee that created the language.


StandardML


Whichever one grandparent commenter likes best.


Your flame-bait revision is a perfect demonstration of what's wrong with the original post.


He wrote this in 1975, dissing most of the popular programming languages of the time. Does anyone know which language Dijkstra didn't think was a mess? (and please don't say GCL, his own unrunnable toy language)

I mean, if FORTRAN is a mess, then isn't ALGOL too? Is anyone here old enough to remember?


Well, Dijkstra together with Zonneveld wrote the first ever implementation of ALGOL 60 and, in fact, heavy-handedly pushed recursive procedure calls into the language. If not for him, who knows, Quicksort may not have been invented for another decade or so.

ALGOL 68, on the other hand... "The more I see of it, the more unhappy I become." [0]

[0] https://www.cs.utexas.edu/users/EWD/transcriptions/EWD02xx/E...


If I understand correctly, ALGOL had better ways of structuring code into functions than FORTRAN did. This let ALGOL do what eventually was called "structured programming", and FORTRAN do what was called "plate of spaghetti" programming.

This is not the same as saying that EWD thought that ALGOL was good...


I imagine he had at least some fondness for Algol 60 since he is co-credited with writing the first compiler for it.


That might give you fondness. Or disgust. In his case, I have no data about which one won.


Algol is quite different from Fortran.


Sure, so is COBOL. That wasn't my question.


Well, I believe the answer to your question is the edition of Algol he worked on. I personally think it’s one of the most elegant imperative language designs in history.





It is unfortunate the insult comic aspect is so prominent in how people remember Dijkstra.


Thankfully URL surgery into this archive allows people who want to remember more than a soundbite to do so.





