I wouldn't say I use them often (and when I do, it's usually some form of business process modeling rather than software control flow) but I don't think it's quite fair to assert that flowcharts became obsolete with punch-cards.
At the very least, they are a useful "language" to describe program-like logic to non-programmers. That includes beginner programmers, as others have noted, but also includes interested non-technical parties. In my experience it is the best way to explain/document convoluted decision trees to non-programmers.
More broadly, I hear a lot of complaints about UML and other diagrammatic ways to design or describe programs. I don't really get it. I mean, I understand why one might think that the formal UML language is complicated and convoluted (because it is that) and why one might think it fails to live up to the promise of a visual way to specify/write software (because it does fail to do that), but surely I'm not the only one who just drops most of the formalism and uses a pseudo-UML format to communicate (and sometimes even reason about) software design? How do you sketch programs without some UML-like notation? (I suspect UML was partially a formalization of existing ways to sketch programs to begin with.)
Funny, the more my job involves explaining to developers what is needed rather than coding it myself, the more I tend to fall back on flowcharts and diagrams to describe the higher level business processes.
I'm all in favor of agile direct communication with the client, but one decent flowchart can save spending hours in meetings trying to get everyone to understand what the problem is we need to solve.
Although flowcharting is no longer needed to design every little detail of the code flow, it's still pretty much an essential skill to have in software development.
The main problem with flowcharts for programming is that flowcharts are procedural, whereas much of what we do in programming is declarative: the flowchart describes more how to do something than what has to be done (at some level).
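A toy illustration of that distinction (my example, not the commenter's): the procedural version spells out the step-by-step control flow a flowchart would draw, while the declarative version states only what result is wanted.

```python
# Procedural: explicit step-by-step control flow -- each box and
# decision diamond of a flowchart maps directly onto a statement here.
def sum_even_squares_procedural(numbers):
    total = 0
    for n in numbers:
        if n % 2 == 0:
            total += n * n
    return total

# Declarative: describes *what* is wanted, not *how* to iterate;
# there is no obvious flowchart to draw for a single expression.
def sum_even_squares_declarative(numbers):
    return sum(n * n for n in numbers if n % 2 == 0)

print(sum_even_squares_procedural([1, 2, 3, 4]))   # 20
print(sum_even_squares_declarative([1, 2, 3, 4]))  # 20
```

The flowchart for the first version has a loop-back arrow and a decision diamond; the second version gives a flowchart nothing to depict, which is roughly the point being made above.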
Of course, this depends on at which level the flowchart is drawn.
Slightly off-topic, but I remember doing flow charts in college back in 2008. I hated it. Everyone hated it. It was boring, awkward, and confusing, especially for those of my classmates who had never written actual code before.
The assignment was to break down everyday activities (e.g. brushing your teeth) into atomic steps, and have conditionals and loops in there. Not only was it unclear how granular the steps were supposed to be, but I also thought it was probably the lamest, most uncool way to introduce someone to programming.
At another college they used Scratch for that, which was a whole lot more fun, because it makes it easy to avoid syntax errors, draw graphics, get instant feedback, and have a nice visualization of the program flow/structure.
And those god-damn Nassi–Shneiderman diagrams. Ugh.
"But what will most certainly be forgotten is there was a generation of students in the 1980s and 1990s who were encouraged to use flow charts, long after the reason for using them had disappeared."
I don't know if I entirely agree with his conclusion. The smartest people I know code with a pen and paper. When they're on the bus or subway. When relaxing, or thinking carefully while away from a computer.
Those who program in the shower and then write it down afterwards before they forget. Analogue programming tools (pen/paper) are great tools.
I thought the article was going to end with him pointing out that he found a great practice that we can all benefit from now. Planning code ahead of time is still better than typing away, we're just not forced to do it anymore.
My father is a retired EE and was programming embedded machines in assembly language before I "caught the programming bug". When I was mostly doing embedded systems, we had plenty we could talk about. That hasn't changed now that he's retired since the basis of hardware logic and software logic is the same (Boolean Algebra identities). But I've done some interesting things in software as a result ... like using Karnaugh maps to minimize complex branching.
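As a sketch of that Karnaugh-map idea (my own example, not the commenter's actual code): treat the nested branches as a truth table, read a minimal sum-of-products off a 3-variable K-map by hand, and verify exhaustively that the simplified expression agrees with the original branching on every input.

```python
from itertools import product

# Nested branching over three boolean flags, as it might appear in code.
def nested(a, b, c):
    if a:
        if c:
            return b
        return True
    if c:
        return b
    return False

# The same truth table, minimized by hand on a 3-variable Karnaugh map:
# the ones group into (a AND NOT c) OR (b AND c).
def simplified(a, b, c):
    return (a and not c) or (b and c)

# Exhaustively check all 8 input combinations agree.
for a, b, c in product([False, True], repeat=3):
    assert nested(a, b, c) == simplified(a, b, c)
```

The exhaustive check is cheap for a handful of flags and is a good habit whenever you hand-minimize branching logic this way.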
Incidentally, I'm pushing 50 and used punch cards in my computer math class in high school as well as for my lower-level computer science classes in college (Cmp Sci 201 was Fortran at Penn State back then). My son is a second semester senior in Media Effects (currently studying what affects app store engagement) and also has a minor in IST. Computing has been around long enough for three generations to "partake".
Well, no, she programs in COBOL...so we have nothing to talk about :). I do remember when I was having a hard time in upper level Calculus thinking that my mom had to go through the same classes to get a college degree here, except with much less English ability, so the least I could do was stick it through. But we've never talked much about programming; I think for my parents, it was just a way to get a job after moving here.
Note that Lisp, COBOL, Fortran and Algol were all invented more than 50 years ago (incidentally, 3 of the 4 are still moderately common).
A programmer working in the 1970s or 1980s could easily have a child 20 to 30 years old now. I can't readily find the BLS Handbook of Labor Statistics from the period online, but there must have been quite a few of them/us by the mid-1980s.
incidentally, 3 of the 4 are still moderately common
I assume you're talking about Algol as the one that's no longer moderately common. But the Lisps, COBOL and Fortran in use today generally aren't the same as 50 years ago (maaaybe legacy COBOL/Fortran stuff is?) - especially the Lisps used today (mostly Common Lisp, Scheme and new Lisps like Clojure) are very different from what they were 50 years ago, to the degree that they are really entirely different languages. I'd go so far as to say that Lisp is not moderately common anymore, but that languages called Scheme, Common Lisp, Clojure etc. are. Saying that Lisp is still moderately common because of these languages is no different from saying that Algol is still moderately common - or rather, that Algol dialects known as C, Ada, Java, C# and so on are.
"I don’t know what language engineers will use in the future, but I know they’ll call it Fortran."
(A famous quote, but a quick Google doesn't yield a definitive attribution for it.)
Yes, Algol was what I meant as the odd one out. Your point is well taken, although I think this implies more that Algol is among the living than that modern Lisp/COBOL/Fortran are completely divorced from their first-generation ancestry. (To be clear, I read your comment as agreeing with me here, I'm just highlighting the distinction.)
Yes, I wasn't really disagreeing, but it irks me a bit how a lot of people talk about languages (usually Lisp) as if the exact same language that was created 60 years ago were in use today, even though they have evolved significantly over time - and yet the Algol-derived languages are treated completely independently as entirely new entities altogether. Someone mentioned this on HN a day or two ago too.
It would be wrong to say that Lisp/COBOL/Fortran are divorced from their first-generation ancestry. I think a lot of people talk about Lisp as if it's still the same language because on the surface it looks that way: the syntax is mostly still intact and the core features (conses, lists, homoiconicity, macros) are all still there, yet Scheme is still a different beast from Common Lisp, Clojure, Emacs Lisp and what Zetalisp was. Algol-derived languages, since they have much more complex syntax than s-expressions, have much more varied syntax and therefore look like very different languages, though they still have a lot of semantics in common with Algol.
So I think what I'm saying is (at least for Lisp and Algol - I don't know enough about COBOL and Fortran to know how different they now are from 50 years ago) that in neither case are the languages in use today the same languages that were in use 50 years ago, but both families have descendants in common use today which can be clearly traced to their first-generation ancestry.
My mum's ex IBM, she worked in Cobol and Fortran and taught me to program. She doesn't know much modern stuff, so I can't really talk about my daily work with her, sadly. When I was at university, in 2006 there was an "IBM Mainframe Contest" - I was using the same OS she used to use! So we had some fun conversations about that :)
My aunt was a programmer. She gave me my first computer (TRS-80) and taught me to program in the 80s. By the time I got my degree and started working as a programmer she had retired to northern BC and was no longer into technology. I owe my career to her.
Lest we forget "computer" and "compiler" were once job descriptions, and (if I'm not mistaken) the "coder" was the person who punched the cards while the "programmer" was the person who fed those cards into the card reader. Timesharing systems and teletypes changed everything.
Software developers often talk about how automation changes people's lives but sometimes forget how much of their own industry was disrupted by automation.
The classes were for beginner programmers. It's a good idea for them to sketch out a rough idea of their program before they start coding so they can get clear in their head things like what loops to use.
Note that I suggest an informal sketch, rather than strict flowcharts with definitions for each box shape etc.
I think that there is still value in teaching flow charts to beginner coders. It'll get you used to pen-and-paper reasoning, and they are very conducive to better design and architecture when done at a higher level.
Those are great pictures, but the claim that drawing flow charts saved time in the punch card era doesn't make sense to me. My bet is that, then as now, good programmers soon discovered that they were a waste, favored only by textbook writers and managers. They are a non-programmer's idea of what a program ought to look like. There have always been far more powerful tools to work with in the absence of a keyboard, like pencil and paper—not to mention just plain thinking.
There are many programmers around who worked with punch cards. I have often heard them emphasize how important it was to get your program right the first time, because compile time was so lengthy and scarce. But I've never heard one say "that's why it was important to draw a flow chart first". If they had, I'd remember; it would have been such a surprise!
Some HNers must have been programming in those days. I'd like to hear from them about this.
I recently read Weinberg's Psychology of Computer Programming, which was written at about the time that time-sharing operating systems (like Unix) began to push out batch computing systems. This is exactly the sort of thing he was concerned about.
When you have a long turnaround time for feedback, you get a strong incentive to "desk check" before you send off your work.
Thanks for posting this. It always puzzled me when I was in college what the usefulness of flow charts was supposed to be. Instructors seemed unnaturally fixated on them. Now that I am an instructor, I occasionally find them useful on the whiteboard, but that's about it.