But there’s almost a ‘choice paralysis’ today. People have so many options they don’t know where to start.
Not sure anything comes close to opening up BASIC or HyperCard and just making something, and seeing the results immediately.
Agreed on the fundamentals — as a mostly self-taught programmer, it took me a long time to learn and understand the power of computer science concepts.
It's also interesting to dig into the author's name - apparently a half century ago he had some reputation in tech circles, but as far as I can tell he's mostly forgotten today.
He sadly seems to have passed away a couple of years ago:
Just a nitpick, but civil engineering is at least two orders of magnitude older since it goes back to Babylon.
The really cool thing about programming's scientific maturity is that it's entirely constructed. We know all the ground rules because we created them. The engineering challenge is not making a mess of things despite having a potentially perfect understanding of program semantics. So despite building aqueducts being a couple of orders of magnitude older than building programs, we actually understand the abstract rules of programming better than we do hydrodynamics.
True, a lot of these arguments have been beaten to death, but times do change. Every now and then some fundamental assumption underpinning those arguments shifts, and often the change comes from another industry. It's worth reassessing the basics from time to time.
But public clouds, for example, aren't really like historical timesharing: the underlying tech, capabilities, and demand are so different that things really are different this time.
For a lot of people trying to just accomplish some specific goal, learning to program in C (as per the article) is probably not the best approach unless they're into OS kernels or embedded programming. Instead, they might well be better off stitching together some cloud services of various types. Not everyone's objective is passing a leetcode whiteboarding interview at some ad tech company.
But a language is tied to its execution context and semantics. This leads to either dividing up languages into "natlangs" and "conlangs" depending on usage patterns and style, or to studying programming solely from the systems perspective and ignoring linguistics altogether.
I wonder how things would have been different had we, as a community, rejected this terminology and stance. What if, going even further, we had rejected the idea that computing can be made "simple" or "intuitive" or "mainstream", and instead forced folks to learn to program against APIs in order to even use computers?
There's no justification for it. Maybe 20% of the population - at best - is even capable of that kind of programming. Most people simply don't do symbolic abstraction at that level, and forcing them to try would create resentment, not literacy.
And "conlangs" are indeed different to "natlangs." There's definitely a case to be made for teaching everyone at least one extra language. But the kind of abstract thinking required for conlangs is adequately covered by basic STEM.
There might be a case for some very basic experience with programming in schools. But expecting the entire population to be able to do it at a professional level makes no more sense than expecting the entire population to have the same skills as qualified doctors, lawyers, architects, or pilots.
There are fewer than 30 million developers globally, out of a population of nearly 8 billion.
No one should feel intimidated by changing their own headlights, though. You don't need to program at a professional level, or have professional tools, to "program" an Excel spreadsheet to handle your monthly budget, or to write a shell script that periodically searches your photos folder for new files to copy to a backup drive.
If one wants to pay for convenience, OK, but I do think there's real value in equipping the average person with more than some very basic experience with programming.
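That photo-backup script really is within reach of a non-professional. Here's a minimal sketch of what it might look like; the `PHOTO_SRC`/`PHOTO_DEST` paths and the stamp-file approach are my own assumptions, not anything from the comment:

```shell
#!/bin/sh
# Sketch of a periodic photo backup: copy files added since the last run.
# PHOTO_SRC / PHOTO_DEST defaults below are hypothetical; adjust to your setup.
SRC="${PHOTO_SRC:-$HOME/Pictures}"
DEST="${PHOTO_DEST:-$HOME/photo_backup}"
STAMP="${PHOTO_STAMP:-$HOME/.photo_backup_stamp}"

[ -d "$SRC" ] || exit 0   # nothing to back up
mkdir -p "$DEST"

if [ -f "$STAMP" ]; then
    # Only copy files newer than the marker left by the previous run
    # (note: this flattens subfolders; rsync would preserve the tree)
    find "$SRC" -type f -newer "$STAMP" -exec cp -p {} "$DEST"/ \;
else
    # First run: copy everything
    find "$SRC" -type f -exec cp -p {} "$DEST"/ \;
fi

touch "$STAMP"            # mark this run so the next one copies only newer files
```

The "periodically" part is just a cron entry along the lines of `0 20 * * * sh $HOME/backup_photos.sh` — a nightly run at 8pm. None of this requires professional tooling, which is rather the point.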
The high school I attended fairly recently (mid-2010s) had ZERO programming classes and only about 4 AP classes; the middle schools had basic typing classes. I don't know whether this is changing at a rapid pace, but I would hope we could improve computer literacy by making kids take a few basic classes about the things they use every day.
I did not see a reference to the Classics, Greek, or Latin in the article…
> In fact, though Cicero could never compete with computer games when it comes to "making learning fun," conquering the conjugations of his lost tongue probably makes a lot more sense when it comes to learning to learn than sifting through GOTO statements in Basic, unrelated to our living language.
Adding on, and picking on this particular phrase "when it comes to learning to learn": I took Latin and Old English in college, two indispensable (/s) courses for my life (they were actually fun for me, but of greatly limited utility). The main thing I did learn in both, especially as languages and the classics were not my field of study, was how to study. Those were the courses where I finally picked up using flashcards properly, making good notes, etc., out of necessity; my math and CS courses were generally "easy" for me without much effort, so I took to them more naturally.
Cicero wrote in Latin, and is generally considered one of the most influential writers to do so. He also was strongly trained in the Greek/Hellenistic traditions and copied many ideas from there into Latin. I think that the author is exhorting the reader to learn Latin rather than Basic, and more generally to learn the classics rather than modern mathematics. For what it's worth, I think we need to study both; we need more Pirsigs, Hofstadters, and Carrolls, who have studied both classical philosophy and also modern computer science.
Thanks for pointing to Cicero as one of the Classics. I know this but completely glossed over the connection, too busy thinking about the then-nascent software industry.
Morlocks might want to learn programming, not because it's useful for eating Eloi, but because they're the sort of people for whom "purchasing an automobile for a cross-country trip [and] first [studying] cartography, then [proceeding] to obtain aerial and satellite photographs of the proposed route, and finally [drawing] a detailed map for the whole journey" sounds like a brilliant yak shave.
(Sandberg-Diment has left out the parts where obtaining the satellite photographs involves SDR hacking into downlink telemetry and drawing the detailed map first requires implementing a geometric-algebra based direct-to-framebuffer renderer)
My mother purchased my first computer (an Apple ][e from the local Macy's) in 1983. We didn't have much money to purchase software. I didn't even understand there was a software industry, let alone where I might purchase it.
But that computer came furnished with some basic software that allowed one to write BASIC and, also, to use a mouse with a paint program (yes, before Macintosh debuted).
While his main point stands, that not everyone needs to, nor should, learn to program computers, what Sandberg-Diment misses is the sheer size of the burgeoning home computer market and how the personal computer would revolutionize and fundamentally alter the world.
Reading "Personal computers: does everyone need to learn programming?" is slightly shocking for me me because having lived in that world, the difference between what Sandberg-Diment casually suggests and its real-world manifestation could have never been forecast.
> First, it allows you to develop software that is not available commercially, and in some cases it lets you customize purchased software to serve your specific needs better.
The ability to modify software "to serve your specific needs better" is a general gesture to client-side scripting, software consulting, and even FOSS. Linux did not exist in 1984 (Torvalds was 15 years old and his magnum opus was still 6 years away). Empires can (and did) fit in the gap between Sandberg-Diment's practical observation and the real-world consequences of software customizability.
> But does this mean that whoever wants to use a computer must also write the software for it? Would someone purchasing an automobile for a cross-country trip first study cartography, then proceed to obtain aerial and satellite photographs of the proposed route, and finally draw a detailed map for the whole journey? Hardly. It is far easier to go to the A.A.A. and get standard maps or that organization's special trip sheets.
How could anyone have known that, a scant 30 years later (the 2010s), people would carry a pocket-sized computer which (for the most part) would obviate the use of paper maps for navigating to unknown destinations? That entire industries supporting the production of paper maps would be dramatically scaled back because a globally connected infrastructure involving microprocessor manufacturing, interface design, wireless communication, and (literal) rocket science would become available to nearly all comers?
Sandberg-Diment's practical answer to "does everyone need to learn programming?" is comforting, persuasive, and correct. But the impractical answer, that everyone should consider learning programming, would be to catch a glimpse of the future and the massive transformations that widely available computing would bring within a generation.