Personal computers: does everyone need to learn programming? (1984) (nytimes.com)
56 points by GuiA 38 days ago | 25 comments



Writing software is easier in 2020 than it was when I started in 2002. Coding education is more accessible, coding platforms are more inviting, languages are more forgiving, IDEs are more helpful, we have millions of answers on Stack Overflow, and there are open-source libraries that have probably implemented whatever irks you. That being said, for the professional, I have found no substitute for the academic rigor of a good CS curriculum to shape how we think about organizing and engineering code well.


So, maybe for a professional it’s easier. I feel that way about making iOS apps — better time to start than ever!

But there’s almost a ‘choice paralysis’ today. People have so many options they don’t know where to start.

Not sure anything comes close to opening up BASIC or HyperCard and just making something, and seeing the results immediately.

Agreed on the fundamentals — as a mostly self-taught programmer, it took me a long time to learn and understand the power of computer science concepts.


Definitely agree there is a choice paralysis, especially among web technologies. I'm thankful that Apple iOS usually only has one way to do most common things. I think that kind of leadership is helpful when the alternative is choice paralysis. The flipside is that it constrains those who have a legitimate reason to do it differently.


I originally posted this (in 2013! reposted today prompted by HN's second chance pool) because it struck me how, with very few modifications, this exact article could be republished today. I find it fascinating that we can be having the same arguments that people a half century ago were having, with little to no awareness that we're repeating the exact same things. It makes me realize that perhaps software is not as young a field as we sometimes like to pretend (it's common to read on HN that e.g. software is so young and immature compared to civil or electrical engineering, etc)

It's also interesting to dig into the author's name - apparently a half century ago he had some reputation in tech circles, but as far as I can tell he's mostly forgotten today.

https://www.theatlantic.com/technology/archive/2016/05/what-...

He sadly seems to have passed a couple years ago:

https://www.legacy.com/obituaries/name/erik-sandberg-diment-...


> it's common to read on HN that e.g. software is so young and immature compared to civil or electrical engineering

Just a nitpick, but civil engineering is at least two orders of magnitude older since it goes back to Babylon.

The really cool thing about programming's scientific maturity is that it's entirely constructed. We know all the ground rules because we created them. The engineering challenge is not making a mess of things despite having a potentially perfect understanding of program semantics. So despite building aqueducts being a couple orders of magnitude older than building programs, we actually understand the abstract rules of programming better than we do hydrodynamics.


> I find it fascinating that we can be having the same arguments that people a half century ago were having, with little to no awareness that we're repeating the exact same things. It makes me realize that perhaps software is not as young a field as we like to sometimes pretend (it's common to read on HN that e.g. software is so young and immature compared to civil or electrical engineering, etc)

True, a lot of these arguments have been beaten to death, but times do change. Every now and then some fundamental assumption underpinning those arguments shifts. Oftentimes these changes come from other industries. It's worth reassessing the basics from time to time.


On the one hand, it's easy to dismiss any new proposal with the argument that "we've heard this 20 times before and it's never worked" and you'll usually be right.

But public clouds, for example, aren't really like historical timesharing. The underlying tech, capabilities, and demand are so different that things really are different this time.


One of the big differences between now and then is network capacity. Once we went past basic text and into 3D, photo, and video, timesharing could not work effectively over the internet at the time; the bandwidth wasn't there yet. The only way to continue was to bring the hardware home. Now that we have the network capacity to handle almost anything, we're seeing things go back.


I am not sure why networking and cloud computing are in this thread. They are certainly the last things you want to try or use (except for internet access and all its learning resources) when learning to program. And they won't let you become as familiar with the logic of common computers as programming, say, a Tetris clone will.


It depends on what your objective is and what you're trying to accomplish.

For a lot of people trying to just accomplish some specific goal, learning to program in C (as per the article) is probably not the best approach unless they're into OS kernels or embedded programming. Instead, they might well be better off stitching together cloud services of various types. Not everyone's objective is passing a LeetCode whiteboarding interview at some ad-tech company.


Yes, connectivity is a huge difference. On the other hand, as there's more and more data-intensive activity happening outside the data center, we're actually seeing a general trend away from everything happening "in the cloud" as was being promoted in the late 2000s. See e.g. Nick Carr's The Big Switch.


I agree that we tend to forget previous lessons learned and to rediscover decades-old ideas as innovation. But on the other hand, I can't think of any engineering discipline where the fundamentals have been changing as rapidly for such a long time. I am pretty sure civil engineering would be thrown into some confusion too if concrete doubled its strength every two years for decades, making old bridge designs obsolete.


Small nitpick: 1984 was significantly less than a half century ago. But otherwise yes, the greater point is that history tends to teach us nothing.


Having been born in 1984, I'm definitely not ready to be 50 just yet.


I like the bit at the end about studying the classics in Latin and Greek. It helps me see that "language" is not quite the right word for programming systems, and that this wrong word choice has led to writers falsely thinking that learning to program is like learning a second language.

But a language is tied to its execution context and semantics. This leads to either dividing up languages into "natlangs" and "conlangs" depending on usage patterns and style, or to studying programming solely from the systems perspective and ignoring linguistics altogether.

I wonder how things would have been different had we, as a community, rejected this terminology and stance. What if, even further, we had rejected the idea that computing can be made "simple" or "intuitive" or "mainstream", and had instead forced folks to learn to program to APIs in order to even use computers?


This is the "I like tinkering with cars therefore everyone should be a mechanic" argument.

There's no justification for it. Maybe 20% of the population - at best - is even capable of that kind of programming. [1] Most people simply don't do symbolic abstraction at that level, and forcing them to try would create resentment, not literacy.

And "conlangs" are indeed different to "natlangs." There's definitely a case to be made for teaching everyone at least one extra language. But the kind of abstract thinking required for conlangs is adequately covered by basic STEM.

There might be a case for some very basic experience with programming in schools. But expecting the entire population to be able to do it at a professional level makes no more sense than expecting the entire population to have the same skills as qualified doctors, lawyers, architects, or pilots.

[1] There are fewer than 30 million developers globally, out of a population of nearly 8 billion.


> This is the "I like tinkering with cars therefore everyone should be a mechanic" argument.

No one should feel intimidated by trying to change their headlights, though. You don't need to program at a professional level or have professional tools to "program" an Excel spreadsheet to handle your monthly budget, or to write a shell script that searches your photos folder for new files to copy to a backup drive periodically.
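
For the curious, here's a minimal sketch of the kind of backup script described above, written in Python rather than shell for readability. The photo folder and the backup mount point are hypothetical; adjust them for your own machine:

    # Copy any photos not already present on the backup drive.
    import shutil
    from pathlib import Path

    PHOTOS = Path.home() / "Pictures"               # hypothetical source folder
    BACKUP = Path("/Volumes/BackupDrive/Pictures")  # hypothetical mount point

    for photo in PHOTOS.rglob("*"):
        if photo.is_file():
            dest = BACKUP / photo.relative_to(PHOTOS)
            if not dest.exists():  # only copy files we don't have yet
                dest.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(photo, dest)  # copy2 preserves timestamps

Schedule it with cron (or Task Scheduler on Windows) and you have the "periodically" part with no professional tooling involved.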

If one wants to pay for convenience, OK, but I do think there's real value in equipping the average person with more than some very basic experience with programming.


Yes, and to expand upon the magic word you used: Excel claimed 30 million users in the mid-1990s [0], and various estimates I've read put current usage somewhere between 500 and 600 million users. I think Excel is the most popular domain-specific programming language on the planet.

[0] https://news.microsoft.com/1996/05/20/more-than-30-million-u...


> There might be a case for some very basic experience with programming in schools.

The high school I went to somewhat recently (mid-2010s) had ZERO programming classes and about 4 AP classes. Middle schools had basic typing classes. I don't know whether this is changing at a rapid pace, but I would hope we could improve computer literacy by making kids take a few basic classes about the things they use every day.


> I like the bit at the end about studying the classics in Latin and Greek.

I did not see a reference to the Classics, Greek, or Latin in the article…


It wasn't explicit; you had to read into this statement (that studying Cicero was studying Latin):

> In fact, though Cicero could never compete with computer games when it comes to ''making learning fun,'' conquering the conjugations of his lost tongue probably makes a lot more sense when it comes to learning to learn than sifting through GOTO statements in Basic, unrelated to our living language.

Adding on, and picking on this particular phrase, "when it comes to learning to learn": I took Latin and Old English in college, two indispensable (/s) courses for my life (they were actually fun for me, but of greatly limited utility). The main thing I did learn in both (especially as languages and the classics were not my field of study) was how to study. Those were the courses where I finally picked up using flashcards properly, making good notes, etc., just out of necessity (versus my math and CS courses, which were, generally, "easy" for me without much effort; I took to them more naturally).


Quoting the article:

> In fact, though Cicero could never compete with computer games when it comes to ''making learning fun,'' conquering the conjugations of his lost tongue probably makes a lot more sense when it comes to learning to learn than sifting through GOTO statements in Basic, unrelated to our living language.

Cicero wrote in Latin, and is generally considered one of the most influential writers to do so. He also was strongly trained in the Greek/Hellenistic traditions and copied many ideas from there into Latin. I think that the author is exhorting the reader to learn Latin rather than Basic, and more generally to learn the classics rather than modern mathematics. For what it's worth, I think we need to study both; we need more Pirsigs, Hofstadters, and Carrolls, who have studied both classical philosophy and also modern computer science.


> I think that the author is exhorting the reader to learn Latin rather than Basic, and more generally to learn the classics rather than modern mathematics. For what it's worth, I think we need to study both;

Agreed.

Thanks for pointing to Cicero as one of the Classics. I know this but completely glossed over the connection, too busy thinking about the then-nascent software industry.


Eloi don't need to learn programming: "[by] the time you became truly proficient at programming, chances are that whatever you set out to write would be available in some form from a software publisher."

Morlocks might want to learn programming, not because it's useful for eating Eloi, but because they're the sort of people for whom "purchasing an automobile for a cross-country trip [and] first [studying] cartography, then [proceeding] to obtain aerial and satellite photographs of the proposed route, and finally [drawing] a detailed map for the whole journey" sounds like a brilliant yak shave.

(Sandberg-Diment has left out the parts where obtaining the satellite photographs involves SDR hacking into downlink telemetry and drawing the detailed map first requires implementing a geometric-algebra based direct-to-framebuffer renderer)


That was fascinating!

My mother purchased my first computer (an Apple ][e from the local Macy's) in 1983. We didn't have much money to purchase software. I didn't even understand there was a software industry, let alone where I might purchase it.

But that computer came furnished with some basic software that let one write BASIC and also use a mouse with a paint program (yes, before the Macintosh debuted).

His main point, that not everyone needs to (nor should) learn to program computers, holds up; what Sandberg-Diment misses is the sheer size of the burgeoning home computer market and how the personal computer would revolutionize and fundamentally alter the world.

Reading "Personal computers: does everyone need to learn programming?" is slightly shocking for me me because having lived in that world, the difference between what Sandberg-Diment casually suggests and its real-world manifestation could have never been forecast.

Two examples:

> First, it allows you to develop software that is not available commercially, and in some cases it lets you customize purchased software to serve your specific needs better.

The ability to modify software "to serve your specific needs better" is a general gesture toward client-side scripting, software consulting, and even FOSS. Linux did not exist in 1984 (Torvalds was a teenager and his magnum opus was still seven years away). Empires can (and did) fit in the gap between Sandberg-Diment's practical observation and the real-world consequences of software customizability.

Second:

> But does this mean that whoever wants to use a computer must also write the software for it? Would someone purchasing an automobile for a cross-country trip first study cartography, then proceed to obtain aerial and satellite photographs of the proposed route, and finally draw a detailed map for the whole journey? Hardly. It is far easier to go to the A.A.A. and get standard maps or that organization's special trip sheets.

How could anyone have known that a scant 30 years later (the 2010s) people would have a pocket-sized computer which (for the most part) would obviate the use of paper maps for navigating to unknown destinations? That entire industries supporting the production of paper maps would be dramatically scaled back because a globally connected infrastructure involving microprocessor manufacturing, interface design, wireless communication, and (literal) rocket science would be publicly available to nearly all comers?

Sandberg-Diment's practical answer to "does everyone need to learn programming?" is comforting, persuasive, and correct. But the impractical answer, that everyone should consider learning programming, would have been to catch a glimpse of the future and the massive transformations that widely available computing would bring inside a generation.



