
Personal computers: does everyone need to learn programming? (1984) - GuiA
http://www.nytimes.com/1984/01/17/science/personal-computers-does-everyone-need-to-learn-programming.html
======
cwoolfe
Writing software is easier in 2020 than it was when I started in 2002. Coding
education is more accessible, coding platforms are more inviting, languages
are more forgiving, IDEs are more helpful, we have millions of answers on
Stack Overflow, and there are open-source libraries that have probably
implemented whatever irks you. That being said, for the professional, I have
found no substitute for the academic rigor of a good CS curriculum to shape
how we think about organizing and engineering code well.

~~~
gdubs
So, maybe for a professional it’s easier. I feel that way about making iOS
apps — better time to start than ever!

But there’s almost a ‘choice paralysis’ today. People have so many options
they don’t know where to start.

Not sure anything comes close to opening up BASIC or HyperCard and just making
something, and seeing the results immediately.

Agreed on the fundamentals — as a mostly self taught programmer, it took me a
long time to learn and understand the power of computer science concepts.

~~~
cwoolfe
Definitely agree there is a choice paralysis, especially among web
technologies. I'm thankful that Apple iOS usually only has one way to do most
common things. I think that kind of leadership is helpful when the alternative
is choice paralysis. The flipside is that it constrains those who have a
legitimate reason to do it differently.

------
GuiA
I originally posted this (in 2013! reposted today prompted by HN's second
chance pool) because it struck me how, with very few modifications, this exact
article could be republished today. I find it fascinating that we can be
having the same arguments that people a half century ago were having, with
little to no awareness that we're repeating the exact same things. It makes me
realize that perhaps software is not as young a field as we like to sometimes
pretend (it's common to read on HN that e.g. software is so young and immature
compared to civil or electrical engineering, etc)

It's also interesting to dig into the author's name - apparently a half
century ago he had some reputation in tech circles, but as far as I can tell
he's mostly forgotten today.

[https://www.theatlantic.com/technology/archive/2016/05/what-happens-when-your-tech-predictions-tank/480990/](https://www.theatlantic.com/technology/archive/2016/05/what-happens-when-your-tech-predictions-tank/480990/)

He sadly seems to have passed away a couple of years ago:

[https://www.legacy.com/obituaries/name/erik-sandberg-diment-obituary?pid=189908085](https://www.legacy.com/obituaries/name/erik-sandberg-diment-obituary?pid=189908085)

~~~
throwaway0a5e
> I find it fascinating that we can be having the same arguments that people a
> half century ago were having, with little to no awareness that we're
> repeating the exact same things. It makes me realize that perhaps software
> is not as young a field as we like to sometimes pretend (it's common to read
> on HN that e.g. software is so young and immature compared to civil or
> electrical engineering, etc)

True, a lot of these arguments have been beaten to death but times do change
though. Every now and then some fundamental assumption on which arguments are
underpinned changes. Often times these changes come from other industries.
It's worth re-assessing the basics from time to time.

~~~
ghaff
On the one hand, it's easy to dismiss any new proposal with the argument that
"we've heard this 20 times before and it's never worked" and you'll usually be
right.

But public clouds, for example, aren't _really_ like historical timesharing,
because the underlying tech, capabilities, and demand are so different that
things really are different this time.

~~~
thisisnico
One of the big differences between now and then is network capacity. Once we
moved past basic text and into 3D, photos, and video, timesharing could not
work effectively over the internet; the bandwidth wasn't there yet. The only
way to continue was to bring the hardware home. Now that we have the network
capacity to handle almost anything, we're seeing things go back.

~~~
astrobe_
I am not sure why networking and cloud computing are in this thread. That is
certainly the last thing you want to try or use (except for Internet access
and all its learning resources) when learning to program. And it won't make
you as familiar with common _computer logic_ as programming, say, a Tetris
clone.

~~~
ghaff
It depends on what your objective is and what you're trying to accomplish.

For a lot of people trying to just accomplish some specific goal, learning to
program in C (as per the article) is probably not the best approach unless
they're into OS kernels or embedded programming. Instead, they might well be
better off stitching together some cloud services of various types. Not
everyone has as an objective passing a leetcode whiteboarding interview at
some ad tech company.

------
Kednicma
I like the bit at the end about studying the classics in Latin and Greek. It
helps me see that "language" is not quite the right word for programming
systems, and that this wrong word choice has led to writers falsely thinking
that learning to program is like learning a second language.

But a language is tied to its execution context and semantics. This leads to
either dividing up languages into "natlangs" and "conlangs" depending on usage
patterns and style, or to studying programming solely from the systems
perspective and ignoring linguistics altogether.

I wonder how things would have been different had we, as a community, rejected
this terminology and stance. What if, going even further, we had rejected the
idea that computing can be made "simple" or "intuitive" or "mainstream", and
instead required folks to learn to program against APIs in order to even use
computers?

~~~
TheOtherHobbes
This is the "I like tinkering with cars therefore everyone should be a
mechanic" argument.

There's no justification for it. Maybe 20% of the population - at best - is
even capable of that kind of programming. [1] Most people simply don't do
symbolic abstraction at that level, and forcing them to try would create
resentment, not literacy.

And "conlangs" are indeed different to "natlangs." There's definitely a case
to be made for teaching everyone at least one extra language. But the kind of
abstract thinking required for conlangs is adequately covered by basic STEM.

There might be a case for some very basic experience with programming in
schools. But expecting the entire population to be able to do it at a
professional level makes no more sense than expecting the entire population to
have the same skills as qualified doctors, lawyers, architects, or pilots.

[1] There are fewer than 30 million developers _globally_ , out of a
population of nearly 8 billion.

~~~
dfxm12
_This is the "I like tinkering with cars therefore everyone should be a
mechanic" argument._

No one should feel intimidated by trying to change their own headlights,
though. You don't need to program at a professional level or have professional
tools to "program" an Excel spreadsheet to handle your monthly budget or to
write a shell script that periodically searches your photos folder for new
files to copy to a backup drive.
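That photo-backup shell script really is only a few lines. The sketch below is a minimal illustration, not a robust backup tool: the function name and paths are made up, it uses a timestamp file to find what's new since the last run, and it flattens any subfolders into the destination.

```shell
#!/bin/sh
# backup_new_photos SRC DEST
# Copy files from SRC that were added or modified since the last run
# into DEST, using a hidden timestamp file to remember the last run.
backup_new_photos() {
    src=$1
    dest=$2
    stamp="$dest/.last_backup"

    mkdir -p "$dest"

    if [ -f "$stamp" ]; then
        # Later runs: copy only files newer than the timestamp file.
        # Note: this flattens directory structure into $dest.
        find "$src" -type f -newer "$stamp" -exec cp {} "$dest" \;
    else
        # First run: copy everything.
        cp -R "$src"/. "$dest"
    fi

    # Record when this run happened.
    touch "$stamp"
}

# Example invocation (hypothetical paths):
# backup_new_photos "$HOME/Pictures" /mnt/backup/photos
```

Dropping the function into a crontab entry would give the "periodically" part; the point is that this level of "programming" is well within reach of a non-professional.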

If one wants to pay for convenience, OK, but I do think there's real value in
equipping the average person with more than some very basic experience with
programming.

~~~
Kednicma
Yes, and to expand upon the magic word you used: Excel claimed 30mil users in
the mid-1990s [0] and various estimates I've read put the current usage at
somewhere between 500-600mil users. I think Excel is the most popular domain-
specific programming language on the planet.

[0] [https://news.microsoft.com/1996/05/20/more-than-30-million-users-make-microsoft-excel-the-worlds-most-popular-spreadsheet-program/](https://news.microsoft.com/1996/05/20/more-than-30-million-users-make-microsoft-excel-the-worlds-most-popular-spreadsheet-program/)

------
082349872349872
Eloi don't need to learn programming: "[by] the time you became truly
proficient at programming, chances are that whatever you set out to write
would be available in some form from a software publisher."

Morlocks might want to learn programming, not because it's useful for eating
Eloi, but because they're the sort of people for whom "purchasing an
automobile for a cross-country trip [and] first [studying] cartography, then
[proceeding] to obtain aerial and satellite photographs of the proposed route,
and finally [drawing] a detailed map for the whole journey" sounds like a
brilliant yak shave.

(Sandberg-Diment has left out the parts where obtaining the satellite
photographs involves SDR hacking into downlink telemetry and drawing the
detailed map first requires implementing a geometric-algebra based direct-to-
framebuffer renderer)

------
mistersquid
That was fascinating!

My mother purchased my first computer (an Apple ][e from the local Macy's) in
1983. We didn't have much money to purchase software. I didn't even understand
there was a software industry, let alone where I might purchase it.

But that computer came furnished with some basic software that allowed one to
write BASIC and, also, to use a mouse with a paint program (yes, before
Macintosh debuted).

His main point, that not everyone needs to (nor should) learn to program
computers, may hold; what Sandberg-Diment misses is the sheer size of the
burgeoning home computer market and how the personal computer would
revolutionize and fundamentally alter the world.

Reading "Personal computers: does everyone need to learn programming?" is
slightly shocking for me because, having lived in that world, I know the
difference between what Sandberg-Diment casually suggests and its real-world
manifestation could never have been forecast.

Two examples:

> First, it allows you to develop software that is not available commercially,
> and in some cases it lets you customize purchased software to serve your
> specific needs better.

The ability to modify software "to serve your specific needs better" is a
general gesture to client-side scripting, software consulting, and even FOSS.
Linux did not exist in 1984 (Torvalds was 15 years old and his magnum opus was
still 6 years away). Empires can (and did) fit in the gap between Sandberg-
Diment's practical observation and the real-world consequences of software
customizability.

Second:

> But does this mean that whoever wants to use a computer must also write the
> software for it? Would someone purchasing an automobile for a cross-country
> trip first study cartography, then proceed to obtain aerial and satellite
> photographs of the proposed route, and finally draw a detailed map for the
> whole journey? Hardly. It is far easier to go to the A.A.A. and get standard
> maps or that organization's special trip sheets.

How could anyone have known that a scant 30 years later (the 2010s) people
would have a pocket-sized computer which (for the most part) would obviate the
use of paper maps for navigating to unknown destinations? That entire
industries supporting the production of paper maps would be dramatically
scaled back because a globally-connected infrastructure involving
microprocessor manufacturing, interface design, wireless communication, and
(literal) rocket science would be publicly available to nearly all comers?

Sandberg-Diment's practical answer to "does everyone need to learn
programming?" is comforting, persuasive, and correct. But the impractical
answer--that everyone _should consider_ learning programming--would have been
to catch a glimpse of the future and the massive transformations that widely
available computing would bring within a generation.

