
Modern computing: A short history and a shorter rant - dpkendal
http://dpk.io/moderncomputers
======
billyjobob
_Programming is easy, after all: all you need to understand is conditions and
repetition._

If you believe that, you have never tried to teach a class of non-programmers.

 _Instead, programming keeps getting harder on new computers all the time,
especially ones made by Apple_

Apple have made a lot of efforts to bring programming to the common user:
HyperCard, AppleScript, Automator. They were abandoned because no one was
interested in using them.

~~~
dpkendal
“If you believe that, you have never tried to teach a class of non-
programmers.”

No, you’re trying to teach them with the same kind of tool that I call ‘clumsy
and unsophisticated’ in the article.

“Apple have made a lot of efforts to bring programming to the common user:
HyperCard, AppleScript, Automator. They were abandoned because no one was
interested in using them.”

In many ways HyperCard was _still_ too difficult, but it was a great tool.
Many people _did_ use it to make simple programs, and some became programmers
from it. I’m not arguing that everyone should learn to program to the level of
being able to create and sell apps; HyperCard is a good example in that sense.

AppleScript is a failure because it’s a terribly difficult language even for
most working programmers. Its designers completely failed in that respect.

Automator is not programming.

~~~
Double_Cast
> No, you’re trying to teach them with the same kind of tool that I call
> ‘clumsy and unsophisticated’ in the article.

Despite the vitriol you've received, I think I understand what you're trying
to say here. I think you want a higher level programming language. And not
merely a successor to the latest zeitgeist, but essentially what C was to
punch tape. That is, binary might be fundamental to computation, but it isn't
fundamental to the ergonomics of programming. So the question is: is it
possible to somehow abstract programming to a higher level?

pg wrote that all other languages have evolved towards lisp. I think the trend
is a special case of a broader trend towards functional programming. So imho,
whatever comes next (given there is a next) will have to (for lack of a better
word) supersede lambda calculus.

~~~
dpkendal
You seem to be the only person in this thread who has grasped this idea;
though instinctively it feels to me like another formalism such as LC is the
wrong way to go.

~~~
Double_Cast
I'm new to programming, so I don't have much street cred. But I've heard
others call LC "the assembly language of math"[1]. And I think functionals are
really neat. So imho, I suspect LC's more than a formalism. But it drives me
up the wall that it doesn't have an elegant subtraction analog. This leads me
to believe there must be something better.
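
To show what I mean, here's a sketch of the standard Church-numeral
definitions, in ordinary LC notation ("pred" is the workaround usually
credited to Kleene):

    succ = λn. λf. λx. f (n f x)
    pred = λn. λf. λx. n (λg. λh. h (g f)) (λu. x) (λu. u)
    sub  = λm. λn. n pred m    -- "monus": m - n, clamped at zero

Successor is a one-liner, but predecessor has to thread a pair trick through
the whole numeral, and even then subtraction can't go below zero. Compare how
cheap addition and multiplication are, and the asymmetry is glaring.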

The answer sure can't be the Apple thing you mentioned. I imagine it's just a
GUI, a kludge. But if not LC, then what do _you_ think?

[1] [http://matt.might.net/articles/compiling-up-to-lambda-
calcul...](http://matt.might.net/articles/compiling-up-to-lambda-calculus/)

------
alanctgardner2
People seem to think the difficulty in coding is the mechanical act of writing
code. "If only you could draw code!", they cry, linking together a series of
blocks to represent program flow. The thing is, even if you make a language
where it's impossible to fail (every program does something, it's visually
intuitive when a program isn't valid), it still doesn't necessarily do _the
right thing_.

The difficulty in programming isn't the mechanical act of writing code; we
have copy-paste for that. The problem is having a mental model of the program
execution, and mapping it to the real-world problem. This is the difference
between good and bad programmers, and it applies to the population at large.

Case in point: I'm working with some soon-to-be grads from a CS program. In
Java, one of them instantiated a class, and set an attribute of that instance.
Elsewhere they instantiated a new instance, and tried to read the value back.
And they couldn't understand why the value wasn't set, Portal-style, in their
new instance. This guy, who has been in school for 4 years for CS, also
couldn't figure out why it made sense to make specific shapes (circle, square,
triangle) children of an abstract Shape class. He could, very easily, add a
Search bar to an Android app because he had seen a tutorial on how to do it -
but making the bar do anything was a tremendous feat, because it wasn't just
the mechanical repetition of some pattern.
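
To spell out that first misconception, a minimal sketch (the class and field
names are made up, but the behavior is plain Java):

    class Widget {
        String label;
    }

    public class Demo {
        public static void main(String[] args) {
            Widget first = new Widget();
            first.label = "hello";            // sets the field on this instance only

            Widget second = new Widget();     // a brand-new object with its own fields
            System.out.println(second.label); // prints "null" -- nothing "portals" over
        }
    }

Nothing exotic is happening here; the gap was purely in his mental model of
what `new` does.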

Even with experienced (>5 years) programmers, I've seen some really terrible
debugging where it was clear the person didn't have a mental model of the
code. They never really got any benefit from GDB, because when they looked at
the internal state of the program they just shrugged and said "Yeah, looks
right". And they owned the codebase. This was their code, and they couldn't
reason about it.

After all that ranting, my point is: you can make it easier to write _some_
program, but you can't make it (appreciably) easier to write the program that
you need.

~~~
williamcotton
> And they couldn't understand why the value wasn't set, Portal-style, in
> their new instance.

Is it possible that these students might have other types of intelligence?

I've got friends who are very strong visually and they thrive in visual
environments such as MaxMSP and Quartz Composer. They make amazing interactive
art pieces. Yet, they seem completely baffled by general purpose programming
and symbolic logic. The thing is, you can talk to them and they can absolutely
reason about things. They're not incapable of logic, rather a certain
expression of logic.

What worries me is that we've created a self-reinforcing feedback loop. The
tools we've created for computing are heavily dependent on symbolic logic.
This attracts people who are gifted with symbolic logic. They in turn create
more tools that work best for people who excel at symbolic logic. And so on
and so forth.

I don't think this is an issue with people being incapable of creating mental
models. A gymnast has an amazing mental model of their body and its relation
to space and the things in that space. However, we have yet to create
computing tools for interactive design that build on top of that mental model.
Why is that? Because we engineers have some of the WORST mental models of how
our bodies flow through space.

I think this is all mainly an issue with listening, understanding and
compassion, traits that seem to be stunted in the software industry. Engineers
seem to have a certain predilection to talk over people. We seem to always be
waiting for the other person to stop talking.

And I'll take the poetic license one step further and say that our entire
industry is incapable of listening. It is always "hey, let's set up coding
camps!" What about, "hey, let's set up a symposium where we engineers listen
to other people about how they live their lives and what they think"?

~~~
alanctgardner2
> Engineers seem to have a certain predilection to talk over people

That's overly general: a lot of programmers may be socially inept, but social
ineptness is neither necessary nor sufficient. Both of the examples above are
pretty anti-social, but they also just don't have these skills. Likewise, I
know some very good programmers who are also good at requirements gathering
and collecting domain knowledge, which entails learning from experts and
listening to them.

> The tools we've created for computing are heavily dependent on symbolic
> logic.

The whole notion of computers, for better or worse, depends on symbolic logic.
You'll have a hell of a time building a microprocessor whose instruction set
is paintings or dance moves. The best case is that we build an interface from
symbolic logic to this new, more approachable paradigm.

> They make amazing interactive art pieces.

How much logic can you put in an interactive art project? Can it model
anything in the real world? The biggest problem with other paradigms is
density: if you want to program via something other than symbolic logic, get
ready for incredible fatigue as you try to turn a 10K LOC program into a 500MP
painting. Or an 18 hour long dance.

I guess I should clarify: I don't think non-programmers are lesser beings, or
that they can't model _anything_ in their heads. But for the sake of
programming, the only thing that matters is if you can model a computer. If
you have other types of intelligence, that's fine, just know you're going to
have a tough go of it when it comes to understanding the code you write.

------
scott_s
I find it strange that some people see computers-as-appliances as an
existential threat. (I'm thinking of things like tablets, smart phones and
consoles.) Yes, it is difficult to tinker with the insides of such things. But
we gain stability and ease of use. Such devices do not magically make other
kinds of computers cease to exist.

Computers that can be tinkered with still do exist, and it has never been
easier to use them to program. Microsoft and Apple both provide free access to
rich development environments. And it's easy to install many other kinds of
programming environments onto such machines.

In fact, the combination of a web browser and JavaScript is perhaps the most
ubiquitous programming environment, ever. And it is available on damned near
everything.

~~~
wrongc0ntinent
In my experience, coding usually starts out for two main reasons:
curiosity/fascination, and need. Over time, that ratio has changed, which I
guess is the main gripe of the article.

------
discreditable
It's not that programming is extremely difficult, it's that many people are
barely functional on a computer to begin with.

------
qznc
I am torn on this. On the one hand, programming is empowering, so everybody
should learn some programming in school. On the other hand, a powerful tool is
also dangerous. Look at the financial sector using complex Excel sheets nobody
understands anymore. As the world gets more and more dependent on software, we
need more professionalism in software engineering.

~~~
fenollp
Complex ≠ Powerful. Simple is Powerful.

> we need more professionalism in software engineering.

I completely agree. I think we need courses on The Mindset of Coding, teaching
things like some of Bret Victor's principles, KISS, the UNIX philosophy,
reverse engineering, …

~~~
Double_Cast
What does "powerful" even mean?

Each time I open an article on HN describing the virtues of the latest
programming language, I always see the word "powerful". Whether a language is
Assembly or Python, the author can guarantee it's "powerful". I think it's
devolved into a buzzword because programmers use it to mean opposite things.
It's like how Orwell said two critics can describe the same painting as
possessing both "a living quality" and "a peculiar deadliness" [1].

Like I said in another comment, programmers were able to abstract binary away
from the tangibility of punch tape. But I think we can only abstract so much
before we begin to hit a wall, beyond which we begin to lose absolutely
essential features. So once we reach such a point, I think complexity becomes
conserved.

So if we want to simplify one thing, the best we can often do is move the
complexity elsewhere, like what a refrigerator does with heat. When
programmers say a low level language like C is simple and powerful, they mean
that the implementation is simple, but the interface is complex [2]. But when
programmers say that a high level language like Python is simple and powerful,
they mean that the implementation is complex, but the interface is simple.

So I think the real question is, "Where is the complexity hiding?"

[1]
[https://www.mtholyoke.edu/acad/intrel/orwell46.htm](https://www.mtholyoke.edu/acad/intrel/orwell46.htm)

[2] [http://www.jwz.org/doc/worse-is-
better.html](http://www.jwz.org/doc/worse-is-better.html)

~~~
scott_s
Paul Graham has an essay on the subject:
[http://www.paulgraham.com/power.html](http://www.paulgraham.com/power.html)

~~~
Double_Cast
Yes, I'm familiar with Paul Graham's essay. But while the topics we address
are related, I think they're distinct in their own right. pg is arguing that
succinctness (the "simple interface" camp) should be the goal to which all
programming languages aspire. I'm complaining that programmers use the word
"powerful" to describe things which are not only dissimilar but outright
contradictory! pg's essay is _prescriptive_; my comment is _descriptive_.

I also think my links are sufficient. Orwell discusses sloppy diction. "Worse
is Better" discusses complex implementation. pg's essay primarily addresses
interfaces. If expressing myself had been as easy as citing pg's essay, I
would have done just that.

~~~
scott_s
I have no way of knowing if you knew about Paul Graham's essay or not; it
seemed relevant, so I pointed it out.

I'm afraid I'm not quite up to getting into your comments on C (late on a
Friday night, if you'll forgive me that). But regarding Graham's essay, I
don't think it's about interfaces. It's about abstractions. If I have powerful
abstractions, I can do a lot by saying a little - and that's not because of
the _interface_ to these powerful things, but because of the meaning of the
thing behind that interface.

~~~
Double_Cast
> _regarding Graham's essay, I don't think it's about interfaces. It's about
> abstractions._

Hm... I disagree. I would argue that the interface determines the level of
abstraction. Almost so much that they might as well be referred to as the same
thing in colloquial contexts.

According to Wikipedia [1], "abstraction" is a way of deriving concepts from
tangible objects. According to Paul Graham [2], McCarthy had the insight that
variables are effectively pointers. To paraphrase a Zen Koan [3], "The wise
man looks at the moon, the fool looks at the finger".

What I'm hinting at is, I think of the entire English lexicon in terms of
pointers. Words point to a subset of (real or imagined) particulars we
encounter in our experiences. Programming is pointing to a task we want to
accomplish via instructions. So when pg talks about "succinctness and
abstraction", the list of pointers becomes shorter. But the things they point
to remain the same. If a lisp source file and an assembly source file
accomplish the same task given the same parameters, then it's the pointers
which have changed, not what they point to.

The programming interface affects abstraction. Whether it's an IDE or a
notepad document, perl or punch tape, the interface affects abstraction.
Think of the interface as a medium of expression: it shapes not only the
length of the code, but also how the programmer attacks the problem space.
E.g. suppose you sketched the Eiffel Tower in pen on cloth, and I carved the
Eiffel Tower in marble. Both our works point to the same object. They also
have their unique nuances because some of the details of the real tower are
lost due to the limitations of the medium. To quote pg [4], "You need that
resistance [a friend to bounce ideas off of], just as a carver needs the
resistance of the wood."

If abstraction (regarding programming) is about how "instructions to perform a
task" are conveyed, then implementation is about the technique of performing
the conveyed task. Our instructions can point to a specific implementation (per
low-level languages). But I think we agree they're not the same thing, since
delegating a task is obviously distinct from specifying how to accomplish it.
In my previous comments, I linked to an article which described
implementation. When it mentioned "implementation", it meant it in the sense
of how the CPU handles interrupts.

pg's essay talks about how fast and easy it is to write code, which is
derivative of interface. But does he mention how high-level languages like
lisp often execute at a snail's pace given our society's current processing
power? He does in another essay, but not in the particular essay you cited. In
"Succinctness is Power", _pg only addresses one half of the "conservation of
complexity equation"_. I admit it could have been helpful. But I accomplished
more by citing other links. Therefore, it would have been redundant.

P.S.

Furthermore, pg is advocating for a higher level abstraction. But in my Orwell
link, Orwell complains that words are often vague and overgeneralized. I.e.
he's arguing for a lower level of abstraction. Therefore, citing pg would have
hurt my case. It's not that I'm trying to cherry-pick. I recognize that
everything has a time and place, including high-level and low-level abstraction.
But in the case of addressing the meme " _all_ languages are powerful!",
"Succinctness is Power" was definitely not the right citation for the job.

[1]
[http://en.wikipedia.org/wiki/Abstraction](http://en.wikipedia.org/wiki/Abstraction)

[2] [http://www.paulgraham.com/diff.html](http://www.paulgraham.com/diff.html)

[3] [http://www.fifth-ape.com/blog/2012/3/27/the-finger-and-
the-m...](http://www.fifth-ape.com/blog/2012/3/27/the-finger-and-the-
moon.html)

[4] [http://paulgraham.com/ideas.html](http://paulgraham.com/ideas.html)

------
drcube
I dislike the distinction between _using_ a computer and _programming_ one.
Both are just ways to tell a machine what to do. He tells his computer what to
do using C, she tells her computer what to do using bash, and they tell their
computer what to do using Word and Firefox. Certainly C, bash, Word and
Firefox are all programs. So are an assembler and a linker.

I don't know if we should treat all applications like domain specific
languages, or if we should just hide complexity until needed, then provide a
way for more advanced users to access lower abstraction layers, or what.
Either way, I don't think this fairly strict but artificial distinction
between using and programming is helpful or beneficial.

------
CmonDev
Programming is as easy as painting: either your brain is wired correctly or
you work hard to re-wire it.

------
ChuckMcM
It is all about markets. There is a huge market for selling an appliance with
'Apps' which can entertain you, inform you, and help you remember stuff; there
is a much, much smaller market for selling a device you can program to do
interesting things.

Most (and by that imprecise measure I mean > 50% but less than 75%) of the
people who own an Apple "computer" have no interest in programming anything.
Nearly all iPad owners are not interested in writing iOS programs, and easily
99% of iPhone owners couldn't care less about writing code for them.

That's OK, but it means there continue to be opportunities to sell
programmers cool gear. The downside is that programmers can't always leverage
the price benefits that mass production brings for their tools.

------
nsxwolf
To what end? Should people be writing their own email clients?

I don't know many people who would modify or extend the software they use,
even if it were really "easy" to do so. Also, I doubt such tasks could ever
really be "easy", even if the arcane syntax of a C style language were not an
obstacle.

I'm not seeing the problem. Regular people were forced to use DOS and Windows
machines in the early days, and they struggled with their complexity. Now we
have a class of computers that regular people and technical people alike enjoy
using. That's not a win?

~~~
sedev
A toy project I did not too long ago was "keep an eye on my Twitter account
and when I favorite a tweet that contains a YouTube URL, add that video to my
Watch Later playlist in YouTube." Nearly every time I mention this to someone
who uses both services, they want it (yes, I know Instapaper can de facto do
this).
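
For the curious, the whole thing boils down to a small loop. Here's a
hypothetical Java sketch -- the two *Stub methods are stand-ins for the real
Twitter and YouTube API calls, which I'm not reproducing here:

    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class FavoriteToWatchLater {
        // Pulls a YouTube video id out of a URL, if one is present.
        static String youTubeId(String url) {
            Matcher m = Pattern.compile("(?:youtu\\.be/|v=)([\\w-]{11})").matcher(url);
            return m.find() ? m.group(1) : null;
        }

        // Would poll the Twitter API for my recent favorites; stubbed with sample data.
        static List<String> fetchFavoritedUrlsStub() {
            return List.of("https://youtu.be/dQw4w9WgXcQ", "https://example.com/post");
        }

        // Would call the YouTube API to append to my Watch Later playlist.
        static void addToWatchLaterStub(String videoId) {
            System.out.println("queued video " + videoId);
        }

        public static void main(String[] args) {
            for (String url : fetchFavoritedUrlsStub()) {
                String id = youTubeId(url);
                if (id != null) addToWatchLaterStub(id);
            }
        }
    }

Of course, in practice most of the effort went into the API plumbing, not this
loop.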

~~~
nsxwolf
Right. You're a software developer, and your friends saw what you did and want
it, and you'll give it to them, and they'll be happier.

They don't have the skills needed to do it themselves, aren't interested in
acquiring the skills, but appreciate your end product and want to use it.

I think this is just fine. I don't really want to put in the incredible effort
required to learn to play the piano well, so I'm glad other people have put in
that effort and I'll listen to them play instead. Would I learn if someone
made it a lot easier? Probably. But would that make me a musician? Could I
make inspired changes and extensions to existing works? Or wholly original
works? No way!

------
protomyth
Taking Apple to task is pretty sad given that OS X programming is basically
NeXT's environment, which was advertised as one of the easiest computers to
program (see their Sun vs. NeXT videos). Not to mention HyperCard, which hit a
nerve with a lot of people.

If you want an actual example of WTF programming, try doing something on the
web that you did in Visual Basic in the '90s. The amount of infrastructure and
the number of things you need to know how to program is daunting. It was
easier to program an app on NeXTSTEP at the time of the web's creation than it
is on the web today.

------
talles
This is the eternal discussion of ease of use vs. flexibility.

I think we have room for all of it. Oversimplifying things, as an example, I
love how Linux is extremely flexible, Mac extremely easy and Windows... well,
let's say it's in the middle.

I believe sometimes someone gets it 'more right' than others, achieving a
balance that's right on the spot. But that doesn't invalidate the other
approaches.

You've got to dance according to the music.

------
thyrsus
I'd appreciate some examples of tools that make programming "easy". Easier
than assembler? OK, check. Easy in general? I want that.

~~~
dpkendal
Bret Victor is working on some amazing stuff right now. I suggest his talk
Drawing Dynamic Visualizations to see how he’s making programming more direct
and less “code”-ish: [http://vimeo.com/66085662](http://vimeo.com/66085662)

------
jere
>They could make computers do everything for them... helping them select the
best photo to show to potential dates.

Hmm. What are those tweets about anyway:

>Considering writing some software to use MTurk to rank my self-portraits.

It's kind of funny that out of two examples given to demonstrate the utility
of programming a computer, one is actually just asking people for their
opinion.

------
zwieback
Horses for courses - I've been programming since the pre-GUI era and much as I
love modern IDEs and advanced interfaces I still find text entry of
traditional programming language code via keyboard incredibly expressive for
many things.

------
aestra
Programming is easy to the programmers.

Art is easy to the artist.

Writing is easy to the writer.

Music is easy to the musician.

~~~
dpkendal
I am a programmer, writer and musician, but I don’t find _any_ of them easy.

Programming is unique, though, because the tools you use to do it can also be
used to make the task in general much easier.

~~~
aestra
I should have explained. Some people find that certain tasks come naturally
to them, so they think those tasks are inherently easy for everyone, while
others find different tasks come naturally to them. Humans are different.
Music is more or less impossible for me. I am sure programming is impossible
for someone else. Math is logical and easy to me. I used to tutor my friends
after school on the homework that was assigned, thinking "how is this not
completely straightforward to you? this is simple" -- yet they thought things
I couldn't grasp were simple.

Let's not ignore the differences between humans and declare that programming
is easy, because it isn't to a large number of people.

[http://arstechnica.com/information-technology/2012/09/is-
it-...](http://arstechnica.com/information-technology/2012/09/is-it-true-that-
not-everyone-can-be-a-programmer/)

------
dsego
Oh my, if only apple wasn't around, folks would spend their days happily
typing code in vim or emacs.

~~~
JasonFruit
Honestly, I think there's some truth to that. If it weren't for the Macintosh,
the WIMP approach to computing might never have become as dominant, and text-
based systems might be more prevalent; I agree with the article's assertion
that text-based programs are simpler to conceive and write.

~~~
nsxwolf
No one ever believes this when it's said, but it's true. WYSIWYG took a long
time to catch on. WordPerfect for DOS was still pretty popular in 1995.

People just think it "makes sense" to work directly with various fonts and
sizes when writing on a page, but that's actually pretty unnatural. No one
ever worked that way before - you'd bang out a draft on a typewriter and hand
it to a typesetter who would take care of that part.

It took at least a decade for people to really accept it in earnest.

