
The Two Cultures of Computing (2013) - kercker
http://pgbovine.net/two-cultures-of-computing.htm
======
gavinpc
Apprentice comes into the workshop and the master carpenter performs
mysterious steps to make things.

Apprentice is baffled by what appears to be difficult and says, "I was wrong
about wanting to do this, if this is what it is. I'm going to Ikea, who's with
me?"

Is it the master's job to persuade the apprentices to stay?

This "wait, come back!" reflex is understandable. But even if we could make
"conversing" with the computer _look_ easy, it would still _be_ difficult
(since, as Alan Turing himself pointed out, anything routine should be
automated). Would you feel differently if students were bolting (or merely
appalled) when they hit those inevitable, real challenges?

I completely agree with the OP that interaction has not fundamentally advanced
since the 1960's ( _edit_ , well, I'd say 70's, but still). As others have
pointed out, the examples could have been better, but it doesn't change the
point.

Improving interactivity is _our problem_ , especially since we spend the most
time with computers. And it's (still) one of the most interesting problems in
computing. I would tell students that if you're not satisfied with the state
of the art, you're not alone. And if you are satisfied with Ikea, then why are
you here?

 _edit_ moved side comment to a different post.

~~~
coldtea
> _Is it the master's job to persuade the apprentices to stay?_

No, but it's the master's job to understand that all of his old, hallowed and
arcane practices are not necessarily beneficial. Some are just busy-work or
cargo-cult.

And just because a tool has been "tried and true" for decades doesn't mean it
can't be made better.

The command line might be handy for what it does, but from a UX perspective
it's a mess. There are tons of things that could be done to make it better and
bring it to 2016.

Take for example all common UNIX userland programs and change them so that
they all respond to the same flags for the same functionality.

Here -- instantly better, and we're still in the CLI realm.

Or make them all understand SIGINFO, so you could turn on verbose mode etc.
midway through a run.
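
A minimal sketch of what that takes, in Python (untested; using SIGUSR1, since
SIGINFO only exists on the BSDs and macOS):

    import signal
    import time

    verbose = False

    def toggle_verbose(signum, frame):
        # flip verbosity when the user signals the process mid-run
        global verbose
        verbose = not verbose

    signal.signal(signal.SIGUSR1, toggle_verbose)

    for i in range(10**6):
        time.sleep(0.001)  # stand-in for real work
        if verbose:
            print("processing item", i)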

Color-coded output (under a switch) from all tools, not just ls, grep etc.

A common configuration format for all of them. There's nothing that something
like TOML or JSON+comments couldn't handle, instead of each fucking tool
having its own bizarro format and parser, resulting in the bloody mess that is
/etc/conf.
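
A sketch of what reading such a unified config could look like, assuming
Python 3.11's stdlib tomllib and a hypothetical /etc/mytool.toml:

    import tomllib  # stdlib since Python 3.11

    # hypothetical unified config shared by every tool
    with open("/etc/mytool.toml", "rb") as f:
        config = tomllib.load(f)

    # e.g. an [output] section that every tool understands the same way
    color = config.get("output", {}).get("color", "auto")
    verbose = config.get("output", {}).get("verbose", False)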

Nor is "man" formatting the best we can do in 2016.

I'd even appreciate a "trash mode" for rm as the default. Then you do
something like "rm purge" and only then does it actually kill everything put
in the trash between calls. Tons of user documents would have been saved that
way, instead of "just be careful, because that's how we've always done
things". And of course some --doitnow flag could just delete stuff
immediately.
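
An untested sketch of that trash mode in Python, assuming a hypothetical
~/.trash directory:

    import shutil
    import sys
    from pathlib import Path

    TRASH = Path.home() / ".trash"

    def rm(paths):
        # default mode: move into the trash instead of deleting
        TRASH.mkdir(exist_ok=True)
        for p in paths:
            shutil.move(p, str(TRASH / Path(p).name))

    def purge():
        # "rm purge": only now actually kill everything trashed so far
        shutil.rmtree(TRASH, ignore_errors=True)

    if __name__ == "__main__":
        if sys.argv[1:] == ["purge"]:
            purge()
        else:
            rm(sys.argv[1:])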

Just a few things off the top of my head. There are tons of other things...

Edit: of course those things would never happen, because they require a
central vision and roadmap, and the unix cli userland is just a set of
disparate apps from teams who don't speak to each other.

At least the FSF could impose some of that roadmap on the GNU stuff, but they
really don't care, they prefer to pass their time with

~~~
initram
Not to mention, when learning programming there's absolutely no reason to
introduce the command line. Set them up with an IDE and GUI for source
control. You can still use llvm or gcc and git or svn behind the scenes, but
it's absolutely absurd to say, "Well before you can start learning to program,
you have to learn these several other domain specific languages and you have
to type them, and it has to look awful."

I think Apple's new playgrounds and the stuff they showed for kids at WWDC is
the future, personally.

~~~
wfo
Except that IDEs and GUIs are buggy. They are finicky and they fail often and
usually their failure can only be debugged by someone who understands what is
actually going on, i.e. the command line and build process. Consider teaching
a CS class. Everyone has a different machine, of a different age, with a
different operating system. Getting all of them on some posix type CLI where
you can type 'make' or 'python' is fairly easy and once it's working,
foolproof.

Want to teach two languages? Great! Install two entirely different IDEs, teach
each one all over again and their differences, or spend hours debugging weird
compiler plugin errors every single one of your students will have.

Or in 3 seconds tell them to type 'python' instead of 'gcc' once they already
understand what programming actually is.

In general I always think it's important to understand what's actually going
on before you use fancy tools that automate it for you. Because when your
fancy tools break because of some tiny misconfiguration or change you didn't
notice, and they will, throwing your hands up and saying "welp, I guess I'm
helpless" isn't good enough. You'll never be half decent, or have any kind of
real confidence in your ability, until you have at least some knowledge of
what's going on under the hood. If someone wants to be a programmer, we are
teaching them to be car mechanics, not car drivers. So why not start with the
real stuff?

And if someone can't get over their fear of communicating with the computer
using words and symbols (i.e. text) instead of point-and-click then they
aren't ever going to be a programmer -- better to know and pass that hurdle
early rather than later.

~~~
coldtea
> _Except that IDEs and GUIs are buggy. They are finicky and they fail often
> and usually their failure can only be debugged by someone who understands
> what is actually going on, i.e. the command line and build process._

People have been coding for years on end in modern IDEs like IntelliJ and
Eclipse and VS without them "failing" for basic stuff.

And it's not like setting up e.g. Vim or whatever with even just some of the
niceties an IDE has is trivial either.

> _Consider teaching a CS class. Everyone has a different machine, of a
> different age, with a different operating system. Getting all of them on
> some posix type CLI where you can type 'make' or 'python' is fairly easy and
> once it's working, foolproof._

You'd be surprised. Try getting them to run Python on Windows and discovering
all the subtle differences, the libs that only play nice on Linux, etc.;
using the same Unix userland with slight OS X/Linux differences; running that
terminal stuff on Windows; etc.

Besides, tons of programming classes, seminars, etc. use or recommend an IDE.

~~~
AstralStorm
Ha, don't even get me started on the infallibility of modern IDEs. Forgot to
index? 90% of the functionality is off or broken. Want more than one project
open? Better have 10 GB of memory at the ready. Oh, did someone customize
their key bindings? Or their window manager, and the bindings now collide?

Ah wait, the integrated debugger has crashed again, so I cannot figure things
out. Hey, the thing cannot even open a text data file, because it is not part
of the project, whatever that means.

Etc. Most "industrial" IDEs male quote a few simple things way unnecessarily
complex.

------
zekevermillion
The problem I see is that it is very difficult to improve user tools when the
users (such as myself) are stuck using the wrong abstractions. I'm a corporate
lawyer, and the #1 and #2 tools of my trade are Word and Acrobat. The typical
process for negotiating a contract is to save new versions in .docx and run a
"legal compare" in some tool such as DeltaView or Workshare to show the change
diff between versions. This snapshot diff is then emailed to the people who
need to see it. This is a relatively powerful abstraction for a limited
field, but is hard to adapt. There is no feedback mechanism between users and
developers of these tools. So they basically do not improve, and actually the
experience of using them has only gotten worse with time for everyone
involved. So the immediate benefit to switching philosophies is zero, though
the long-term benefit is huge. But how do you overcome this initial starting
friction?

~~~
swalsh
You should try putting the text into git; it has all those tools built in...
plus a lot more, and no emailing.

~~~
na85
I have family who are lawyers.

They and a great many of their colleagues can barely type without looking at
the keyboard. You're suggesting that the kind of person who still double-
clicks on hyperlinks can learn git?

I'm an aerospace engineer and I can barely use git. Git's UI/UX is a fucking
disaster that just happens to suck less than what came before it.

~~~
leohutson
Git is not great UX wise, but it's probably not as big of an issue as people
like to make out, especially if you are already comfortable in a shell. The
reason people find git hard is that concepts of distributed version control
are hard.

There are plenty of nice git-native GUIs and web apps, and mercurial offers a
more consistent command-line UX, but that doesn't change the fact that people
need to understand a crapload of higher-level concepts before they can find
their DVCS useful, such as:

file-system folder structures

diffs/patches

commits/uncommitted changes

branching

merging and conflict resolution

remote vs. local repositories

To me it's like how people complain about mathematical notation being
complicated and inconsistent, but that's just the first superficial "alien"
looking stuff that tells you that you don't understand the new concept. Once
you understand the underlying concepts, the notation doesn't get much further
thought.

~~~
na85
>The reason people find git hard is that concepts of distributed version
control are hard.

Maybe. For me it was because the documentation is written like a comp sci
whitepaper. I've ranted about this elsewhere on HN but the man pages are
filled with stuff like "update remote refs" as if a person would have a
fucking clue what a ref is and why you'd want to update a remote one.

Branching, merging, tagging, all of that sounds great. I've been using git for
years, but the problem is that I don't use it every day, and so I still get a
little wary when I want to run a merge. I can't tell you right now off the top
of my head how to merge if there is a conflict. I always have to google around
for "how to undo a commit". Etc. etc. etc.

Where the fuck do my files go when I switch to a different branch? Do they
actually change in the working directory? Checkout vs fetch? Where exactly in
the file system is this alleged "staging area"? I have no idea about any of
these, mostly because the documentation is just so poorly written that it
takes me too long to find out the answers and I have more important things to
do than Greco-Roman wrestling matches with my software tooling, but I'm afraid
that if I just try things at random I'll do something I can't undo because
none of the options seem particularly intuitive.

It's just not a very good piece of software.

~~~
rimantas
The software is good; your mental model of it is bad. Trying to put the square
peg of git into the round hole of a filesystem is bound to produce
frustration.

------
et1337
These are not the most compelling tools to begin with. I know how to use Git
and markdown, but I still use Google Docs for most things.

The key differentiating feature of programming is interactivity, and the most
interactive programs are games, which is why they're so great for teaching.
You can't make a game with Spotlight search.

I always tell people to start with HTML because it gives instant visual
feedback, all you need is notepad, and it gets you used to typing words in a
text file. Pretty soon they want interactivity, and suddenly they're learning
Javascript.

------
nxc18
The problem here isn't that there's two perspectives (in fact, there are many
more than 'two cultures of programming'), it is that you tried to teach people
programming by bombarding them with 50 new tools (git, LaTeX, shells, piping,
etc). Yes, those are extremely useful tools for engineers and serious
programmers. However, they really aren't necessary or core to programming so
much as they are core to engineering, sysadmin-ing, typesetting, etc. To be
honest, using LaTeX for your resume is just a bad decision unless you happen
to be very good at it and very incompetent with Word. Not only was the author
teaching them irrelevant tools, he was stretching them to bizarre use-cases
that conflict with how any productive human would approach an everyday task.

The essence of programming that needs to be taught is how to think about
problems. Those are harder to grasp, but really they are just different ways
of thinking. Programmatic thinking is clearly applicable to many things and
enriches your life _beyond_ just programming. Throwing tools at people is only
important after you've learned how to reason about problems and begin tackling
them.

Like et1337 said, HTML is a good place to start for the basic concept of
writing plaintext to control the computer. Beyond that, Python is great for
teaching programming, as it has an easy, relatively intuitive syntax and
supports many concepts without being too demanding re: types.

Don't forget that people new to programming are _really_ new to programming.
Just the concepts of _int_ , _double_ , and _string_ can be daunting because
people just don't think that way.

The biggest challenge I've seen in teaching/coaching new programmers is the
concept of assignment vs. equality and how that interacts with variables. Even
very smart (young) math majors can get very caught up on the following:

    number = 5
    number = 6

In math (as everyday people understand it), that would clearly be incorrect. I
suspect for math-oriented students, functional languages would be a better
place to start. I personally learned about programming variables before
algebra variables, so my experience was the opposite.
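
One way to make the distinction concrete is to put assignment and the equality
test side by side, e.g. in Python:

    number = 5          # assignment: number now holds 5
    number = 6          # re-assignment: perfectly legal, the 5 is gone
    print(number == 5)  # equality test: False
    print(number == 6)  # equality test: True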

I truly believe an excellent introduction to programming course really doesn't
even require a computer to be involved, although of course when used correctly
computers and live demos add a lot of value. Rather, an excellent course will
teach the concepts (variables, typing, what memory is, functions, recursion,
iteration, turning real-world problems into computable problems, and so on),
which are completely tool-agnostic.

~~~
ScottBurson
Yes, the use of bare "=" for assignment is one of the Great Mistakes of
Computing, right up there with null references and zero-terminated strings.
There have certainly been languages that have used other operators for
assignment, such as ":=" or my own favorite, "<-", but somehow they have never
caught on. "=" was good enough for Fortran in 1956, by god -- it's still good
enough 60 years later!

~~~
na85
>...my own favorite, "<-", but somehow they have never caught on.

No doubt because "<-" is two characters and requires the shift key, whereas
"=" is only one, unshifted character.

------
pjmlp
It is not the programmer culture, but rather the UNIX programmer culture.

The Mac OS programmer culture before Mac OS X was a thing, and the Windows
programmer cultures, including the Xerox PARC, Lisp Machines and ETHZ ones,
have more to do with user culture than what the author is writing about.

I really dislike this hipster idea that the only programmer culture is the
UNIX one.

------
willtim
I'm not sure anyone would be impressed by Python via command line and
nano/pico, beginner or otherwise. An interactive notebook with
graphics/animations would be the best introduction. Something like this:
[http://haskellformac.com](http://haskellformac.com)

~~~
nishac
I agree. I would suggest making simple demos on IPython notebooks.

------
emodendroket
It seems like maybe you'd be better off teaching these users VBA or something.

------
sbierwagen
The purpose of programming is to do things _at scale_ , to automate them.
Writing a one-off Python script to search for the string "widget" is insane
when Spotlight exists. But a script that performs searches is useful when you
need to search for a thousand strings. Using the Imagemagick CLI to resize a
single image is ridiculously cumbersome when Photoshop exists, but vastly more
useful when you need to resize a folder with 50,000 jpegs in it.
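
The thousand-string search, for instance, is only a handful of lines of Python
(an untested sketch, assuming a hypothetical needles.txt with one search
string per line):

    from pathlib import Path

    # hypothetical input: one search string per line
    needles = set(Path("needles.txt").read_text().splitlines())

    # report which needles occur in each file under the current directory
    for path in Path(".").rglob("*.txt"):
        text = path.read_text(errors="ignore")
        hits = [n for n in needles if n in text]
        if hits:
            print(path, "->", ", ".join(sorted(hits)))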

The reason newbies question the usefulness of trivial examples is that they're
trivial problems. Programmer tools are only useful for big problems.

------
yusee
If programming were easier, why would coders make six figs? Doctors and
lawyers have regulatory bodies that protect them. We just make our jobs too
frustrating for non-nerds.

------
squeaky-clean
I don't see any example of the two cultures here. Just an unprepared teacher
who chooses bad examples to teach from.

The counter-arguments by the hypothetical students actually make good points,
and I feel they are the best ways to solve each particular problem. They know
to use the best tools available for their task. The problem is that all their
tasks are really simple, and the "developer tools" would be a slower solution,
or at least no faster.

Why do you need to search for all uses of the word 'widget' in python files?
Did you write a script, but you forgot the file name and which directory it
was in? Any search tool will work for that. Do you want to replace all
instances of "widget" with "module", but only within your python scripts? Now
the shell commands seem more appropriate.
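
That job is only a few lines in any scripting language; an untested Python
sketch:

    from pathlib import Path

    # replace "widget" with "module", but only inside Python scripts
    for path in Path(".").rglob("*.py"):
        text = path.read_text()
        if "widget" in text:
            path.write_text(text.replace("widget", "module"))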

Why can't they just plot a graph with Excel? If you just want to view the line
graph from some data points, use whatever tool you know best. You can then
share that Excel file with anyone else who owns Excel but doesn't understand
programming, or export the charts to a non-Excel format, I'm sure. But what's
that? You want to serve the charts online? Well, make them an image and host
that. What's that? They need to be dynamic and based on the user? Okay, now we
need to code something. What's that again? You just want a chart of some data
points for yourself, but you need to compare it against data from an API call?
Write something in Python to fetch from that API. You don't know how to use
matplotlib? Well, output it to a CSV and use Excel for your charting.
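
That last fallback is a rough ten-liner; a sketch, assuming a hypothetical
JSON endpoint that returns a list of {date, value} records:

    import csv
    import json
    from urllib.request import urlopen

    # hypothetical API returning [{"date": ..., "value": ...}, ...]
    with urlopen("https://api.example.com/data") as resp:
        records = json.load(resp)

    # dump to CSV so the charting can happen in Excel
    with open("data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["date", "value"])
        writer.writeheader()
        writer.writerows(records)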

The "copy, rename, email" is a bit of an ugly system. But it works, especially
for simple student projects. Pretty much any undergraduate I've tutored did
this (I guess the ones that knew git didn't need tutoring), and it only starts
to become a burden when you get to group projects. The other advantage of git,
having a logged history of all your changes, possibly spread across branches
for different purposes or features, isn't really apparent when your longest
homework assignment is a 4 week project. Git is really valuable when you're
working on a team, working on something large enough that you can't keep its
history in your head, or you need really quick access to that history ("I just
got an alert that our registration page is giving 500 errors after the last
update. What did we change?"). I don't see a reason to confuse students who
are already trying to understand so many other new things at once.

I'm not a professional teacher, so you can take all this with a grain of salt.
I think the best way to teach programming is for the students to actually be
chasing after something they want to finish. Not just synthetic tasks. No one
takes a programming course because "I want to learn Python!". They take it
because "I want to make apps / websites / video games / do my math research
more easily / automate parts of my job" and so on. And then they're not just
learning about loops and control structures because "it's going to be on the
exam". They're learning about them because otherwise, your game of blackjack
ends after 1 deal, and that's no fun.

You really should be able to provide a reason why you're using a certain tool,
and not another. Why are you using Python instead of Excel to plot a graph?
You're just faster with Python? That's a valid reason. But "because my
professor is faster with Python" is not.

------
andrepd
I firmly dislike this style of writing. I call it ADD writing. It took 3x as
many words to write and 5x as much time to read as it would have if he had
just gone straight to the point, instead of going off on some over-the-top
rant or irrelevant tangent every other sentence.

~~~
TheOtherHobbes
I agree. This is a superficial and not very original or insightful piece. It
doesn't take a genius to describe the difference.

You can take almost any field - building, architecture, engineering, law - and
explain the variation in approach between a user and a professional.

Different expectations? Yes

Different experience? Yes

Different culture? Yes

Different skills? Yes

Different tools? Yes

Why should computing be any different?

It worries me that someone who is an associate professor is making such a big
deal out of this.

He may have a point that students who are used to a GUI are going to have some
issues getting used to writing raw code. But it's not as if IDEs don't exist,
or that you have to be an expert on the Bash shell to write an iPhone app.

I'm really not seeing the problem.

