
Early vs. Beginning Coders - jessaustin
http://zedshaw.com/2015/06/16/early-vs-beginning-coders/
======
schoen
Is there a book somewhere that tries to set out all of the things that experts
know about computing that they don't remember learning?

(In Zed Shaw's conception, this might correspond to "learn computing the hard
way".)

I see his examples and other examples here in this discussion, and it makes me
wonder about the value (or existence) of a very thorough reference.

I've also encountered this when working with lawyers who wanted to have a
reference to cite to courts about very basic facts about computing and the
Internet. In some cases, when we looked at the specifications for particular
technologies or protocols, they didn't actually assert the facts that the
lawyers wanted to cite to, because the authors thought they were obvious. I
remember this happening with the BitTorrent spec, for example -- there was
something or other that a lawyer wanted to claim about BitTorrent, and Bram
didn't specifically say it was true in the BitTorrent spec because no
BitTorrent implementer would have had any doubt or confusion about it. It
would have been taken for granted by everyone. But the result is that you
couldn't say "the BitTorrent spec says that" this is true.

Another example might be "if a field is included in a protocol and neither
that layer nor a lower layer is encrypted with a key that you don't know, you
can see the contents of the field by sniffing packets on the network segment".
It might be challenging to find a citation for this claim!

So we could also wish for an "all our tacit knowledge about computing,
programming, and computer networking, made explicit" kind of reference. (I'm
not sure what kind of structure for this would be most helpful pedagogically.)

~~~
MattGrommes
Charles Petzold's book 'Code' has a lot of this very basic, low-level
information. It basically builds up from basic information theory to
computers. It's not going to have everything you're looking for but I was
surprised how much of it I "knew" without recalling where I learned it or how
it connected to other things.

~~~
mdpopescu
I had a similar experience with Code Complete (I know most of what's in it but
I have no idea when I learned it). I don't know if it's as basic as @schoen
wanted but it might be close.

~~~
WalterGR

        I don't know if it's as basic as @schoen wanted
        but it might be close.
    

I remember _Code Complete_ as being a guide for advanced developers on how to
take their craft to the next level.

Does my memory deceive me?

~~~
btilly
You will likely have seen advanced developers recommending it. But it is meant
to teach routine daily stuff like how to name variables, lay out functions,
why we want abstraction layers, and the like. And it really is aimed at people
who may know nothing about code construction.

~~~
WalterGR
Okay, thanks. I read it cover-to-cover before I interviewed at Microsoft my
senior year in college. But - like the developer in Zed's article - I must
have forgotten. :)

------
SCHiM
This is good. I've had problems that were somewhat related to what the author
talks about.

When I was learning C#, already quite fluent in C/C++, I had a big problem
with C#'s type system and string handling. I'd been reading guides in the
first category the author mentions, e.g. "not really a beginner, but new
to this language".

I was trying to retrieve the bytes that a certain string represented. I
searched for ages, and everywhere people said "this shouldn't be done",
"just use the string", etc. A Stack Overflow answer mentioned a way to use an
'encoding' to get the bytes, and this seemed to be the only way.

How strange, I thought: I just want a pointer to that value; why do I have to
jump through all these hoops? None of the guides I was reading provided an
answer, until I found a _real_ beginners' book. This book, helpfully starting
at the real beginning of every language: the type system, finally gave me the
answer I was looking for:

.NET stores strings internally as UTF-16 text, not raw bytes, so to get bytes
out of one you have to go through an explicit encoding. It turned out that the
whole notion of 'strings are only bytes' that I carried over from C++ does not
work in C#. All those other helpful guides gleefully glossed over this and
started right in at lambdas and integration with various core libraries,
instead of focusing on the basics first.
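The same text-versus-bytes split exists in Python, the language of Shaw's book; a minimal sketch (the variable names are just illustrative):

```python
# A str is text, not bytes: to get concrete bytes you must pick an encoding.
s = "héllo"

utf8_bytes = s.encode("utf-8")      # encode: str -> bytes
latin1_bytes = s.encode("latin-1")  # a different encoding...

# ...yields a different byte sequence for the "same" string.
assert utf8_bytes != latin1_bytes

# Decoding with the matching encoding round-trips back to the text.
assert utf8_bytes.decode("utf-8") == s
```

The point survives translation: "strings are only bytes" stops being true the moment the language tracks text and raw bytes as distinct types.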

~~~
lmm
That's only "basics" if you've got the wrong idea. There are millions of
possible mistakes, no beginners' guide can explicitly address every one.
People told you to just use the string - wasn't that a good enough answer?

~~~
orbitur
> People told you to just use the string - wasn't that a good enough answer?

"Don't do that" isn't a sufficient answer without explaining exactly why,
though. And if you aren't asking the right question, then the explanation
might even seem obtuse.

~~~
lmm
"Don't do that" is the right answer when you're asking the wrong question.
It's an invitation to take a step back and ask how to do what you actually
want to do, at a higher level.

~~~
amalcon
No; the correct answer in that case is "Why are you trying to do that?"

~~~
WalterGR
That's what pedants do.

When I was a kid learning line number BASIC, adults answered my questions
knowing that I'd figure out The Right Way before anyone hired me to write the
code for radiation treatment devices.

The tech community's obsession with "The Five Whys" is toxic. When asking
questions, you always have to first prove that you deserve an answer. It
becomes a process of trying to anticipate any potential reason someone might
have to argue "You're doing it wrong" - and preempting that. You can't just
ask a question: you have to both ask and justify.

Mostly I just don't bother, and I suspect that I'm not alone. And I have a
degree and industry experience. It must be incredibly frustrating and
discouraging for beginners.

~~~
sbov
Dunno. If someone asked me how to get the bytes of a string, I would ask why.
Not because there's The Right Way to do things, but because they might be
doing things The Hard Way.

A why can reduce the amount of code written by 100%.

~~~
derefr
Sure, but when someone is first learning, frequently the _true_ answer to "why
are you doing that" is "to see what happens" (even if they have some flimsy
justification within their pet-project at the time.) Giving them the answer
lets them go back to experimenting so they can see, for themselves, why the
path they're heading down might not be such a good idea. Formative experiences
and such.

~~~
sbov
Yeah, that's a fine answer. But they might just not know of the other way to
do things.

Like, in Java, for a long time I didn't know there was an output stream that
you could write a string directly, so I was always getting the bytes to write
it. I wouldn't call that a formative experience.
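The Java situation described above has a direct Python analog: a text stream accepts strings directly, while a byte stream makes you encode by hand. A small sketch using the standard `io` module:

```python
import io

s = "hello, world"

# Text stream: write the string directly; no manual encoding step.
text_stream = io.StringIO()
text_stream.write(s)

# Byte stream: the caller must encode the string first.
byte_stream = io.BytesIO()
byte_stream.write(s.encode("utf-8"))

assert text_stream.getvalue() == s
assert byte_stream.getvalue() == s.encode("utf-8")
```

Knowing the text-stream variant exists saves exactly the manual encode step that was being done The Hard Way.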

------
danso
I've been teaching coding to beginners for the past year now...and even after
having done coding workshops/tutorials for many years previous, I've found I
can never _overestimate_ how wide the knowledge gap is for new coders.

Yesterday I was talking to a student who had taken the university's first-year
CS course, which is in Java...she complained about how missing just one
punctuation mark meant the whole program would fail...While I can't
passionately advocate for the use of Java in first-year courses (too much
boilerplate, and the OOP part is generally just hand-waved-away)...I've
realized that the exactness of code _must_ be emphasized to beginners. And not
just as something to live with, but something to (eventually) _cherish_ (for
intermediate coders, this manifests itself in the realization that dynamic
languages pay a price for their flexibility over statically-typed languages).

Is it a pain in the ass that missing a closing quotation mark will cause your
program to outright crash, at best, or silently and inexplicably carry on, at
worst? Sure. But it's not _illogical_. Computers are dumb. The explicitness of
code is the compromise we humans make to translate our intellectual desires
into deterministic, wide-scale operations. It cannot be overemphasized how dumb
computers are, especially if you're going to be dealing with them at the
programmatic level...and this is an inextricable facet of working with them.
It's also an _advantage_...predictable and deterministic is better than
fuzziness, when it comes down to doing things exactly right, in an automated
fashion.

I think grokking the exactness of code will provide insight into the human
condition. While using the wrong word in a program will cause it to fail...we
perceive human communication as being much more forgiving of not-quite-right
phrasing and word choices. But is that true? How do you know, really? How many
times have you done something, like forget to say "Please", and the other
person silently regards you as an asshole...and your perception is that the
transaction went just fine? Or what if you _say_ the right thing but your body
(or attire) says another? Fuzziness in human communication is fun and
exciting, but I wouldn't say that it's ultimately more _forgiving_ than human-
to-computer communication. At least with the latter, you have a chance to
audit it at the most granular level...and this ability to _debug_ is also
inherent to the practice of coding, and a direct consequence of the structure
of programming languages.

~~~
bwy
This is pretty ridiculous, man. I don't think I know a beginner programmer who
would be so stuck on "every character matters." (Which isn't even true, to
some level, in many languages - ; in JavaScript and Python? Whitespace in
languages besides Python?)

The way I would explain it is to have them imagine writing a code tokenizer
and interpreter for a simple language themselves. That's what the intro CS
class I took at Berkeley, 61A, had us do: code one for a subset of Lisp, with
a lot of help, of course. I don't think we needed to know how to use anything
but strings, functions, and arrays, although it did involve recursion. This
problem will never come up again once they realize there's code reading their
code. Of course it's arbitrary.

(project, in case you're curious: [http://www-
inst.eecs.berkeley.edu/~cs61a/fa14/proj/scheme/](http://www-
inst.eecs.berkeley.edu/~cs61a/fa14/proj/scheme/))

~~~
danso
> This is pretty ridiculous, man. I don't think I know a beginner programmer
> who would be so stuck on "every character matters." (Which isn't even true,
> to some level, in many languages - ; in JavaScript and Python? Whitespace in
> languages besides Python?)

I guess YMMV...but most beginners I've worked with are confounded by why code
interpreters are so literal. The double-equals sign versus the single equals
sign is one prominent example...it's not that they can't understand _why_
rules exist...but they feel that the negative consequences (complete program
failure) outweigh the tininess of the error.

After dealing with `=` vs `==` errors in beginners' code...something that I
almost never screw up on my own as a coder...I've begun to respect the
convention in R to use `<-` as the assignment operator...

~~~
xentronium
Notably, Pascal uses `:=` and is one of the better first languages in my
opinion, for many more reasons (easy-to-grasp language core, simple non-null-
terminated strings, no actual need to learn pointers until the very advanced
stages). Today I mostly advise people to start with Python, though, because
Pascal feels somewhat dated and undertooled.

~~~
icebraining
I think Python's behavior of disallowing assignment in expressions is good
enough to avoid those mistakes; it's an anti-pattern anyway in the vast
majority of cases.
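To make that point concrete, here is a minimal demonstration that Python refuses the classic `=`-for-`==` slip at parse time:

```python
# `if x = 5:` cannot even be parsed; the mistake dies before the program runs.
try:
    compile("if x = 5: pass", "<example>", "exec")
    is_syntax_error = False
except SyntaxError:
    is_syntax_error = True

assert is_syntax_error

# The comparison the beginner almost always meant is, of course, fine.
x = 5
assert x == 5
```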

------
top1nice1gtsrtd
I actually worked on teaching my 71 year old father Python using this book.
One point of difficulty that struck me during that exercise was that I as a
programmer had completely internalized the idea that an open paren and a close
paren right after a function is a natural way to invoke a function with zero
arguments (e.g. exit() exits Python's prompt; exit doesn't). The whiplash of
going from finding the questioning of the convention silly to finding the
convention itself silly was amusing. It makes sense to a parser but not to a
flesh-and-blood, contextual-clues-using human. We don't vocalize "open paren
close paren" whenever we say an intransitive verb. We just "know" that it's
intransitive. Anyway, great article.
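The distinction in question can be shown in a couple of lines of Python; `greet` is a made-up function for illustration:

```python
def greet():
    return "hello"

# The bare name is the function object itself (the "verb", unperformed)...
assert callable(greet)

# ...while the parentheses are what actually perform the call.
assert greet() == "hello"
```

Both are legal expressions, which is exactly why the parser can't guess which one a human meant.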

~~~
jackmaney
Perl doesn't require "()" after the name of a function to call the function.
In fact, parentheses aren't required for function calls at all.

The code:

    use strict;
    use warnings;

    sub foo { print "In foo\n"; print "args = " . join(",",@_) . "\n" if @_; }

    foo;

    foo "a", "b", "c";

yields the output:

    In foo
    In foo
    args = a,b,c

~~~
tehwalrus
Neither does ruby.

(Which is really really confusing if you spend your days in python, and
occasionally have to edit something in ruby.)

~~~
rhinoceraptor
It's even more confusing if you have to figure out if you're calling a method
or referencing a variable.

The code could be assigning a new variable from a method return value, or just
from an existing variable.

------
kazinator
I can still visualize what it's like to know nothing, because when I saw a
BASIC program for the first time when I was ten, I thought the = signs denoted
mathematical equality (equations). How the heck can X be equal to Y + 1, if in
the next line, Y is equal to X - 2?

Later, I tried using high values for line numbers just for the heck of it. Can
I make a BASIC program that begins at line 100,000 instead of 10? By binary
search (of course, not knowing such a word) I found that the highest line
number I could use was 65,000 + something. I developed the misconception that
this must somehow be because the computer has 64 kilobytes of memory.

~~~
scribu
> I thought the = signs denoted mathematical equality (equations).

I had the same confusion! My very first roadblock in programming was when the
teacher told me to write `x = x + 1` on the blackboard, which didn't make any
sense, mathematically.
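The resolution, of course, is that `=` here is a command to rebind the name, not an equation; a tiny sketch (illustrative values):

```python
x = 5
x = x + 1  # read as: "let the new x be the old x plus one"
assert x == 6

# Run it again and x moves again: assignment rebinds the name each time,
# which no equation could do.
x = x + 1
assert x == 7
```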

------
jordanpg
The only important trait I see that matters for either of these groups is a
willingness to try things, push buttons, see what happens.

A beginner worries about breaking the computer and doesn't yet understand that
any question they have can be typed into a search engine _verbatim_ and will
probably be answered with 20 SO posts and 50 blog posts. An early programmer
is stumbling down this road.

I don't know that this ethos can be communicated with a book.

I would also recommend that beginners/early programmers learn 1 programming
language really well, and ignore the din of people on the internet who claim
to effortlessly, expertly jump among 10 languages as part of their day-to-day.

~~~
loup-vaillant
> _I would also recommend that beginners /early programmers learn 1
> programming language really well_

That's a dangerous approach. The first language is very hard to learn,
because, well, it's your first. And when you stick to _one_ language, you
easily conflate the syntax and the semantics.

So when you learn a second language, you have to _unlearn_ the syntax of the
first, in addition to learning the genuinely different concepts.
Distinguishing the similar stuff in new clothes from the actual new stuff is
hard. Simply put, learning the second language will be very hard as well.

Now your programmer has two data points, and knows in their gut that learning
a new programming language is _hard_. This sets expectations, and will make it
harder to learn additional languages. It will take some time to realise that
learning a new language, besides a few insane exceptions like C++, is not that
hard.

> _ignore the din of people on the internet who claim to effortlessly,
> expertly jump among 10 languages as part of their day-to-day._

Jumping from language to language may not be that easy. But one can certainly
be an expert at 20 programming languages. Once you see the commonalities,
there isn't much to learn. Really, a good course in programming languages is
enough to get you started. The hard part is memorising 20 big programming
_frameworks_ , with all their warts, special cases and so on. Still, if you
know the concepts, learning the vocabulary takes little time.

~~~
jordanpg
I just don't see it. No one would ever recommend learning Spanish and Mandarin
at the same time, for any reason.

All of the things you said are true, and yet the beginner has only so much
time, so much patience, so much learning to do in one day.

Given this, I see larger advantages to spending all of that time and energy in
_one_ ecosystem. There are many perspectives on, say, Java coding styles,
patterns, and idioms. One need not go outside a language to do that.

And I would further argue that you simply cannot (usefully) see the global
commonalities and idioms among languages until you've been doing this for a
while. Years.

As for experienced programmers, I've not known any "experts" at 20 languages,
ever. My point was really that this idea is simply the result of run-of-the-
mill internet hyperbole.

~~~
macNchz
Learning two programming languages at the same time is definitely not
comparable to learning Spanish and Mandarin at the same time...those two
languages are so different that you won't gain anything from it. Learning,
say, Spanish and Italian at the same time might be a better analogy, since
you'll start to see word roots and constructions that are shared among romance
languages.

I think learning more than one programming language at the same time is a
great way to help you tease out basic programming concepts from the vagaries
of an individual language's syntax. How do types work? Scopes? Functions?
Loops? Arrays? Hash tables? Those are all things that, once you really grok
them as separate from, say, whitespace problems in Python or curly-brace
issues in C, allow you to much more easily read and eventually pick up other
languages.

~~~
codexjourneys
Although I can't speak to learning two spoken languages simultaneously,
learning a language similar to one I already knew (I knew Spanish, tried to
learn Italian) was insanely difficult, because my brain couldn't distinguish
them enough. It would get in a loop of searching for Italian words and running
into Spanish words and then mixing them up.

On the other hand, learning German after already knowing Spanish was much
easier -- and it was much easier to see the similarities and differences
between the languages, because my brain would find the Spanish phrase while
searching for the German one, and vice versa, but wouldn't get stuck in a loop
about it.

------
mcgrootz
Zed Shaw is a natural when it comes to teaching beginners. I recommend his
"Learn The Hard Way" books to everyone who is interested in learning to code
because they make zero assumptions and start at the VERY beginning. It's
stupidly hard to find great books for complete noobs.

I'm totally behind this distinction, and I hope more content publishers adopt
something like this.

~~~
dvanduzer
If you're behind this distinction, I'd ask that you abandon the negative slang
about someone who is new to a topic.

The hacker tradition is revered in large part because it seems to be so
egalitarian. We all came from humble beginnings. Anyone who can grasp enough
of the mathematics of the stuff can make the gizmo do something magical.

It's common to be self-deprecating about our former ignorant selves, and
maybe some people find it encouraging to hear something like "don't worry, I
was once a noob myself." But really, the word emphasizes the moments where you
felt like an idiot. That doesn't help a beginner.

[https://en.wikipedia.org/wiki/Shoshin](https://en.wikipedia.org/wiki/Shoshin)

~~~
Nimitz14
or use the normal term: newbie

------
rday
I was bitten by this as well, I thought the book was for an "early programmer"
not a total beginner.

Hindsight and all, it seems the book would have been better titled "Learn to
Program the Hard Way (using Python)", or "Learn to Program the Hard Way (using
Ruby)". A total beginner is really trying to learn how to build a program, not
trying to learn a particular language (whether they know that or not).

~~~
Roodgorf
I think your parenthetical at the end explains the marketing strategy behind
the book's actual name, as opposed to your suggested titles. In my experience,
when talking with people who want to learn to program, their first question is
generally "What's the best/easiest language to start with?". Having no
knowledge of coding whatsoever leads one to focus on comparatively superficial
things like choice of language, so I can imagine more beginners being drawn to
"Here's how to use Python" than to "Here's how to program".

------
huuu
This is a nice article.

I think it took me three years to understand what a variable was. And I still
don't know why it took me so long to understand and why I suddenly understood
it.

It's not that I didn't know that assigning '1' to 'a' would result in 'a'
having a value of '1', but I didn't understand the concept and workings behind
it. I just thought it was magic.

~~~
tel
There's something interesting here too in that what many call variables are
actually a bit more like "assignables". The upshot is that only in programming
do variables behave this way---distinct and unlike mere "names" which we're
more familiar with from day-to-day life.

So often one "learns (programming) variables" in how they're implemented
instead of merely what they _mean_. Their meaning is much more hairy than mere
naming.

~~~
zeroxfe
Actually, most of the confusion is not about the variable but about the '='
sign, which in mathematics means 'is equal to', while in a programming
language it means 'assign to'. This indirectly changes the semantics of the
variable within the statement, and confuses people.

This is why `x + 5 = 10` makes sense in mathematics, but not in a programming
language.
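Indeed, most languages reject it outright; in Python the statement fails before the program even runs (a small sketch):

```python
# Assigning to an arithmetic expression is a parse error, not a runtime one:
# there is no "solve for x" in an imperative language.
try:
    compile("x + 5 = 10", "<example>", "exec")
    rejected = False
except SyntaxError:
    rejected = True

assert rejected
```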

~~~
lmm
This is one case where I wish Pascal had won. := for assignment, = for
comparison.

~~~
vezzy-fnord
Dates back to one of the ALGOL dialects, actually.

------
uniclaude
IMHO, Zed is right. I have been looking for books targeted to beginner
programmers so I could recommend them to my friends, but most books
unfortunately fail on this point.

A notable exception I found is "Learn You a Haskell for Great Good!". It is as
good for beginning coders as it is for early (or advanced) ones.

The author made the effort to describe some relatively basic things, and it
was simple enough (okay, with a few calls to me here and there) for an Art
major friend of mine to start with programming, and with Haskell. I can't
recommend this book enough.

~~~
technomancy
> I have been looking for books targeted to beginner programmers so I could
> recommend them to my friends, but most books unfortunately fail on this
> point.

My favourite book for this by far is How to Design Programs:
[http://htdp.org](http://htdp.org)

It assumes knowledge of arithmetic and maybe a tiny bit of algebra but not
much else.

------
morganvachon
I feel like I'm perpetually stuck between what the author describes as
"beginner" and "early". I understand what programming is, I can write a bash
script that does what I want it to (granted, I have to read a ton of man pages
to make sure I understand what it is I want to accomplish), I can write simple
programs in Visual Basic or Python or Javascript that do simple tasks. I
understand program flow, logic, and all the basics of high-school level
algebra.

The problem is, I can't wrap my head around many of the concepts I read about
here in the HN comments and elsewhere on programming blogs and such. No matter
how much I try to understand it (and by understand it, I mean fully grasp what
the person is talking about without having to look up every other word or
phrase), I can't seem to put it all together. Things like inverted trees,
functional programming (I've heard of Haskell and I'd love to learn it, but I
have no head for mathematics at that level), polymorphism, and so on.

Maybe I need to just practice more; maybe I need to pick something interesting
from Github and dive into the code to try to understand it better (preferably
something well documented of course). Or maybe I need to just stop, and accept
that I can whip out a script or simple web thingy if I really need to, and
stick to being a hardware guy, which I'm actually good at.

~~~
octatoan
Haskell doesn't require you to learn mathematics first (category theory, if
you've heard of it).

Grab Learn You A Haskell[1] and have fun. ;)

[1]: [http://learnyouahaskell.com](http://learnyouahaskell.com)

~~~
klibertp
> Haskell does not need you to learn mathematics

...but it makes you feel stupid if you don't. Better to use "Real World OCaml"
if you're more interested in the ideas themselves than in their formalizations
or related nomenclature.

~~~
morganvachon
> ...but it makes you feel stupid if you don't.

Exactly. That's where I'm at right now; I know what Haskell is, I love the
idea of it, I've enjoyed some of the fruits of it (XMonad). But it was when I
tried to dig deeper into it that I felt lost, and yes, stupid. I've never been
a math whiz; I am great at visualizing concepts but truly grasping the theory
behind them is where I get lost. Based on my junior high school testing, I was
placed in Advanced Algebra in my first year of high school. I nearly failed
the class because it took me all year to grok the distributive property. I
look back on that and I feel ashamed, because once I understood it, it seemed
so damn simple! And so it is when I try to advance beyond my current level of
programming skill; I hit brick walls and I feel like I left my sledgehammer at
home. My pocketknife, even though I know every millimeter of it, won't cut
through those walls.

~~~
klibertp
I don't think the concepts are hard to understand, I think that - in Haskell -
they're just being presented in a way that is incompatible with my way of
thinking.

Having found Haskell materials as simply not suited for me I decided - quite a
few years back - to learn Haskell (or the concepts behind Haskell, at least)
my own way: by learning first Erlang (it sounded cool), then Scheme (mainly to
be able to read many, many papers that use it), then OCaml and Scala (because
of the type systems and pragmatism) and finally Clean (to fill the last gaps in
my knowledge). I progressed from dynamic to static typing and from eager to
non-eager evaluation. It took me I think about 2 years to do all this and, of
course, it wasn't that easy, but somewhat surprisingly it worked. I never
wrote - and I'm not sure I ever will, but that's a completely different matter
- any non-trivial Haskell code, yet I'm able to read and enjoy Haskell-related
papers.

It's important to realise that there is always more than one way to learn
things. You should know yourself well enough to see when the "normal" way
simply isn't for you; this way you can go search for alternative ways. I
guarantee that you'll find them, if you search hard enough :)

------
r0mbas1c
I have been thinking this for years.... though I would consider myself an
"early coder" according to the article.

This stuck out to me as being just the beginnings of the quintessential issue:

    
    
       A beginner’s hurdle is training their brain to grasp the concrete problem of using syntax to create computation and then understanding that the syntax is just a proxy for how computation works. The early coder is past this, but now has to work up the abstraction stack to convert ideas and fuzzy descriptions into concrete solutions. It’s this traversing of abstraction and concrete implementation that I believe takes someone past the early stage and into the junior programmer world.
    

But why stop at just "beginner", "early", and "advanced"? All of the books I
have on programming are either truly "beginner" or blankly labeled as
programming guides, when in actuality they are quite "advanced"...nothing in
between.

If, as the article states, 4 is the magic number of languages to learn up
front, perhaps there should be a 4th level of programming guides....one for
the journeyman who knows the syntax, can articulate the complex algorithmic
issues that need to be addressed, but isn't quite at that "mastery" or
"advanced" level.

~~~
chadzawistowski
Please don't use code tags for quotations! It doesn't wrap lines and forces
readers to scroll in order to read the whole quote.

Code tags are okay if you manually line-wrap, or you can precede quotes with
a > and everyone will recognize it as a quote.

------
VeejayRampay
Programming is a frustrating job, you're pretty much doomed to be a beginner
forever. It's part of what makes it exciting day in and day out, but it can
also be overwhelming.

~~~
jerf
No, there's definitely an underlying substrate of significant commonality
between the various programming languages and technologies. If you're at 10
years in and you still feel like a beginner, you're doing something wrong.

Obviously I can't expect to pick up a brand new technology and instantly
expect to be a wizard, but I do expect that I can pick up a new technology and
be functioning at a high level in a week or two, tops, because it's almost
certainly just a respelling/reskinning of some technology I've used before.

(The whole "young guys who know way more than their old-fogey elders" thing
was, in my opinion, an isolated one-time event when we transitioned from
mainframe tech to desktop tech. Despite its recurrence on HN, I think "age
discrimination" is naturally receding and will just go away as the people on
this side of that transition continue to age, and skill up.)

~~~
MrDosu
If you pick up a new tech and it's " just a respelling/reskinning of some
technology I've used before" you are doing something very silly or are not
using new tech at all. If it's basically the same there is no reason to
switch.

~~~
rgbrenner
jerf is right.. you have a mental model of how a programming language works,
and a new language is just changing the syntax used to represent those same
concepts.

Does that mean there's no reason to switch? No.. since some languages are
better at representing some ideas; some languages have better abstractions for
certain ideas; etc.

I think starting with a lower level language helps with this way of thinking.
If you learn everything about C (for example), and later learn a higher level
language, it's easy to think of how you would implement a certain feature of
the higher level language.

~~~
c22
> I think starting with a lower level language helps with this way of
> thinking. If you learn everything about C (for example), and later learn a
> higher level language, it's easy to think of how you would implement a
> certain feature of the higher level language.

I cannot agree with this more. A lot of people seem to think that a "better"
language to learn programming with is one that is "easier" or "more
forgiving", but everyone I know who started with C became excellent
programmers whereas ability among the group who started with something else is
somewhat more hit or miss.

~~~
rgbrenner
Yes.. I think it's because C forces you to think about things you would never
have to think about in a higher lang.

For example, I know exactly how garbage collection works.. since I once had to
write a GC for a project. So when I use a higher lang, that part isn't
magic... it's just something someone else already wrote for me.
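For the curious, the core of such a collector fits in a screenful. Below is a toy mark-and-sweep sketch in Python; `Obj`, `heap`, and `roots` are made-up names for illustration, and a real collector walks actual memory rather than a Python list:

```python
# Toy mark-and-sweep garbage collector (illustrative names throughout).
class Obj:
    def __init__(self, name, refs=()):
        self.name = name
        self.refs = list(refs)  # outgoing references to other objects
        self.marked = False

def mark(obj):
    """Mark phase: flag everything reachable from a root."""
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def sweep(heap):
    """Sweep phase: keep marked objects, discard the rest."""
    live = [o for o in heap if o.marked]
    for o in live:
        o.marked = False  # reset the flag for the next collection cycle
    return live

# Tiny heap: root `a` points to `b`; `c` is unreachable garbage.
b = Obj("b")
a = Obj("a", refs=[b])
c = Obj("c")
heap = [a, b, c]
roots = [a]

for root in roots:
    mark(root)
heap = sweep(heap)

assert sorted(o.name for o in heap) == ["a", "b"]
```

Once you've written even this much by hand, "the runtime frees unreachable objects" stops being magic and becomes a pair of loops.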

Whereas, if you started with a higher level lang, you could get by without
ever learning how a GC works. Yes, you could dive into the details of your
language, but there's no requirement for you to do it.

And I think that explains what you've noticed... it's hit or miss because
those who chose to dive into the details of their lang eventually became
excellent developers...
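
To make the GC point concrete, here is a toy mark-and-sweep collector in C. Everything here (the `Obj` layout, the global heap list) is invented for illustration; real collectors are far more sophisticated, but the mark-then-sweep core is the same idea:

```c
#include <assert.h>
#include <stdlib.h>

/* Toy mark-and-sweep collector over a heap of two-pointer objects. */
typedef struct Obj {
    struct Obj *next;   /* links every allocation, for the sweep pass */
    struct Obj *left;   /* references to other objects */
    struct Obj *right;
    int marked;
} Obj;

static Obj *heap = NULL;   /* head of the list of all allocations */

Obj *gc_alloc(void) {
    Obj *o = calloc(1, sizeof *o);
    o->next = heap;        /* push onto the all-objects list */
    heap = o;
    return o;
}

/* Mark phase: flag everything reachable from an object. */
static void mark(Obj *o) {
    if (!o || o->marked) return;
    o->marked = 1;
    mark(o->left);
    mark(o->right);
}

/* Collect: mark from the roots, then sweep and free the unmarked. */
size_t gc_collect(Obj **roots, size_t nroots) {
    for (size_t i = 0; i < nroots; i++) mark(roots[i]);
    size_t freed = 0;
    Obj **p = &heap;
    while (*p) {
        if (!(*p)->marked) {
            Obj *dead = *p;
            *p = dead->next;    /* unlink and free the garbage */
            free(dead);
            freed++;
        } else {
            (*p)->marked = 0;   /* reset flag for the next cycle */
            p = &(*p)->next;
        }
    }
    return freed;
}
```

Once you've written even a toy like this, "the GC freed my unreachable objects" reads as a description of a sweep loop, not as magic.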

------
merrickread
Last fall I went through a coding bootcamp in Toronto. It was 9 weeks of hard
work sprinkled with lots of frustration and lots of feel-good successes. One
main takeaway I had was that everyone comes in with a different background and
a unique approach to learning.

The problem expressed in this article is a fundamental bottleneck of
education. The communication between teacher and student is often
misinterpreted at both ends and the subject matter is never perfectly conveyed
or received.

I feel what's really lacking in the learn-to-code community is teaching people
how to actually learn. Lay down a positive attitude towards failure and a
framework for problem solving first; the content and understanding of a
language will come after.

------
pbreit
No. Just change the name of the book to "Learn Programming The Hard Way
(Python Edition)". By putting the language in the title, it sounds like it is
for an experienced programmer learning a new language, not for learning how to
program.

~~~
lukas099
True. When I started a major in CS, I had no programming experience. I started
"Learn Python the Hard Way", but when I found out that I needed to learn C for
Compilers, I tried to switch to "Learn C the Hard Way", thinking the two books
were equivalent. They weren't.

------
natural219
This is a fantastic article and is another great example to pile on as to why
Zed Shaw is the king of programming teaching.

One area I struggle with in tutoring is how to inspire/invoke/detect
_disciplined motivation_. What I mean is, whenever I sit down to show someone
something, I'm constantly questioning myself "wait, do they _actually_ want to
learn this level of detail, or am I just giving too much information that's
going in one ear and out the other?" If someone is _definitely_ motivated to
learn that's great (and really inspiring for me as a teacher to do better at
explaining things precisely).

If this nomenclature were more understood, I would like to say something like
"Sorry, what you're trying to do is more of an early/junior task, and right
now you need to stick with the Beginning basics". I just don't know how to
phrase that without sounding condescending.

------
michaelfeathers
Zed points to a very real problem - it's easy for us to forget what we know.
But there's another problem with targeting beginners. They are all over the
place in terms of experience.

Computing is so tightly woven into our world now that it's hard to find people
who have more than a passing interest in it who have not found some way to try
to code as kids. Even with those who haven't, there's a gulf between people
who have tried to do a little HTML editing (and know what a file is) and
people who haven't. There's no one place to start. From Zed's description it
looks like he's starting from the lowest possible point, but what are the
demographics like there? How many people are in that space and are they mostly
adults or children?

I think this is one of the main reasons why you don't see much beginner's
material.

------
MarcScott
"My favorite is how they think you should teach programming without teaching
“coding”, as if that’s how they learned it."

I often wonder about this. In the UK, with the drive to get every child
'coding', there are a large number of teachers that constantly talk about how
the main skill that we should be teaching is 'Computational Thinking'.

I go back and forth on this topic, in a very chicken-and-egg way.
However, I usually end up coming to the conclusion that learning computational
thinking is great, but you need to know how to code (i.e. learn the basic
syntax of a language) before you can possibly learn how to think
computationally.

I would be very interested to hear actual developers' opinions on the topic.

~~~
sirclueless
I don't think these things are all that different, in the very early stages.

The very first part of computational thinking is understanding that you can
make a very specific and precise procedure to accomplish a task. If a student
is at an age where reading and writing is easy, then learning the syntax of a
language is a fine way to accomplish this. The student will spend a lot of
time with each finicky word and symbol to make the computer behave, and while
they may not recognize that they are defining an abstract procedure, the
result is hopefully some intuition that the computer is a very predictable and
reliable machine that does exactly what the code says, even if it's not what
you meant. With exposure to more languages and by writing more programs,
hopefully a student begins to recognize patterns and abstractions in their
code, and that's the point at which they become real computational thinkers.

If a student isn't ready for that, there are still fun things to try. One cute
one I've seen is a "program your parent" exercise at a workshop. The child can
make their parent move one step forwards or back, turn left or right, pick up
and put down an object, and put one thing inside another. Can they make their
parent pour a glass of juice? Or put a lego back in the box?
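
That exercise translates almost directly into code. Here is a minimal sketch in C of a "parent robot" on a grid that blindly executes a string of single-character instructions (the command set and names are invented for illustration):

```c
#include <assert.h>

/* A toy "robot" that executes F (step forward), L (turn left),
 * R (turn right), one character per instruction, on a grid. */
typedef struct {
    int x, y;
    int dir;    /* 0 = north, 1 = east, 2 = south, 3 = west */
} Robot;

void run(Robot *r, const char *program) {
    static const int dx[] = { 0, 1, 0, -1 };   /* x step per heading */
    static const int dy[] = { 1, 0, -1, 0 };   /* y step per heading */
    for (const char *c = program; *c; c++) {
        switch (*c) {
        case 'F': r->x += dx[r->dir]; r->y += dy[r->dir]; break;
        case 'L': r->dir = (r->dir + 3) % 4; break;
        case 'R': r->dir = (r->dir + 1) % 4; break;
        }
    }
}
```

The machine does exactly what the program says, even when that isn't what you meant -- which is precisely the intuition the exercise is after.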

I don't think there is a chicken and egg problem here, because learning to
make a dumb machine perform a task by following a procedure is the essence of
computational thinking. Learning the basic syntax of a language is probably
the most efficacious way to experience this for many students of many ages,
even if the explicit goal is "do well on an AP test" or something mundane.

------
NTDF9
Part of the problem is that there are too many things each language can now
do. Every single language wants feature parity with every other language.
Every single language wants to do everything.

This means an expert in one language is going to be "Beginner" instead of
"Early" in some ways... but "Early" instead of "Beginner" in other ways.

Anecdotally, as a software engineer working with C++, I had to spend a whole
month trying to understand the event-driven programming of other languages. I
didn't really need tutorials on loops and recursion, but I sure as hell needed
to understand how a typical program in that language works.

------
smilefreak
Great article.

I am an instructor for Software Carpentry[1]. In my experience, the goal of
these workshops is to try to help mostly scientists get started on the journey
to becoming _early_ programmers.

In the biological sciences, with more and more data becoming available, the
expert blindness Zed speaks of is a major problem. We need to invent better
systems and actually take heed of research-based teaching methods, as Software
Carpentry does, if we wish to improve this situation.

[https://software-carpentry.org/](https://software-carpentry.org/)

------
orbitingpluto
The article reminds me of how math textbooks/topics are labelled:

      Elementary Differential Equations (third year math)
      Elementary Symbolic Dynamics (grad-level)

~~~
clebio
Oh, yes. Thank you.

> An Introduction to Group Theory

(500 pages)

------
cafard
A useful distinction.

------
OAR
I tried to contact Zed about a month back, to ask him this question.

I tried through his blog comments, and at the help email he has for his HTLXTHW
courses, but never got a response.

I dunno if he just never noticed it, or if he's actually ignoring me for some
reason, but having already typed out this question with all the necessary
context, I figure I may as well post it in a public place where it's relevant, so
here:

Hi Zed,

So, I found [this comment of yours on
HN](https://news.ycombinator.com/item?id=1484030)
by googling:

> site:news.ycombinator.com zedshaw
> engelmann

and was pleasantly surprised to find you explicitly mentioning Siegfried
Engelmann and Direct Instruction.

Here's the story:

I learned about your "Learn X the Hard Way" series through a friend who had
learned Python from your course.

He told me he heard you knew about Zig and DI.

I immediately said something like:

> Nah, pretty much nobody has heard about DI, much less properly appreciates
> it.

> Probably Zed just meant lowercase "direct instruction" in the literal, non-
> technical sense of "instruction that is somehow relatively "direct"".

> He's probably never heard of uppercase "Direct Instruction" in the technical
> sense of "working by Engelmann's Theory of Instruction".

But then I googled, and yeah, aforementioned pleasant surprise.

(I am just not going to say anything, outside of these brackets, about
"blasdel" there.

If medicine was like education, the entire field would be dominated by the
anti-vaxxers.

Hey, blasdel! supporting "Constructivism" is morally at least as bad as
supporting anti-vaccination!

Bah, whatever. Okay, got that out of my system. Anyway. xD )

So now I'm really curious:

You said you "_learned quite a bit about how to teach effectively from [Zig
and Wes]_".

But _how_ did you learn from them?

You haven't slogged your way through the "Theory of Instruction: Principles
and Applications" text itself, have you?

I have, and _wow_ was that a dense read... Which is frustrating, because as
you're reading, you can see, abstractly, how they could've meta-applied the
principles they're laying out to teaching the principles themselves --[the
open module on Engelmann's work at
AthabascaU](http://psych.athabascau.ca/html/387/OpenModules/Engelmann/)
includes a small proof-of-concept of that, after all-- but apparently they
just didn't feel it was worth the extra work, I guess...?

(Zig _did_ [say that the theory is important for
"legitimacy"](http://zigsite.com/video/theory_of_direct_instruction_2009.html)
--ie, having a response in the academic sphere to the damn "Constructivists"
with their ridiculous conclusion-jumping-Piaget stuff and so on-- and that's
the only practical motivation I've ever heard him express for why they wrote
that tome in the first place.)

Have you read any of the stuff he's written for a "popular" audience, like
these?:

- [Could John Stuart Mill Have Saved Our Schools?](http://www.amazon.com/Could-John-Stuart-Saved-Schools-ebook/dp/B006YY1WCQ/ref=sr_1_1?s=books&ie=UTF8&qid=1424448205&sr=1-1&keywords=siegfried+engelmann+schools+mill)

- [Teaching Needy Kids in Our Backward System](http://www.amazon.com/Teaching-Needy-Kids-Backward-System/dp/1880183005)

- [War Against the Schools' Academic Child Abuse](http://www.amazon.com/War-Against-Schools-Academic-Child/dp/0894202871/ref=sr_1_1?s=books&ie=UTF8&qid=1424448224&sr=1-1&keywords=siegfried+engelmann+schools+war)

(god has Zig got a way with picking titles...)

But basically what I really want to ask you, which all that was just to
establish context for, is just:

In developing your Python course, how _did_ you use your knowledge of DI?

------
alexashka
Someone's having a bad day :)

If, in the world of programming, the biggest issue you're running up against
is 'this is too basic', then great :)

If it's too basic, go read something else, no problem. If you're going to get
anywhere in this world, you'll have to know how to research. Skimming and
figuring out if something is useful or not is a valuable skill - now more than
ever. So whoever complains about a well-written book not suiting their fancy -
it is their problem, not yours.

------
rilita
tldr:

- Books written for "beginners" target people who already know how to code

- Author's book targets people before that

- Most programmers are bad at teaching people how to code

- Recommends some arbitrary phraseology to differentiate levels of ability

- Until someone learns the basics of 4 languages they don't really know how
to code

- Demands people only use the term "beginner" for people who can't code, and
"early" for those who can.

This is great and all, but it comes off mostly like a whiny complaint about
how most development books are aimed at a group of people who already have a
basic knowledge of coding.

This has already been addressed by the so-called "Dummies" series of books.
They were aimed directly at the audience the author says is being left
behind.

I'm not sure I am seeing a real issue here. Go to the bookstore, browse
through the books, pick the one you can comprehend and seems to be aimed at
whatever your level is. Done.

~~~
calibraxis
> it comes off mostly like a whiny complaint

And how would you say your post comes off?

~~~
rilita
A warning for people who value their time.

~~~
clebio
It's also a fairly short read. I cull media mercilessly, but this was easy
enough to just read, in full.

