

Why teach with Ruby? - steveklabnik
http://blog.hackety-hack.com/post/1313406925/why-teach-with-ruby

======
dusklight
I think Ruby is a beautiful language, but you really have to consider it in
the context in which it was invented. Matz originally created it as a kind of
spiritual successor to Perl, and its original incarnation was an easy-to-write,
easy-to-read, fun-to-use scripting language. Rails came along later -- I don't
think anyone was really expecting it before it showed up -- and it changed the
Ruby conversation entirely.

Ruby is excellent in many ways, but I feel like Matz never took into account
what is needed for a group of programmers to work on the same code base. I
don't want to say he completely dropped the ball or anything like that, but
stuff like monkey patching -- and other shenanigans that make life easier if
you are the sole programmer on a code base -- becomes problematic when you are
working in a largish team where not everyone might know everything that is
going on.
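
To make that concrete: monkey patching just means reopening a class, even a
core one, at runtime. A minimal sketch (the "shout" method is made up for
illustration):

```ruby
# Reopen the core String class and bolt a new method onto it. Every
# string in the entire process gains it, which is exactly why this bites
# on a big team: code far away from yours can change behavior you depend on.
class String
  def shout
    upcase + "!"
  end
end

puts "hello".shout   # prints HELLO!
```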

What I am trying to say is, IMHO Ruby "feels" like it is a good language to
teach in, but you can end up using it for a really long time without acquiring
programming discipline and fundamentals. If you start in C, for example, and
you have to mess with malloc and pointers and stuff like that, you are forced
to get that discipline, because otherwise stuff goes south really fast.
Someone who starts in Ruby might never really understand the performance cost
of initializing an object, or what the stack and heap are, or why they matter. I
think it would be better for a new programmer to start out in C or even
assembly for a little while, just to understand how a computer really works,
then it becomes so much easier to understand why languages like Ruby and
Python (and my personal favorite, Clojure) were created and how to use the
abstractions that these more modern languages provide.
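
As a rough illustration of the kind of cost that stays invisible: in MRI you
can at least count the objects an innocent-looking loop churns out. A sketch
(MRI-specific, with GC disabled so the counts only grow):

```ruby
# ObjectSpace.count_objects reports live object counts by type in MRI.
# With GC off, the delta shows how many strings this tiny loop allocated.
GC.disable
before = ObjectSpace.count_objects[:T_STRING]

10_000.times { "hello" + "world" }   # each pass builds fresh string objects

after = ObjectSpace.count_objects[:T_STRING]
puts "allocated roughly #{after - before} string objects"
GC.enable
```

A beginner who writes only the loop never sees any of this, which is the point.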

~~~
dkarl
I like your point about C, and I have a related point: When beginners see
something that _looks_ or _reads_ like natural language, it activates the
wrong part of their brain, and they start to wonder why the computer doesn't
"understand" what they're telling it. The idea of a logical machine mindlessly
executing symbolic instructions comes pretty naturally for some people, but
for others, it's the first Big Idea in programming that they have to spend a
lot of time getting used to. Every time they read a line of Ruby code that
sounds kind of like stilted English, it causes a little regression in their
brain back towards the mental model of the computer as something intelligent
that understands the natural-language meanings of keywords and variable names.
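
Ruby supplies plenty of fuel for that regression. Each line below is ordinary,
valid Ruby, and each scans almost like English (a small illustration of my
point, not an excerpt from anywhere):

```ruby
# Legal Ruby that reads like prose -- the very property that can tempt a
# beginner into expecting the interpreter to grasp intent rather than
# mechanically execute rules.
log = []
stock = 0
log << "restock" unless stock > 0   # "do this unless that"
3.times { log << "hello" }          # "three times, do this"

puts log.inspect   # ["restock", "hello", "hello", "hello"]
```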

Say what you want about BASIC, but programs like "10 ? "HELLO" 20 GOTO 10"
never let you forget you were dealing with a brainless, inflexible automaton.
Same thing with C.

With C, students are encouraged to ask the question, "What's really happening
inside the machine?" There are simple, concrete answers based on a simplified
view of the hardware. With Ruby, the answer to that question would be more
complicated, and more importantly, it would be less concrete -- it's a long,
long way to the hardware. C programmers start out grounded, at least grounded
in a simplified model of the machine that is consistent and often helpful. How
can beginners develop the habit of looking both up and down the ladder of
abstraction, if they're looking down from the mountaintop into a shapeless
gray cloudbank?

~~~
Xurinos
One thing I took away from my SICP experience was the idea that it is
languages/abstractions all the way down. You can play in a super high level
language like Lisp, or you can mold silicon. Somewhere you have to choose your
level of abstraction.

If I am teaching somebody how to think about an algorithm like sorting an
array, worrying about memory allocations is tangential, an inconvenient burden
C delivers unto us. If I want to teach them about working with memory,
pointers, and so forth, C is an excellent language for the job.

A crazy thing is happening here when we look down upon beginners who want
something that feels like a natural language. Why CAN'T we simply say, "I want
an app that opens two windows: one with a list of my music files, and the
other with cassette controls. Make it so."? Someone could develop a language
that does this; AppleScript, for example, has some crazy high-level stuff
similar to this.

We have been mentally shackled by our languages. We think in them. It is
hubris to suggest that this is the canonical way to think when programming --
that, for example, "i = i + 1" is the idiom for incrementing a variable.

I do believe, to be truly effective in our line of work, that our beginners
must eventually come to an understanding of the platform upon which they
develop. They must understand memory concepts. They must understand timing
concepts. They must understand multiprocessing concepts.

But there is no reason to ram C down some poor soul's throat as a first
experience. What a terrible language for teaching beginners, full of syntax
and slopped-together features. And what a great language to teach somebody in
order to prepare them for the industry.

~~~
dkarl
Ruby doesn't encourage you to think in multiple layers of abstraction. The next
step down from C is a simple machine model with instructions, a CPU, and
memory, which is actually kind of helpful for writing code and understanding
performance. That layer is visible in the design of C. (It's an
oversimplification with no concept of caches, buses, instruction reordering,
etc., but those can be added to your mental model when desired.) What's the
next layer of abstraction down from Ruby? A byte machine and a garbage
collector, neither of which is evident in the language design (to a beginner)
and neither of which is especially helpful to think about while writing Ruby
code.

As an abstraction, Ruby just isn't leaky enough. It's _too good_. It provides
no necessity and no reward for thinking about the next layer down. (Plus the
next layer down is pretty sophisticated, and pretty abstract from the
beginner's point of view. A beginner should know his computer has a CPU and
memory and machine instructions, but byte machines and garbage collectors?
They will be a mystery until later in the student's development.)
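
That layer isn't entirely invisible, for what it's worth: modern MRI will show
you the bytecode it compiles to, though nothing in the language nudges a
beginner toward ever looking. A sketch (MRI/YARV-specific):

```ruby
# YARV (the VM in MRI 1.9+) compiles Ruby source to bytecode for its
# internal stack machine; InstructionSequence exposes the compiled form.
bytecode = RubyVM::InstructionSequence.compile("i = 0; i = i + 1").disasm
puts bytecode   # a listing of VM instructions: putobject, setlocal, ...
```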

 _A crazy thing is happening here when we look down upon beginners who want
something that feels like a natural language. Why CAN'T we simply say, "I want
an app that opens two windows: one with a list of my music files, and the
other with cassette controls. Make it so."? Someone could develop a language
that does this; AppleTalk, for example, has some crazy high-level stuff
similar to this._

Natural language promises are a lie. When you understand the rigid logical
nature of a machine, it's fine. There's no disappointment or confusion when
the natural language model breaks; you just fall back from your natural
language mental model to your mental model of the machine as a logical
computing machine. You can do that because you already _have_ that model of
the computer as a mindless, mechanical machine.

What happens if you don't already have that mental model? What if your closest
familiar model for writing a program is writing instructions for another
person? When you don't yet understand the nature of the machine and the nature
of programming, natural language-like functionality intentionally misleads
you. It tries to hide the nature of the machine from you. It will occasionally
fail, and the cumulative effect of dealing with the occasional failures _may_
be that you come to understand that the rigid, uncomprehending behavior that
peeks through is the true nature of the machine, but why force a beginner to
deal with this now-you-see-it, now-you-don't approach to learning how
computers work?

Even worse, a beginner might continue to base his model of the computer on the
"normal" humanistic behavior and just write the "failures" off as mysterious,
uncharacteristic aberrations. When that happens, a programming language that
mimics natural languages is actually much worse for beginners than it is for
experienced programmers, because it encourages beginners to stick with a
mental model by which the machine will always be mysterious and inscrutable.

Ruby isn't that close to natural language, of course, but I think it's close
enough to occasionally revive a beginner's dominant, inappropriate mental
model and hold back their learning. I see that phenomenon in myself with the
"loop" macro in Common Lisp; lacking a precise understanding of loop, my first
impulse is to say things that sound right, write them as code, and then see
what happens. I would never actually get it right that way, of course -- it
would just be an exercise in random frustration unless I strictly copied a
known example -- but somehow it feels like the right thing to do, instead of
developing a valid understanding. When syntax has no pretension to correspond
to natural language syntax, it encourages a more appropriate mindset.

~~~
Xurinos

       Ruby doesn't encourage you to think in multiple layers of abstraction.
       The next step down from C is a simple machine model with instructions,
       a CPU, and memory, which is actually kind of helpful for writing code
       and understanding performance.

Vaguely. The "C is close to hardware" point has been well put to rest by folks
with better knowledge on the compiler intricacies than I. C gives you the
illusion of knowing what you are doing. Your mental model does not help you
much in C, at least not in modern C on modern architectures. The next step
down from C is object code generated for a specific OS in a specific memory
model with a spiderweb of links to other libraries. After that, the OS
processes that object code, sending instructions to your hardware, a
complicated machine model that includes instruction pipelining, memory pages,
and so forth, all of which are difficult to handle by hand without a clear
understanding of the interactions. You pointed that out, but you
oversimplified the issue.

Fortunately, we are rescued from having to have that much understanding by the
C language, which abstracts a lot of concepts nicely. I should point out that
printf(), the first output anyone learns in C, has its own language for
outputting in the form of format specifiers, far removed from direct machine
code. Which instruction do you use for "i = i * 2" -- a multiply or a shift?
Some developers rewrite malloc() because of various inefficiencies in the way
it manages memory, but the beginner can understand malloc() on a very simple
level and be successful.

The nice thing about Ruby, Python, Perl, Java, and many other languages is
that you do not have to worry about the memory drudgery. In all likelihood,
you are going to do it wrong anyway. If I am teaching beginners algorithms,
well, we can focus on memory management techniques later. We can still
understand that there is a computer with a CPU and some memory available to
us.

My point is that you do not have to understand all these layers of
abstraction. It is nice to know something of them later, and we are all better
developers for it. In fact, knowing a little about garbage collection of your
specific language can, for example, help you avoid circular reference memory
leaks.

       When syntax has no pretension to correspond to natural language syntax,
       it encourages a more appropriate mindset.

This is what I mean about being shackled by our languages. We come to believe
there is some "appropriate mindset". In fact, the language of choice creates
the mindset, and your approach to an algorithm will depend on your ability to
translate your will into that language's primitives, means of combination, and
means of abstraction. Functional languages are the latest fad in part because
programmers in general have rediscovered that method of thinking; as you put
it, they have had a "dominant, inappropriate mental model [that holds] back
their learning". Our beginners were shown the "i = i + 1" pattern and not
shown the "for my $i (1 .. 8) {}" pattern.
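
To put that contrast in Ruby terms (my sketch, not anything from upthread):
the hand-rolled counter versus the iteration the language itself teaches:

```ruby
# The "i = i + 1" pattern: step a counter by hand, C-style.
squares = []
i = 1
while i <= 8
  squares << i * i
  i = i + 1
end

# The pattern higher-level languages push instead: describe the whole
# range and let the language do the stepping.
same = (1..8).map { |n| n * n }

puts squares == same   # true -- same result, different mindset
```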

Sure, natural languages may never be natural enough; two individuals in the
same room do not necessarily speak the same English. You have to draw the line
somewhere. It just does not have to be at rock bottom, and we have continued
to evolve, making layer after layer on top of it. We are corresponding here on
a medium that is many levels abstracted from the silicon, and many a beginner
in web design can grasp plenty of HTML without understanding how a specific
browser is rendering it.

I would argue more optimistically that this is not a matter of somebody's
learning being held back. They learned and applied a method that is valid.
When that method became insufficient, bug-ridden, poorly optimized, whatever,
they learned or experimented with a new approach. This is not holding anyone
back in any meaningful sense; they are evolving.

------
z0r
The arguments are all very good, but I think I prefer Scheme as a first
programming language just because the syntax is easy to teach and you can get
to the good stuff much more quickly. Doesn't meet your 'real world'
requirement, though I think a few months spent learning the basics
in Scheme would make it easier to pick up subsequent languages rather than
vice versa.

~~~
steveklabnik
I think Scheme is a fine language, especially when paired with SICP. But I see
it as being something that's more of the leap from intermediate to master than
total-beginner to literate. I could certainly be wrong about this, however!

------
tel
Unrelated, but I often wonder why people insist on a program that prints a
string to the screen as the universal best first program. It's very practical,
but it's also _so_ tangential to what programming is.

Just delay compilation until the third week and you'll be able to start the
dialogue at objects, "is a", "has a", actions, properties. This is the
language we use to deal with the world already and though it's got funny
syntax attached, it's not a foreign concept to anyone. Talk about the runtime
system and its expectations later.

(Edit: I think this is also why a lot of people hate the IO monad in Haskell
so much. Side effects are actually an advanced topic in pure FP so much of the
literature on IO is based around flaky metaphors and "don't worry about how
this works" just to get people printing things ASAP. If you just use the REPL
and don't worry about implementing it then you can build up to understanding
the interaction between IO and the RTS in a sensible way.)

~~~
endtime
>Unrelated, but I often wonder why people insist on a program that prints a
string to the screen as the universal best first program. It's very practical,
but it's also so tangential to what programming is.

The point is that the first step is getting your environment set up so that
you can write[, compile,] and execute code. Printing a string is just a
conveniently trivial program.

~~~
tel
It's true that that's the first step to programming as a craft, but it's
_hardly_ the first step to programming as a way of thinking, managing
complexity, and building abstraction.

That's like saying the first step to flying an airplane is turning the
ignition. Yes, that's true, but without a firm idea of what's going to be
happening after that point you're only going to hurt yourself by turning that
key. It's something that comes to a head when learning to fly a plane/program.

~~~
Avshalom
_It's true that that's the first step to programming as a craft, but it's
hardly the first step to programming as a way of thinking, managing
complexity, and building abstraction._

But programming _is_ a craft, so how is the first step tangential?

------
neovive
I'd like to highlight your mention of Scratch. I agree that Ruby is an
excellent language for introducing older students to programming, but Scratch
is truly exceptional for younger students. Scratch excels at introducing
fundamental programming concepts, while hiding syntax and language subtleties
from beginners. The puzzle/block metaphor of Scratch really works for
"building" a fully functional program. For those who have never worked with
Scratch, the underlying Squeak (Smalltalk) implementation is completely hidden.

I guess the choice of language really comes down to the audience and age
group. It's unlikely that colleges will ever utilize Scratch in their
introduction to programming courses, thus Ruby is an excellent choice. But
elementary and middle school students would benefit greatly from learning
programming concepts such as logic, control structures, iteration, etc. from a
tool like Scratch.

------
binarysoul
I can't really read your post because of the background. I'd recommend
changing it.

~~~
stevelosh
Safari Reader (or the Chrome/FF extensions like it, or the Readability
bookmarklet) fixes it nicely, for this and many other sites like it.

------
jrheard
Background/disclaimer: I learned Ruby first (largely thanks to _why's guide),
but have been using Python at my job and in my personal projects for about a
year and a half now, and have grown to prefer Python.

The main gripe that I have with using Ruby as a teaching language is the
whole magic thing. You address this really well, and I really like your
passage on abstractions, but I don't agree that the magic is just limited to
advanced usage of the language.

Here's my gripe in its most pure form: in Python, in order to figure out _all_
of my capabilities _right now_ (short of what things I can import), I can use
dir(object_i_want_to_know_about) and see everything that object can do. I've
got a friend who's learning Ruby, and we wanted to find a built-in String
method to see if a string starts with a prefix. We found it after Googling,
but I wanted to find out about it in the context of the REPL. I figured that
Ruby must have similar functionality to Python's dir(), but as far as I can
tell, the only way to achieve that is to hack the REPL in a very minor way
(you can add a few lines to your .irbrc that will achieve this). There doesn't
seem to be anything like dir() built into the language, and so it's a lot
harder to figure out what you can do at any given point in time without
Googling.

I think that this difference speaks volumes about the difference in
philosophies between the two languages, and is in large part why when I
introduce friends to programming these days, I tend to do it via Python, in
particular via Zed's book.

I haven't used Ruby much in the past year or two, though, so if I'm completely
mistaken about my assertions here, I'd love to find out about ways in which I
can reason about the options available to me when I'm hacking around in irb.

PS: I really liked the code example near the end of the article.

~~~
gmac
_I figured that Ruby must have similar functionality to Python's dir()_

Sounds like you wanted

    'x'.methods
     => [:<=>, :==, :===, :eql?, :hash, :casecmp, ... ]

or the more focused

    'x'.methods - Object.methods
     => [:casecmp, :+, :*, :%, :[], ... ]

?
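
To search that list rather than scroll it -- e.g. to find the prefix-check
method mentioned above -- grep takes a regexp directly (a small addition on my
part, in case it's useful):

```ruby
# Enumerable#grep filters with ===, and a Regexp matches the symbols
# returned by #methods, so you can hunt for a method by name fragment.
puts 'hello'.methods.grep(/start/).inspect   # includes :start_with?
```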

~~~
jrheard
Yup - that's exactly what I was looking for. Awesome.

~~~
rb2k_
The question is why Python calls it "dir" when you just want to see a list of
the object's methods...

------
horofox
Ruby is awesome.

Please don't compare it with java, that makes me so sad.

Java is so bad, deal with it. The good thing about Java is the JVM, which you
can use with another language like Scala, Clojure, and so on.

Being a programmer is actually funny: you can spend your whole life using 10
frameworks to do something for the web in Java, with a lot of lines of code
and bugs, and think you are so good (like I did), but you're actually so bad.

I'm still trying to learn lisp and start doing some clojure too, can't wait!

~~~
steveklabnik
> Please don't compare it with java, that makes me so sad.

I only do because Java has been the go-to first-year language for colleges
for the past few years. I wouldn't even mention it otherwise.

> I'm still trying to learn lisp and start doing some clojure too, can't wait!

Enjoy! I don't have a ton of Clojure experience, but from what I've seen, it's
pretty awesome.

------
steveklabnik
This post was a lot of work, and I'd love to hear your thoughts, HN. Please
tear me apart. :)

~~~
michaelchisari
It seems a lot of the "magic" with Ruby comes from Rails, and I would say that
teaching a new language using a framework is probably going to be pretty
overwhelming to any beginner.

One thing I would worry about with Ruby as a teaching language is that the
syntax is somewhat unique. I don't think this is a terrible thing; the basic
structure is similar once you get past the syntax, but I think it can trip
people up if it's their first language, and then they move on to
Java/C++/Perl/Python/PHP which seem to follow more of a continuity.

From my experience, I actually think learning a couple of different languages
at once can be really advantageous. I pretty much learned Pascal, C, and
Assembly at the same time, working in the demoscene, and linking them all
together.

The important part of programming isn't the syntax, it's the structure and the
concepts. You can always look in the manual for syntax, but structure and
concepts are the hardest parts to grok.

So maybe my suggestion is to have side-notes for the examples that say,
"here's how you do this in PHP, here's how you do it in Java, here's a
breakdown of the differences and similarities."

On the other hand, I may be completely wrong, because I'm biased based on how
I learned programming, and not everyone learns that way.

With all that said, this is a pretty great piece.

~~~
SlyShy
As a TA for an intro to programming class taught using Java, I wish, wish,
wish we were using Ruby instead. I spend 70% of my time teaching people Java's
obtuse syntax, rather than teaching the underlying concepts. Once a programmer
has developed the concepts, she can code in nearly any language, but she
really has to learn the concepts. Ruby's syntax is straightforward enough that
little kids get it down in days, so students can actually learn computer
science rather than Java science.

~~~
mike-cardwell
It depends on what you're learning. If all you're learning is to memorise blocks
of code, then Ruby is probably easier to learn because there's less of it. If
you're trying to learn what a programming language is actually doing, I don't
think Java is any more difficult than Ruby. If anything, most Java code is a
lot more clear about exactly what is going on.

FWIW, I learnt and tutored Java at University and have recently been learning
Ruby and Ruby on Rails for fun.

~~~
SlyShy
Learning what a programming language is actually doing is good, but arguably
that belongs in a different course from an intro to CS course.

One big advantage Ruby has over Java is that irb is simply a snap to use.

------
poink
I make a living with Ruby and Rails, but I'd stick with JavaScript if I were
learning today. Server-side JS is a LOT of fun.

