
Ending the Era of Patronizing Language Design - raganwald
http://blog.objectmentor.com/articles/2009/07/13/ending-the-era-of-patronizing-language-design
======
okmjuhb
This article has so much crazy that I almost don't know how to respond to it.

C++ coddles programmers and Ruby doesn't. Really? That sounds like a sensible
thing to say?

C++ avoided reflection because it would be misused by programmers? Only a
profound misunderstanding of the design goals of C++, or having never actually
used it, could inspire such a claim.

ActiveRecord is a Rails innovation that took decades to realize? What? Why
would you think this?

------
Artemidoros
_So, why aren’t more people crashing and burning?_

because:

a) You're awesome as long as you're using Ruby, no matter how ridiculous you
look when seen without rainbow-colored glasses -
<http://www.infoq.com/presentations/ford-large-rails>

b) The "you're doing it wrong crowd" will shout you down the moment you admit
having troubles - [http://glu.ttono.us/articles/2007/04/15/on-twitter-rails-
and...](http://glu.ttono.us/articles/2007/04/15/on-twitter-rails-and-
community)

c) Most failed Ruby (which in all fairness usually means Rails) projects never
gain any publicity.

d) Ruby projects coming close to the gargantuan scope of some of the bigger
Java projects are (thankfully) exceedingly rare (if not non-existent).

Lastly I had no clue that Saeed al-Sahaf is now working for Object Mentor,
awesome!

Disclaimer: I worked for 3 years as a Ruby developer, including on one of the
bigger (> 50,000 LOC) Rails projects out there, and I encountered some crashing
and burning myself (I know I did it wrong).

------
gfodor
The author acts as though languages like Ruby, which defer to the programmer
not to screw himself, are a new thing. The fact that the "enterprise" has been
trapped with C++, C#, and Java for the last two decades says less about
language designers in general (as the author posits here) and more about the
thinking of large organizations in choosing languages to build their software
in.

------
btilly
He admits that he doesn't know Ruby deeply, but then he goes on to assert that
Ruby people don't seem to create messes that paint themselves into corners.

From what I've read plenty of people have done just that in Ruby because of
monkeypatching.
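
(Not Ruby, but a minimal Python sketch of the kind of corner this describes,
with made-up class and "plugin" names: two pieces of code patch the same
method, whichever loads last silently wins, and the other's callers break.)

    # hypothetical: two "plugins" both patch Order.total; the last one wins
    class Order:
        def __init__(self, items):
            self.items = items

        def total(self):
            return sum(self.items)

    # plugin A wraps total() to add tax
    _orig_total = Order.total
    Order.total = lambda self: _orig_total(self) * 1.08

    # plugin B, loaded later, replaces total() outright and drops the tax
    Order.total = lambda self: sum(self.items)

    print(Order([10, 20]).total())  # 30, not 32.4: plugin A's patch is silently gone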

~~~
swombat
I've been coding in Ruby/Rails for 3 years now, and been active in the
community, and I haven't yet done that or encountered anyone who had done
that. People are warned that monkey-patching is dangerous, they learn how to
monkey-patch responsibly, and they act accordingly. If something does go wrong
with monkey-patching (which, as I've said, seems not to happen), the person
who did the patching is aware of why things are going wrong and doesn't blame
it on the language. I can't remember reading a single query in, say,
#rubyonrails where the problem was due to monkey-patching.
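
(For illustration, and not Ruby: one way to keep a patch scoped and
reversible, using Python's standard-library unittest.mock; the Clock and
report names here are invented.)

    from unittest import mock

    class Clock:
        def now(self):
            return "real time"

    def report(clock):
        return f"generated at {clock.now()}"

    # patch.object swaps the method only inside the with-block,
    # then restores the original, so the change can't leak elsewhere
    with mock.patch.object(Clock, "now", lambda self: "fixed time"):
        print(report(Clock()))   # generated at fixed time

    print(report(Clock()))       # generated at real time -- patch is gone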

~~~
Artemidoros
_I've been coding in Ruby/Rails for 3 years now, and been active in the
community, and I haven't yet done that or encountered anyone who had done
that._

Than you seem to have had much more luck than I did.

~~~
hernan7
Grammar nitpick: That would be "then", not "than" (this particular error never
fails to gum up my non-native English speaker's parser).

~~~
lmkg
Nitpick nitpick: This statement refers to the first "than," not the second,
which is used correctly.

------
ccc3
Another problem I see with creating artificial barriers is the demoralizing
effect it can have on someone trying to learn a language. Whenever I'm
prevented from doing something because some committee thought it was too
complex for me, I begin questioning whether I'm using the right tool. Who
knows what other random rules are going to jump out and bite me later.

I would much rather do something the wrong way and be burned by it (and
probably learn a lot in the process), than be prohibited outright from using
certain techniques.

------
Rickasaurus
I like the idea of having the freedom to do everything but also having
intelligent defaults to nudge the programmers in the right direction. Optional
immutability is a great example of this. When you make something the default,
you send the message "this is usually the right way to go about things".

~~~
prodigal_erik
When immutability is optional, you can no longer rely on it to reason about
the program. Specifically, you have to verify that any two pieces of code
you're worried about actually don't interact, because you no longer have a
guarantee that they can't. I think a lot of restrictions are like this.

Edit: To quote a great comment on the article,

> Invariants make software more reliable. Weaker languages have stronger
> invariants. There’s less stuff to test.

------
barrkel
From the outset, I'm aware that programmers who haven't tried to design
programming languages will likely disagree with me; but I'll express what I've
learned anyway.

The truth, from the language designer and compiler author's perspective:
programmers should be protected from their own worst instincts, and
programming languages that do this well are of a higher quality than languages
which don't. Patronizing, to a point, is good. The flip side is that features
whose use should be encouraged should be included in the box, turned on by
default, made easy to use, etc.

Ruby is patronizing because it's memory safe. C and C++ programmers don't like
this kind of patronization. Ruby, and Python, are unlike C and C++ in this
way: they are rigid and uncompromising in their type safety, disallowing
unsafe casts and operations.

Michael makes some category errors in his rant. For example, he says:

 _"[...] we’ve made the assumption that some language features are just too
powerful for the typical developer -- too prone to misuse. C++ avoided
reflection"_

The problem with this juxtaposition is that C++ avoided reflection in its quest for
power (or what C++ users thought was power), rather than its avoidance of
power: C++ users wanted to hew to a philosophy of pay only for what you need.
The problem with that philosophy is that if you make certain features
optional, or pay as you go, then the community in general cannot assume that
everyone is using available features. Instead, third-party library authors
must assume that only the lowest common denominator set of features is
available if they want to maximize usage.

Java and C# have avoided multiple inheritance for good reason; and Michael's
rant is not a good reason for them to reintroduce that feature, because it
remains problematic.

 _"The newer breed of languages puts responsibility back on the developer"_

This is simply untrue, and bizarrely myopic, in my view. The developer always
has responsibility, but the responsibility has shifted to different places, as
the emphasis on abstraction level in programming languages has shifted. To
take Michael's thesis at face value, you would need to believe that C
mollycoddled the user and didn't bite their fingers off when they didn't take
care of their pointers, or carefully handle their strings, or foolishly used
fixed-size buffers for dynamic input.

Of course C put responsibility in the hands of the developer for these things.
But guess what? Ruby, Python, C#, Java etc. all take away responsibility from
the developer for these things! Michael says that dynamic languages like Ruby
hand over the power of monkey-patching etc. to the developer, and that this is
a new development; but to get equivalent power of dynamic behaviour overriding
in a C application, you'd be using very abstract structures, likely looking
like a Ruby or Python interpreter behind the scenes, where you would have a
similar degree of responsibility. But not only that; you'd also be responsible
for the dynamic runtime, as well as the memory management and all the other
unsafe stuff that comes with C.

~~~
j_baker
Interesting points, but I think you're bordering on straw manning the author.
I'll grant you that maybe C++ is about as bad an example as you can get of
putting responsibility in the hands of the programmer.

In fact, it was behind a lot of Java's decision to take that power _out_ of
the developers' hands. I would argue that Java was an overreaction. It
corrected the safety issues, but also removed a lot of the ability to create
abstractions. Want a global variable? Those are bad, don't use them. Want
multiple inheritance? That's bad, don't use it. Want operator overloading?
That's bad, don't use it.

The problem is that the fact that those things are bad in some cases (arguably
even most cases) doesn't mean they should be forbidden. I know what I'm
doing. If I want to use a global variable I should be able to. I know what I'm
writing. The language designer doesn't.

In other words, to make a long story short: I don't think he was trying to say
that responsibility needs to be in the developers' hands in _all_ cases. But I
think that he makes a valid case that newer languages are right in moving some
of that responsibility back to the programmer.

~~~
jmillikin
Java supports global variables just fine -- in fact, most Java code uses far
too many globals for my taste.

Multiple inheritance and operator overloading aren't _bad_ , but at the time
Java was designed, nobody had any idea how to use them properly. Python 2 gets
MI right (though Py3 fucks it up a bit with magical super()), and Haskell
provides a clean implementation of overloading, but it'll be years before
these improvements filter through to Java (if they ever do).
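
(To make the MI point concrete, a minimal Python 3 sketch with invented class
names, using the zero-argument super() mentioned above: the C3 method
resolution order linearizes the diamond and each super() call moves one step
along it.)

    class Base:
        def describe(self):
            return ["Base"]

    class Logging(Base):
        def describe(self):
            return ["Logging"] + super().describe()

    class Caching(Base):
        def describe(self):
            return ["Caching"] + super().describe()

    class Service(Logging, Caching):
        def describe(self):
            return ["Service"] + super().describe()

    # MRO: Service -> Logging -> Caching -> Base -> object
    print(Service().describe())   # ['Service', 'Logging', 'Caching', 'Base']
    print([c.__name__ for c in Service.__mro__])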

~~~
cabalamat
> _Multiple inheritance and operator overloading aren't bad, but at the time
> Java was designed, nobody had any idea how to use them properly._

Really? I was coding C++ using multiple inheritance and operator overloading
around then, and AFAICT was using them properly.

~~~
jmillikin
Sorry; I meant language designers didn't know how to use them. C++ is a
perfect example of operator overloading gone horribly wrong.

------
Zarkonnen
The problem with saying "you can do things in whatever way you want in Ruby"
is that the moment you have more than one developer it collides rather
drastically with the "principle of least astonishment". This is not to say
that making everything out of eg Java boilerplate is the solution. The key is
writing well-behaved and consistent code, which you can do in most languages.

~~~
raganwald
The argument I take is that while you can write well-behaved and consistent
code in any language, there is no language that forces you to do so. So
picking a language on the basis that it will force/encourage people to write
well-behaved and consistent code--even if they don't want to/don't know how
to/don't give a toss because they're a cheap contractor six time zones away--
is a broken choice.

~~~
barrkel
Programming languages are UIs. Your assertion is tantamount to saying that
there is no such thing as a bad UI; that a choice between UIs is a broken
choice, because you can still get your job done no matter what UI is there, or
similarly you can still do a crappy job no matter what UI you use.

But this is just wrong-headed thinking, to my mind. UI definitely influences
user behaviour. One thing currently in vogue is game mechanics; the whole
point of using game mechanics in your UI is to encourage certain kinds of
behaviours, and it works, dagnabbit! Most UIs that have any coherence to them
at all have a way of working with their grain, and many more ways of working
against their grain. If they're good UIs, working with the grain ought to
reduce your non-essential work load and make working a pleasure. If they're
poor, you'll have to work against the grain, fighting the design, to get your
job done.

Programming languages are directly analogous to this.

~~~
raganwald
I agree with everything you're saying about programming languages having a
grain, and of making some things easy and other things hard. That's the
general case.

However, the two specifics we're talking about are well-behaved code and
consistent code. Most languages, in theory, make it ridiculously easy to re-use
code, to make code DRY. Languages like Ruby actually give you more tools for
making code consistent than languages like Java.

Yet, there is a lot of awful Ruby code and some elegant Java code. I've seen
some amazing assembler code. It seems that languages somehow just don't have a
lot of influence over what we might call architecture.

Perhaps the issue is that languages are hand tools, but programs are
cathedrals.

p.s. If I had to bet, I'd say there is more of a correlation between languages
and architecture than a causal relationship. Paul wrote about this some time
ago: <http://www.paulgraham.com/pypar.html>

~~~
chromatic
_It seems that languages somehow just don't have a lot of influence over what
we might call architecture._

My instinct suggests instead that languages differ as to the standard
deviation of quality of code written in them.

------
InclinedPlane
I think the best comment on this comes from Glenn Vanderburg:

<http://www.vanderburg.org/Blog/Software/Development/sharp_and_blunt.rdoc>

The money quote here is this:

 _"Weak developers will move heaven and earth to do the wrong thing. You can’t
limit the damage they do by locking up the sharp tools. They’ll just swing the
blunt tools harder."_

Which is so very, very true in my experience.

Designing a language around the idea of protecting weak developers from bad
choices is a recipe for failure and mediocrity. Instead, look toward designing
in a way that guides experienced or at least thoughtful developers toward
greater success.

tl;dr: Don't make bumper cars for people who can't be trusted to drive; make
Nomex suits and roll cages for race car drivers.

------
jemfinch
It's not about being patronizing. It's about recognizing our human
limitations: "As a slow-witted human being I have a very small head and I had
better learn to live with it and to respect my limitations and give them full
credit, rather than try to ignore them, for the latter vain effort will be
punished by failure." (Dijkstra, EWD249)

~~~
pmccool
I understood Dijkstra to be arguing for simple languages, being more
susceptible to formal proof, easier to understand, etc.

Languages like, say, C# are complicated _and_ patronising. That's the sort of
language I thought the article was comparing Ruby with.

~~~
jemfinch
> I understood Dijkstra to be arguing for simple languages, being more
> susceptible to formal proof, easier to understand, etc.

Yes, he was. In a way, though, isn't Dijkstra's sort of "designing for
simplicity" and even the goal of making a language "easier to understand" the
very definition of "patronizing"?

~~~
pmccool
I don't think so. For one thing, he wasn't just driving at "easier to
understand", he was also after "easier to formally prove correct", which isn't
at all patronising.

I also don't find a goal of "easier to reason about" particularly patronising;
it makes the rather un-patronising assumption that the target audience is
willing and able to reason about the matter at hand.

The language features I find patronising are the ones that make it hard to do
things a certain way, regardless of how carefully considered my reasons are.

~~~
jemfinch
> The language features I find patronising are the ones that make it hard to
> do things a certain way, regardless of how carefully considered my reasons
> are.

Any in particular that you have in mind? Historically, many people have had
very carefully considered reasons for using features like goto, type puns,
etc. but many languages exclude them as problematic. Is it patronizing to
exclude such features even from the toolbox of programmers who carefully
consider their use?

~~~
pmccool
My current least favourite patronising language feature: in C#, you can only
override a method if the base class lets you.

In answer to your second question, I think it depends. Including gotos in a
language is problematic for reasons other than the fact that programmers tend
to misuse them. I think it comes down to the rationale. With the C# feature I
mentioned, the stated reason was, inter alia, that programmers tended to
misuse inheritance. I found that patronising. That leaving out goto makes it
easier to formally prove programs correct, I do not. Leaving out goto because
programmers tend to misuse it, that I would find patronising.

~~~
jemfinch
I apologize for not having the time to make this comment appear less like
interrogation and more like a discussion ("if I had time, I would have written
a shorter letter"), but I'm in a hurry :) Rest assured, I'm not arguing with
you, but genuinely interested in your responses.

> My current least favourite patronising language feature: in C#, you can only
> override a method if the base class lets you.

My recollection from the design of Java is that such "finality" is necessary
to allow the type system to reject code that attempts to override security-
sensitive methods: is that not still true of C#?

> I think it comes down to the rationale. With the C# feature I mentioned, the
> stated reason was, inter alia, that programmers tended to misuse
> inheritance. I found that patronising.

What of a language that eschews inheritance entirely, both because it makes
reasoning about software more difficult and because the vast majority of
programmers cannot use it correctly?

"Patronizing" seems to indicate that a designer considered himself smart
enough to use a feature, but determined his "subjects" were not. What of
designers who recognize their own limitations and remove features they know to
be error prone in their own practice of programming? Is that still
patronizing, or does it deserve a different descriptor?

~~~
pmccool
> My recollection from the design of Java is that such "finality" is necessary
> to allow the type system to reject code that attempts to override security-
> sensitive methods: is that not still true of C#?

That's still true. The "feature" I has in mind is that methods in C# are not
virtual by default. This is in direct contrast to Java where they are. This
interview has a discussion on the motivation behind this (IMO incredibly
tiresome) feature: <http://www.artima.com/intv/nonvirtual.html>.

My view is that the real issue is that inheritance is all too often used when
composition is more appropriate, a problem this change does nothing to
address.

> What of a language that eschews inheritance entirely, both because it makes
> reasoning about software more difficult and because the vast majority of
> programmers cannot use it correctly?

If it only makes the language better for "bad" programmers, I would consider
it patronising. It also comes down to the intent of the language designer, by
definition.

> "Patronizing" seems to indicate that a designer considered himself smart
> enough to use a feature, but determined his "subjects" were not. What of
> designers who recognize their own limitations and remove features they know
> to be error prone in their own practice of programming? Is that still
> patronizing, or does it deserve a different descriptor?

I think that's worthy of a different descriptor. A failure to anticipate the
needs of programmers who are smarter or more disciplined or whatever - which
is what such an omission seems like to me - may be a problem, but I don't see
it as patronising.

------
jgg
I know everyone's read it 100 times already, but:

 _Like the creators of sitcoms or junk food or package tours, Java's designers
were consciously designing a product for people not as smart as them.
Historically, languages designed for other people to use have been bad: Cobol,
PL/I, Pascal, Ada, C++. The good languages have been those that were designed
for their own creators: C, Perl, Smalltalk, Lisp._

From here: <http://www.paulgraham.com/javacover.html>

~~~
kenjackson
Fun quote, but factually incorrect. For example, who was the first user of
C++? Bjarne himself. He needed the power of Simula, but at the time Simula
implementations didn't scale. He took the ideas he found useful in the
language and put some into C, as he was building large scale C/BCPL
applications.

To me that's the definition of building a language for yourself. You actually
write a language that you need to solve a specific problem you have.

And I find his categorization of good vs bad languages somewhat absurd. What
actually makes Ada, C++, and Pascal bad? His lack of understanding? What makes
C, Perl, and Lisp good? The inverse?

While I have my own personal preferences with respect to languages, I fully
believe a large part of it is familiarity. I've NEVER met anyone who could
argue why a particular language was truly bad. They usually are just
passionately arguing religion, and I find that tiring. It was cute when I was
in high school, but those debates are getting tired.

~~~
jemfinch
Also, according to Kay's "The Early History of Smalltalk" [1], Smalltalk was
very much designed with end users in mind.

[1] <http://gagne.homedns.org/~tgagne/contrib/EarlyHistoryST.html>

------
chipsy
You have to patronize at least a little bit to write something that is more
than a glorified assembly language - otherwise you have no basis to build your
other abstractions on.

With C, for example, there's a predefined model for the call stack based around
having a fixed set of arguments for function calls and single return values.

Forth, on the other hand, treats words as simple nested subroutines, keeps
data on a global stack, and lets the data carry over from one word to the
next. No explicit arguments or return values are necessary.

C can be characterized as "safe," while Forth is "flexible," depending on your
use case, but in terms of pure performance both models have strengths and
weaknesses.

