

Ending the Era of Patronizing Language Design - fogus
http://blog.objectmentor.com/articles/2009/07/13/ending-the-era-of-patronizing-language-design

======
snprbob86
Our team develops a .NET class library. As a game platform, we have the luxury
of breaking binary and source compatibility between releases because games are
atypically short-lived applications. We provide side-by-side frameworks on
both Windows and Xbox, so you don't need to upgrade to the latest framework
before you release your game. But that won't stop people from doing it and
still complaining.

When Microsoft releases a new version of a library, some subset of users
upgrade instantly. They expect all of their code to keep working. When we
break source compatibility (something most teams NEVER do), we do it very
carefully. We only break things where there is a clear, large win and the
upgrade path is straightforward. Customers have complained every single time
we've done this, because it is not what they inherently expect.

So now back to the original topic at hand: the "After all, we're all
consenting adults here." mentality. Yeah, that's great and all, but it
requires an incredible amount of discipline. See, for example, how the Django
project handles this:
<http://code.djangoproject.com/wiki/BackwardsIncompatibleChanges>.
Personally, I love doing things this way. However, consider a large non-
software company. They created some internal app and ran into a small bug or
shortcoming of a class library. They find out that the new version fixes their
problem, so they upgrade. BAM. 10 other things break. Uh oh, that's not good.
They already laid off the dev team that made this program; they have one
vendor making a couple quick fixes to handle their new 2010 business rules.
"This used to work, why are you breaking me? I just want that one new fix!"

In order to prevent situations like this, you need to carefully monitor your
public surface area. This is why the _foo naming convention exists in Python.
It lets you say "We explicitly do not promise not to change this." Believe me,
if you make a type public or a method virtual, someone will instantiate it or
override it. I've seen it many times. If you didn't explicitly plan for them
to do that, you've still got a backwards compatibility bug should you change
it. Even if you make it private, someone might call it with reflection and
STILL get annoyed when you break them, although this is far rarer.
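As a sketch of that convention (the class here is hypothetical, just for illustration): a leading underscore marks a member as an implementation detail, but nothing in the language actually enforces it.

```python
class Cache:
    """Public surface: put() and get(). Underscore-prefixed names are
    implementation details with no compatibility promise."""

    def __init__(self):
        self._store = {}  # private by convention only

    def put(self, key, value):
        self._store[key] = value

    def get(self, key, default=None):
        return self._store.get(key, default)

# Nothing stops a caller from reaching inside anyway -- the underscore
# just signals "we explicitly do not promise not to change this":
c = Cache()
c.put("answer", 42)
print(c._store)  # works today, may break on any upgrade
```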

So in summary: this cultural distinction is a meaningful difference in
requirements between open source developers and commercial developers. The
respective languages have evolved to meet the needs of their respective
customers. One is not necessarily better than the other.

This is why C# 3 and Python are tied for my favorite language :-)

------
prodigal_erik
There's value in being able to tell the system "I do not want to do X. If I
seem to be doing X, it's a mistake, so please stop me." People really do make
mistakes, and there's nothing condescending about building a system to
accommodate that. That's why my car's steering wheel has an airbag rather than
a punji stick.

~~~
timwiseman
You are right and I agree with you, but I believe the point of the blog post
was about the language designer saying "You cannot do this because it is too
dangerous."

That is different from an application designer saying "Within the confines of
this application, I will never do this, so flag and prevent it if it seems
like it is happening."

~~~
Retric
I don't think that's really valid. How many recent languages expose raw memory
access? The trick to building a great computer system / language is exposing
the aspects that are most useful while hiding everything else.

~~~
bodhi
But the aspects that are most useful to you are _different_ to the aspects
that are most useful to me. Recently I've been trying to write a wrapper for a
C library in a couple of different languages, and raw memory access would have
made my life much easier. But the languages didn't have it, making my work
more difficult.

I think the (paraphrased) aphorism "Make the common simple; make the hard
possible" is apt.

~~~
Retric
_the aspects that are most useful to you are different to the aspects that are
most useful to me._ Which is why we have so many languages. Also, we don't
really need another C or Perl at this point, but new domains do show up, such
as GPUs with insane amounts of parallelism.

------
lallysingh
"The fact of the matter is this: it is possible to create a mess in every
language. Language designers can’t prevent it."

I hate stuff like this. It's not a bool, it's a float. Language features (e.g.
type checking) help eliminate some errors, reduce the likelihood/frequency of
more, and reduce the cost of finding others. Coercing reduction factors into
absolutes destroys the conversation for a cheap debate point.

------
noamsml
> C++ avoided reflection

Not really; its architecture simply didn't support reflection. C++ actually
lets you do almost anything imaginable and then some.

Overall, though, I agree with the premise in part. Certain languages (Java,
C#, and to a much lesser extent Python[1]) try to remove features they
consider "evil", rather than give the developer all the rope to tie themselves
up with. What they fail to account for is that we're _developers_; we should
be competent enough to know the dangers of certain features and competent
enough to avoid them when they're not needed.

Languages that let you do whatever the hell you want aren't a new thing,
though. C, Lisp, etc. have existed for a while. If anything, it's the idea of
patronizing programmers that's new.

[1] Guido van Rossum has said many times that there should be only one way to
do anything, and I think you see that with the crippling of lambda in Python
(Guido favors named subfunctions). However, Python still has many "dangerous"
features if you need them.

~~~
blasdel
lambda being crippled in Python is the same story as with C++ and reflection.

It's pretty simple: Python's syntax doesn't let you have a statement inside an
expression, and a lambda body is an expression. Even if it did, handling
indented blocks inside one would be a nightmare!
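A small sketch of the restriction (function names here are invented): a lambda body must be a single expression, so anything involving statements has to become a named function, which is exactly the style Guido favors.

```python
# A lambda body is one expression -- no statements allowed inside it.
double = lambda x: x * 2

# The statement-based equivalent must be a named function:
def double_and_report(x):
    total = x * 2              # assignment is a statement...
    print("doubled:", total)   # ...used here alongside an effectful call
    return total               # ...and so is return
```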

~~~
noamsml
I see your point, and it's a valid one, but then again, Boo managed to work
around it.

------
scott_s
C++ does not omit features because its designers were worried that programmers
would misuse them. This is, after all, a language that has multiple
inheritance, allows arbitrary casts, and lets programmers grab a pointer to
any arbitrary piece of memory. Stroustrup has said in many places that, in the
end, he trusts the programmer and is in favor of including useful features
despite their danger.

Full reflection doesn't exist in C++, I assume, because of the runtime
overhead costs. The amount of reflection that is present through Run-Time Type
Information (RTTI) does add overhead, so it's only compiled in when
programmers use those features.

Further, I'm not a Ruby programmer, but from reading HN I know there's
significant discussion in the Ruby community about monkey patching arbitrary
classes. The consensus I've seen is that, in general, it's a bad idea. This is
the same kind of cultural, proscriptive advice the author says is not
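The pattern under debate is easy to illustrate in Python, which permits the same trick as Ruby's open classes (the class and methods below are made up for the sketch):

```python
# Monkey patching: rebinding a method on an existing class at runtime.
class Greeter:
    def greet(self):
        return "hello"

def shouty_greet(self):
    return "HELLO"

g = Greeter()
assert g.greet() == "hello"

# The patch changes behavior for every instance, everywhere -- even
# objects created before the patch -- which is why it worries people.
Greeter.greet = shouty_greet
assert g.greet() == "HELLO"
```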

~~~
jerf
Don't mistake HN "consensus" for Ruby community consensus. I've been very
loud on HN about how monkeypatching is a bad idea, but I am not a member of
the Ruby community. (I got my burned-by-monkeypatching credentials from
Javascript.)

I never was really in the market for Ruby since I know Python very well, but
certainly Ruby's love affair with monkeypatching would be sufficient by itself
to turn me away.

~~~
DanielBMarkham
_I got my burned-by-monkeypatching credentials from Javascript_

Ouch. I cringe just thinking about it.

(In Javascript you can do some truly ugly and unnatural things mixing up what
people expect and what they get.)

------
zmimon
This way of thinking about language constraints misses a key point: language
constraints aren't there just to bug the programmer into behaving better. They
also provide a way of reasoning about the program to a person trying to
understand it. The guarantees that exist in, say, Java code mean that a new
person can say with much better precision what any given piece of code does
(not what it's meant to do, but what it will actually do). In a dynamic
language, it might do anything.

I experience this exactly in the dual Groovy-Java universe I inhabit these
days. I am faster at writing the Groovy code, and I think the end result looks
nicer and smaller. But when there's some strange problem I don't understand
that needs to be debugged, the Java code is much, much easier to reason about.
Simple questions like "how is this function possibly getting called" can be
answered definitively in a fraction of a second with a key-press in Eclipse
for Java, while I have to run a virtual simulation of the program in my head
to figure out what the Groovy code might do.
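That find-callers pain isn't specific to Groovy; a tiny Python sketch (all names invented) shows why static tooling can't always answer the question in a dynamic language:

```python
class Handler:
    def on_save(self):
        return "saved"

def dispatch(obj, event):
    # The method name is assembled at runtime, so a static "find callers"
    # tool sees no textual reference to on_save anywhere in the program.
    return getattr(obj, "on_" + event)()

print(dispatch(Handler(), "save"))  # the only way to be sure: run it
```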

------
caffeine
Doh ... "the newer breed of languages". Like Lisp?

~~~
jimbokun
Lisp has been one of "the newer breed of languages" for over 50 years now, and
that run does not seem likely to end anytime soon.

~~~
prodigal_erik
"Lisp doesn't look any deader than usual to me." \- David Thornley

~~~
Hexstream
"Lisp is not dead, it just smells funny."

