
Why MIT uses Python instead of Scheme for its undergraduate CS program (2009) - bb88
https://cemerick.com/2009/03/24/why-mit-now-uses-python-instead-of-scheme-for-its-undergraduate-cs-program/
======
ng12
> However, he did say that starting off with python makes an undergraduate’s
> initial experiences maximally productive in the current environment.

I read this as "the parents paying tuition wanted their kids to learn
practical skills so they can make money".

I learned Scheme as a first language and I will be forever grateful for it.

~~~
curun1r
Same here. When I learned programming at UC Berkeley in the 90s, Scheme was
the intro language. It was perfect precisely because it was so far from being
the kind of language you'd use to engineer large projects. The first day, the
professor spent less than half of the period introducing the entire syntax of
the language...it's so damn simple. Beyond that, all learning was abstraction,
problem solving and CS concepts. It was learning how to take a very limited
set of primitives and construct a larger whole.

I pity the MIT students who will waste so much of their intro course learning
a vastly more complex language. So many students will struggle with
Python-specific topics that are irrelevant to learning the act of coding, and
may decide that CS and programming just aren't for them.
Scheme is such a perfect language for this task because it puts so little
between the student and the crucial experimentation phase that can instill a
fascination/curiosity with programming. And now MIT students will miss out on
that.

~~~
habitue
We also live in an era where it's basically impossible that kids getting into
MIT's CS program haven't been exposed to programming before. I think it's a
romantic idea, but the fact is college isn't where you learn to program; it's
where you learn the time complexity of B-trees.

~~~
Zanni
This is a really significant point. I mean, I love SICP. I think Scheme is
elegant as hell. But I learned to program with Basic in high school, and SICP
in Scheme would have _seemed like_ a huge step backward. I know Fortran did,
which was my first college-level experience. I can appreciate Scheme now, but
I don't think I would have then. And I can only imagine how "kids today", who
have grown up on real languages, would feel ("real" in contrast to Basic, not
Scheme).

------
blhack
I couldn't be more thankful that python was one of my first languages, and
this is something that I've passed on to everybody I've ever mentored.

Python is a real language that does real things, but it's also completely
approachable to a newbie.

If programming seems hard, then you have a bad teacher that is probably just
trying to evangelize their hobby to you.

And god help anybody that tries to start with JavaScript.

~~~
_hardwaregeek
I don't get why people hate on starting with JavaScript. Since ES6, JS strikes
a really nice balance between practical usage and theoretical value. Like
Scheme it's a dynamically typed language focused around a single data
structure (list for Scheme, object for JS), with first class functions. Sure,
it has weak typing and there's some scoping complexity. But, that's a
completely reasonable tradeoff for being one of the most useful languages in
the entire world. Not to mention, JS's weird casting issues are not as common
as people make them out to be and const/let solves a lot of the scoping
problems.

~~~
gronne
I disagree completely - javascript encourages shitty programming by design and
by culture. If you don't expose the young, chances are they won't get infected.

~~~
halfastack
Because all of those who started on something like C, C++, or Java have way
better programming practices? My experience tells me that starting language is
literally not a vector in determining how good of a programmer someone is.

~~~
gronne
I agree. Maybe it's the barrier to entry for publishing that has gotten a lot
lower, thus pushing down the quality of public code (which is natural). Or
maybe I'm just worried about everything being pushed to the browser.

------
_chris_
This was 10 years ago; I'd really like to hear a post-mortem of this decision.

For context, MIT's previous EECS curriculum began with 4 required courses for
all EE and CS majors. This meant that EEs had to suffer through Scheme, and
CS'ers had to suffer through circuits. I'm sure everyone has their own opinion
on whether this is a good thing or a bad thing.

Regardless, the EECS department switched to a new 7(?) "core" class format in
which students choose 4(?), giving students the freedom to take more relevant
courses to their interests/major. In this switch, 6.001 (scheme) was out and
replaced with a more hands-on robotics-based Python lab class.

I can't really say which approach is better, but I do think there's an
important place for the more pure "thinking" computer science classes too,
even if it's not the first class taught to freshmen.

~~~
brians
I hire people coming from this program. I can no longer rely on students
understanding control systems, having read Leveson's Therac-25 paper, or
having worked with a notebook-style REPL (i.e., something stressful to model
exactly in your head, promoting confusion, leading to notes and comments and
structure).

The near-total demise of Athena matters here too. I can and do hire people
with all those skills! It’s still very possible to get them at MIT. It’s just
not nearly the default, and since it’s way up in electives I’m at least as
likely to find the best people carrying a degree that says 21W as VI-3.

I don’t envy the VI admins. They have a problem akin to that of a suddenly
virally successful startup. 400 people a year come in; about that many better
succeed.

------
alexashka
As someone who read SICP when first starting out, then re-read it and
re-watched the lectures much, much later - I can say that 95% of the value of
SICP flew completely over my head when I was first learning.

I remembered catch phrases like 'assignment is bad', only to be led into the
'functional programming' cult years later because it reminded me of SICP.

Building up entire systems from first principles is quite beautiful and
illuminating - once you've used existing systems and begun to wonder how they
came to be.

For most people - I don't think they're really that curious. University is not
structured to give you the time to go deep or be curious - you're busy doing
assignments, readings and studying for tests. SICP is a gem for those who are
lucky to have a few months to really go down the rabbit hole. For everyone
else, it'll likely feel like needless torture or 'wow, that's cool and I don't
have time to explore it any further' at best.

~~~
white-flame
> _University is not structured to give you the time to go deep or be curious
> - you 're busy doing assignments, readings and studying for tests._

That statement, while true, should anguish the heart of anybody in any way
related to or concerned for academia.

~~~
bb88
Undergraduate coursework has always been that way. Especially in any truly
competitive environment.

You do, however, get that time when you write a master's thesis.

------
ggm
I think scheme was better educationally but harder pedagogy. It requires a
french metric tonne of time, contact time, and patience to get some ideas
through. I speak as one who did FORTRAN 74 first and tried to learn LISP
self-taught. Talking to others who did LISP first, what they did is
recapitulate the instantiation of fundamental data structures and algorithm
concepts, while we just made 10x over-engineered battering rams out of rusty
iron bars.

Python 3 (which I live in, btw) doesn't exactly exude its functional elements:
lambda is built in but stays deliberately simple, and reduce has to be
imported from functools, so recursive and functional solutions don't lend
themselves to it syntactically, inherently in the language.

OTOH those battering rams come in handy for beating simple problems into
submission.
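To make the point about Python's functional elements concrete, here is a small
standard-library-only sketch: the classic fold has to be imported from
functools, and deep recursion runs into the interpreter's default frame limit.

```python
from functools import reduce  # the classic fold is not a builtin
import sys

# lambda is built in, but folding still needs the functools import.
total = reduce(lambda acc, x: acc + x, range(1, 101))
print(total)  # 5050

# Naive recursion is fine for small inputs...
def fact(n):
    return 1 if n == 0 else n * fact(n - 1)

print(fact(10))  # 3628800

# ...but the default recursion limit (typically 1000 frames) makes
# deep recursion impractical without rewriting it as a loop.
print(sys.getrecursionlimit())
```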

~~~
heyjudy
Yes, exactly.

I think there are basically only three ways to teach EECS: stack top-down,
bottom-up, or as-needed/random.

If students learned bottom-up, starting at silicon, they'd have a firm grasp
of, and be more mindful of, practical system capabilities. Python and such
would either be a starting crutch or learned last.

Just like Scheme, a learning language, we had a learning OS: MINIX 2. Such
constrained environments are easier to teach with but less relevant in the
real world; the valuable part is that the mastery of concepts is portable
across technological fashions... it instills a confidence that playing with
the Linux or FreeBSD kernel doesn't, because those are so overwhelming to all
novices, rather unlike a minimal working example.

Furthermore, because of the bifurcations in tech, fewer people have even seen
a server, much less racked up a datacenter floor, or understand assembly or C.
This is concerning if academia is mostly producing CS students narrowly
focused on web FE/BE.

------
zmmmmm
I feel like Python is too half baked to be a really good language to teach
fundamentals. By "half baked" I mean - sort of supports functional
programming, but not really. Sort of supports object oriented design, but not
really. Sort of allows type system theory to be taught, but not really. So
many sort ofs. It's a great language if you just want to get someone cranking
out some code. But if I wanted a language to be a base from which I could
explore all the corners of computing, I'd think something else might make more
sense (and having written that, I can't really think of a good one - Scala in
theory, but in practice it's awfully complex to get started in).

~~~
BerislavLopac
Wait, don't all these "sort ofs" make it actually an ideal tool for teaching
the basics? I mean, we're talking about basics, so what you're saying is that
you can teach the basics of a number of different paradigms with one language.

~~~
craigsmansion
> what you're saying is that you can teach the basics of a number of different
> paradigms with one language.

I don't know about Python, but that is one of the delights of SICP and scheme
in general.

It's not that there isn't an OO implementation in scheme; it's that there are
too many. Implementing an OO system is a medium-advanced exercise left to the
reader. As a result there are many available, to the point one wonders if "OO"
can rightfully be called "a paradigm" at all.

Constraint-based/logic programming? I think it's a chapter in SICP, and
covered by one of the SICP video lectures, which shows how to build a simple
Prolog-like language to illustrate some other concept.

Type systems? This one is a little funny. There's a SICP lecture on it (the
recordings are from the 80s), and I thought the name was an unfortunate
choice, since "type systems" are a real and important concept now in modern
computer languages and the naming would clash; but no, it actually was an
implementation of a type system as we now use the term.

I think one of the reasons scheme doesn't have "a lot of stuff" is because
most stuff, if well understood, is trivial. If it doesn't seem trivial, one
simply doesn't have a deep enough understanding of the concepts yet.

One of the things about using scheme effectively as a teaching language is
that it puts the wizardry in the actual wizard, not in the magic box with the
blinkenlights.
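That "wizardry in the wizard" idea is easy to demonstrate: SICP builds
"objects" out of nothing but closures and message dispatch. Here is a sketch
of its make-account example from section 3.1, transliterated into Python
rather than Scheme for readability here.

```python
def make_account(balance):
    """A SICP-style 'object': state lives in the closure,
    behaviour is selected by message."""
    def withdraw(amount):
        nonlocal balance
        if amount > balance:
            return "insufficient funds"
        balance -= amount
        return balance

    def deposit(amount):
        nonlocal balance
        balance += amount
        return balance

    def dispatch(message):
        # The "object" is just a procedure that routes messages.
        if message == "withdraw":
            return withdraw
        if message == "deposit":
            return deposit
        raise ValueError(f"unknown message: {message}")

    return dispatch

acct = make_account(100)
print(acct("withdraw")(30))  # 70
print(acct("deposit")(10))   # 80
```

No class keyword, no object system - just lexical scope, which is exactly the
point the book is making.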

------
jeromebaek
Can someone from MIT confirm that only Python is taught in the MIT intro to cs
class? We also teach SICP as the intro to cs class at Berkeley (cs61a.org),
and we start with Python, but then change to scheme by the latter half of the
semester.

~~~
Ar-Curunir
AFAIK the switch to Scheme is not complete, no? You're still writing Python
for the most part.

------
jancsika
I like to imagine Lazy University, where the committee meets and immediately
standardizes on somebody's offhand remark that the intro course should use
Javascript because it's probably running on every device the students are
using.

Or, replace Javascript with any other high level language likely to be running
on every smartphone. If there are competing offhanded suggestions then bonus
lazy points for using a random number generator to decide the winner.

Only when the committee meeting is over do Lazy U. profs begin planning how to
teach their classes. And there's more time for that since the Lazy U.
committee meeting was an order of magnitude shorter than most. (Although the
laziness would probably be recursive and result in the elimination of the
lecture, lab, geographic proximity to students, and assignments that can be
distinguished from publicly accessible repositories full of code and
research.)

------
makecheck
I love Python for a lot of things but I’ve come to realize how complex it can
really be.

I regularly see others write Python code that I have to fix to add robustness.
I also run "pylint" or equivalent tools before committing major changes to
scripts because it is quite easy to make mistakes that won’t otherwise be
found right away.

Some of my least favorite pitfalls:

\- If you call a function that _happens_ to refer to a variable that only
exists in the calling code, it will _work_ (and yes, since the same function
can be called from more than one place, “caller” is not always the same so the
effect is not always the same). Bonus points if it only matches the caller
because of a typo. Any convenience afforded by this behavior is not worth the
potential for head-scratching and painful debugging.

\- Python has things that “seem” like the obvious/right thing to do when they
are not. One great example is "except:", which to this day I have to keep
correcting in various scripts to prevent important errors from being
completely lost. Another one is expecting to write the entire script at column
0, when people are supposed to know 'if __name__ == "__main__"'. Also, a
type’s apparent shared/unique status is not always clear so novices can get a
_long_ way with sort-of working code until they are completely confused by the
broken cases (e.g. you can simply list and “initialize” class fields with
trivial string and number values _until_ one day you add something like a
"dict()" and everything breaks for that _one_ field because all your class
instances are mysteriously sharing it; suddenly you need to define a whole new
"__init__()" to fix things when it wasn’t apparently needed previously).

\- It is not always obvious to maintainers of functions that keyword arguments
are extremely fragile and that they can overlap with other arguments. If an
argument is renamed or relocated for example, stuff elsewhere can break in
stupid ways. Tiny code changes can create big headaches.
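The shared/unique pitfall above, in miniature (Tag and FixedTag are
hypothetical names; only the pattern matters):

```python
class Tag:
    name = "untitled"   # immutable class field: rebinding is harmless
    attrs = {}          # mutable class field: ONE dict shared by all instances

a, b = Tag(), Tag()
a.name = "first"          # creates an instance attribute on a only
a.attrs["color"] = "red"  # mutates the single class-level dict in place

print(b.name)   # untitled  (unaffected, as expected)
print(b.attrs)  # {'color': 'red'}  (surprise: b shares a's dict)

# The fix is the __init__ that "wasn't apparently needed previously":
class FixedTag:
    name = "untitled"

    def __init__(self):
        self.attrs = {}   # each instance now gets its own dict
```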

~~~
kbp
> If you call a function that happens to refer to a variable that only exists
> in the calling code, it will work

Could you give an example of what you mean by this?

~~~
Cu3PO42
Not the OP, but I think what they are referring to is the following:

    def foo():
        return bar

    if ...:
        bar = 42
        print(foo())
    else:
        print(foo())

If ... evaluates to a truthy value, bar will be defined and foo() will return
42; if it is falsy, bar will not be defined and the code will throw a
NameError.

I would agree that this is extremely confusing behaviour. Of course people
wouldn't write code like this (hopefully), but something similar might occur
in a much more obscure setting.

------
harrisonjackson
One of the best electives I took in my CSE program was Art in Code. We covered
Processing and Arduino and the only required textbook was Making Things Talk.
[https://www.amazon.com/Making-Things-Talk-Practical-
Connecti...](https://www.amazon.com/Making-Things-Talk-Practical-
Connecting/dp/0596510519)

These tools were designed to be accessible to non-programmers and had our
entire cross-disciplinary class making games, interacting with hardware, and
getting inspired to find new ways to interact with computers. I wish this
class had been required by every first-year CSE student.

------
b3b0p
Where I went to school, they didn't teach programming at all. First day, first
class, if I remember correctly, the teacher walks up in front of the class and
says something like, "Computer science is the design and implementation of
algorithms..." and finishes by saying they won't teach us any programming. The
assignments had to be implemented in C and run on campus computers. There were
optional, zero-credit labs, held by the TAs, to get help learning to program
or to get help with homework.

I went to one lab, the lab computers and all campus computers were running
Slackware with Afterstep. I was a Window Maker and Red Hat user at the time, I
was impressed and thought it was neat.

Of course this is from memory and 18 years ago almost to the day, but that's
the gist of it. Anyway, I felt like I learned a ton and loved every bit of it.
I went to a few of the CS club meetings and got the impression basically
everyone had already had internships with Intel, IBM, or Microsoft. This was
my freshman year, and I was basically in awe.

------
qwerty456127
> Then, what generaly happened was a programmer would think for a really long
> time, and then write just a little bit of code, and in practical terms,
> programming involved assembling many very small pieces

Oh how I bloody want to do just this today. I want my program to be split into
parts as small as possible, every part (i.e. every function) being defined in
a separate file. But Python encourages the direct opposite - writing long
files of code to avoid introducing too many "modules" - and this is a
debilitating headache.

> At some point along the way (he may have referred to the 1990’s
> specifically), the systems that were being built and the libraries and
> components that one had available to build systems were so large, that it
> was impossible for any one programmer to be aware of all of the individual
> pieces, never mind understand them.

Thanks to my ADHD it is very hard for me to fit anything that doesn't fit in
half a screen into my mind. The longer a module grows, the lower my
productivity falls.

~~~
jasonpeacock
> But Python encourages the direct opposite - writing long files of code to
> avoid introducing too many "modules" and this is a debilitating headache.

How does Python encourage mega-files? AFAIK there's nothing that prevents nor
discourages proper organization of code into files.

I frequently use multiple files to organize my library implementations.

~~~
qwerty456127
Just some clues: a class definition cannot be split across many files (e.g. in
C# you can do this easily with a "partial class" declaration, and that's
amazing; needless to say, you can, and almost always do, split one module (a
namespace) into many files there). Whatever you put in a separate file you
will have to import manually everywhere you use it, and you have to worry
about circular imports. You have to invent names all the time; a package, a
module and a function having the same name smells like a headache. The C#
namespace definition system could be made even better than it already is
(i.e. by allowing class-free functions directly inside namespaces), but it
already feels so much better than Python's "module = file" model.

~~~
joshuamorton
>A class definition cannot be split across many files (e.g. in C# you can do
this easily with a "partial class" declaration, and that's amazing; needless
to say, you can and almost always do split one module (a namespace) into many
files there).

I think most would consider this a good thing. WYSIWYG. If a class is so
gargantuan as to require multiple files, it should probably be multiple
classes that communicate via a well defined, public API, not implicitly.

>You have to invent names all the time; a package, a module and a function
having the same name smells like a headache.

This is only an issue if you are importing *. If you import in a namespace
(`from foo import module`, `from bar import other_module`), you can have
`module.module`, `other_module.module` and anything else all living in
harmony. If you work in python as python intends and not C/C++ (namespaced
imports, not text-prepended ones), it's much, much cleaner.

It's also possible to, generally speaking, have private submodules within one
larger module. You can do this with `__init__.py` by importing the various
submodules, explicitly re-exporting the things you wish to be top-level
public, and setting `__all__` to include them. This is why I can do something
like `import numpy as np; np.ndarray` even though ndarray is defined in an
extension module referenced by `numpy.core.multiarray`.

See multiarray
([https://github.com/numpy/numpy/blob/master/numpy/core/multia...](https://github.com/numpy/numpy/blob/master/numpy/core/multiarray.py)),
which pulls things from `numpy.core._multiarray_umath`, then exported by core
via `numeric`
([https://github.com/numpy/numpy/blob/master/numpy/core/__init...](https://github.com/numpy/numpy/blob/master/numpy/core/__init__.py#L60)),
then again by the top level __init__
([https://github.com/numpy/numpy/blob/master/numpy/__init__.py...](https://github.com/numpy/numpy/blob/master/numpy/__init__.py#L143)).

~~~
qwerty456127
> If a class is so gargantuan as to require multiple files

IMHO anything that doesn't fit in one screen is "gargantuan"

> If you import in a namespace (`from foo import module`, `from bar import
> other_module`), you can have `module.module`, `other_module.module`

Which looks ugly and feels like a nasty headache if you want to maintain an
intuitive vision of your code and dependency structure.

> It's also possible to, generally speaking, have private submodules within
> one larger module. You can do this with `__init__.py` by importing the
> various submodules, explicitly re-exporting

I know but even reading this paragraph hurts. Too much mess to manage
manually.

~~~
joshuamorton
>IMHO anything that doesn't fit in one screen is "gargantuan"

This is a bit of an odd definition, but sure (I've seen C++ files whose
imports alone were gargantuan under your definition). I just looked through a
production Python application, and there were 2 non-test classes that were
more than 100 lines long. Both were relatively verbose (well commented, using
type hints so argument lists take significant vertical space, etc.).

In general, it seems like you're limiting yourself to, with the necessary
boilerplate, one or two functions in any file that isn't "gargantuan". This
makes it needlessly difficult to understand the structure of your applications
since you are forced to split related logic up among multiple files. This,
more than screen length or the import syntax, makes it much, much harder to
"maintain intuitive vision of code and dependencies structure" as you say.

>Which looks ugly and feels a nasty headache if you want to maintain intuitive
vision of your code and dependencies structure.

Not at all. It's more explicit about the dependency structure
(`module.method(args)` at a call site gives you a much better idea of the
structure than just `method(args)`), and so makes it significantly easier to
maintain an intuitive vision of the code and dependencies.

It's incalculably easier to understand dependency structure when using the
`import module` syntax than `from module import Class` or `from module import
*`. So much so that the former allows reliable, large scale refactorings
without any runtime information. The others, as you correctly recognize, do
not.

>I know but even reading this paragraph hurts. Too much mess to manage
manually.

Then don't! You don't need to. It's really only necessary for truly
public/widely used APIs (like numpy) where understanding the internal
structure of the module is not worth it for the average user.

~~~
qwerty456127
> Then don't! You don't need to. It's really only necessary for truly
> public/widely used APIs (like numpy) where understanding the internal
> structure of the module is not worth it for the average user.

I mean for the library developer, not for the user.

------
Santosh83
I contend that, within reason, it really doesn't matter exactly which language
happens to be one's very first programming language. Programming is ideally
motivated by passion, and in that case a person is certainly not going to
give up and change career just because their intro language was a bit
inconsistent or rocky. Anyone who gives up programming before being fluent in
at least three languages is giving up prematurely.

This doesn't mean assembler or Brainfuck is okay as a first language (although
the former was the 2nd language I taught myself, and its reputation for
difficulty is vastly exaggerated in my opinion), but it does mean that endless
discussions about whether Python is better than JavaScript or C is better than
Lisp are probably splitting hairs.

~~~
taneq
Assembler is arguably a better intro language than most. It makes it
explicitly clear what each step of the program does and how it executes. Once
students have done a few projects, they'll be ready to appreciate a higher
level language like C.

Throwing objects and lambdas and type inference at kids without giving them
any grounding in what a computer actually is and does is unfair, and leaves
them building castles in the sky.

~~~
ohazi
I can't tell if this was meant to be serious or facetious, but I took this
route. I learned 8088 assembly to make an ancient robot platform move, and
then learned C a few months later.

Assembly had a few weird architecturey things that you had to understand
before things made any sense (e.g. registers, segment:offset addressing, stack
pointers, calling conventions), but beyond that, all of the tasks were simple
and self-contained.

C was a breath of fresh air after that. Things like pointers, which have a
reputation for being confusing, make perfect sense coming from assembly.

~~~
taneq
Serious. I'm not saying you should stick with assembler for any great length
of time, but understanding what instructions and registers and memory _are_ is
pretty important later on. In university I had fellow students who struggled
with pointers because even at the end of first year they didn't really
understand what memory _was_ or how it worked.

~~~
gdy
[https://stefansf.de/post/pointers-are-more-abstract-than-
you...](https://stefansf.de/post/pointers-are-more-abstract-than-you-might-
expect/)

------
xte
Bullshit translator: the mean intelligence of newcomers has become so low that
we have to water down our programs, or only veeeeery few can attend them. And
we need more students, so more money. Disagree? Say thanks to all the school
reforms of the past 30-40-50 years.

A bit extra: in today's world we can't really produce anything new due to a
managerial-driven society, and besides that, it's better if people do NOT
really understand how things work, to avoid not only embarrassments but also
the potential idea of radically wiping out certain kinds of business models.
So better to say: you can't understand the big picture! Keep playing with the
things the big and powerful give you; they came directly from nature, don't
worry about it.

Sorry to have made a few "potentially flamey comments" in a row, but the
reasons given in the article have really no other meaning to my eyes. That's
not intended to be political or flamebait; it's simply what I see. I did NOT
start with scheme; I discovered it years after, and I don't even love it too
much due to its n-th implementation incompatibilities, SRFIs and "sparse"
documentation... But when I discovered lisp-like languages in general I saw a
completely different world and really changed my mind. Guile scheme, even with
its lack of libraries etc., is now one of my preferred ways to express
concepts.

Also, to explain: I can't accept, as an engineer and even as a citizen, making
things under my responsibility, with my signature etc., that I do not really
understand. This kind of idea is simply the exact opposite of engineering.

~~~
akhilcacharya
> the mean intelligence of newcomers has become so low that we have to water
> down our programs, or only veeeeery few can attend them. And we need more
> students, so more money. Disagree? Say thanks to all the school reforms of
> the past 30-40-50 years.

uh, you know this is about MIT, right? A school that has seen its acceptance
rate dwindle to nearly 7%?

~~~
xte
Yep, and I have also seen a few of their filmed online lectures and formed my
present opinion...

I'm European, so I come from a really different (older/less reformed) school
system, and I'm old enough to have seen how it has changed. I also casually
interact with people in the USA, and my conclusion is simple: having really
poor schools "because there is college after" can't really work. I see in the
EU, reform after reform, how unprepared high-school students have become and
how hard, if not impossible, universities find it to "recover" them by
changing their programs. In the USA this phenomenon is extreme, so even with a
super-low acceptance rate the results have to be extreme too.

You simply can't take smart but ignorant people and in a few years transform
them into highly skilled and acculturated graduates. Being smart helps, of
course, but learning takes time. The better and earlier you start, the better
you'll end up.

------
craigsmansion
The article mentions Prof. Sussman in "the great macro debate".

This reminded me of a phrasing I once read, but never quite understood, in the
"acknowledgement" section of the R4RS Macro appendix:

"Hal Abelson contributed by holding this report hostage to the appendix on
macros".

Can anybody who was/is closer to the subject matter explain what this entails?

------
dang
Lots of previous discussions:

[https://hn.algolia.com/?query=mit%20python%20scheme%20points...](https://hn.algolia.com/?query=mit%20python%20scheme%20points%3E10&sort=byDate&dateRange=all&type=story&storyText=false&prefix=false&page=0)

------
stealthcat
basically, instead of building your own pipelines and valves and pumps and
reservoirs, etc. from scratch, today those things are already available and
proven to work.

so you only need to learn to be a plumber.

------
pbreit
Better question is why they'd use the completely impractical scheme in the
first place.

~~~
hopler
Because it was intro to computer science, not intro to computer
practicalities.

------
brianberns
Python makes sense for engineering, like building robots, where practicality
is the most important goal.

However, Python is a lousy choice for computer science, where clarity and
correctness are much more important. It saddens me that an elite institution
like MIT apparently can't see the difference.

~~~
bb88
To be fair, I think many people expect that a BS degree in either computer
science or engineering would make them employable.

~~~
brianberns
Sure, but the emphasis is different. Someone with a CS degree should
understand both theory and practice. Someone with an engineering degree
doesn't need much theory.

FWIW, I have a CS degree, and my job is mostly engineering, but using a sound
theoretical approach to problems has saved me many times. On the other hand,
I've worked with several good programmers over the years who didn't have a
college degree at all.

~~~
bb88
About 15 years ago companies stopped hiring people that could grow into a
space, and started hiring people that could immediately work in one.

------
zodiakzz
I just wish there was a simple way to serve web pages in Python like PHP,
without having to learn a whole framework/stack/templating language.

~~~
tiglionabbit
Flask is not bad.
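Flask is the usual answer; for the zero-framework route the question asks
about, the standard library alone can serve a page. A minimal sketch (the
handler name and page contents here are my own):

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

PAGE = b"<html><body><h1>Hello from Python</h1></body></html>"

class HelloHandler(BaseHTTPRequestHandler):
    """Answers every GET with the same static page."""

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(PAGE)))
        self.end_headers()
        self.wfile.write(PAGE)

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

def serve(port=8000):
    # Comparable to `php -S localhost:8000`; blocks until interrupted.
    ThreadingHTTPServer(("", port), HelloHandler).serve_forever()
```

Call `serve()` and open http://localhost:8000/. It's no templating engine, but
for a single dynamic page it is about as close to the PHP experience as Python
gets without third-party code.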

------
DonHopkins
For it is intro??!

Edit: Never mind, it's good now. ;)

------
jackalo
Perl or bust.

------
gamma-male
I had to learn scheme in uni; everyone hated it. It scared me away from
functional programming languages for years, until I understood that
parentheses are not necessarily tied to FP and that cool functional languages
exist, like erlang or ocaml.

------
revskill
I think it's because if/then/else in Python is less verbose than in
Lisp-family languages.

