Why MIT uses Python instead of Scheme for its undergraduate CS program (2009) (cemerick.com)
131 points by bb88 on Dec 29, 2018 | 136 comments



> However, he did say that starting off with python makes an undergraduate’s initial experiences maximally productive in the current environment.

I read this as "the parents paying tuition wanted their kids to learn practical skills so they can make money".

I learned Scheme as a first language and I will be forever grateful for it.


I learned Python as my first TRUE language (explained below) and I will be forever grateful for it.

Now I am a polyglot, jumping between multiple languages all the time, be it Python/Java/JavaScript, or the occasional flirtation with languages foreign to my daily routine but necessary to get my job done.

I am not quite on board with your claim that Python is chosen for monetary benefits. To be a qualified programmer, there is so much more that needs to be learned beyond Python. Python, however, is a GREAT choice for overcoming the initial fear of writing code in general, because it is more tolerant.

My actual first language was C, and before college I had never done any programming. I have to say, looking back, learning C as my first language without fully understanding modern computer architecture was a horrible experience: so many questions as to why things are the way they are, so many traps, crazy pointers, cryptic errors, nightmarish segfaults. Programming was just a very frustrating experience for me at the time. Of course, later on, it all made sense; even pointers seem like a great idea (you can't have more freedom than that), it's just that we need to put great effort into regulating their use to prevent erratic behavior.

On the other hand, Python is a tolerant language, yet it provides a pretty comprehensive set of primitives to play and experiment with. Had I started with Python, I might not have wasted that initial year struggling and questioning whether I could program at all.


Compared to C, Python is the better choice. However, it seems like you haven't learned a Lisp dialect. I would love to know if you would still prefer Python once you know Scheme.


I have to agree. Scheme is so much more elegant and so much simpler than Python.

People often talk about how simple Python is, but many of them have either no experience with or no appreciation for Scheme. When they do have exposure to Scheme, it tends to be some ancient, primitive dialect like MIT Scheme rather than a modern, full-featured Scheme like Chicken, Racket, or Guile -- so they look at Scheme as a toy language. I don't know how many times I've heard comments along the lines of Scheme being a fine academic language but impractical for real work.

Python does have a nice little Lispiness to it, but it's deliberately limited by the language designers, the syntax is lacking, and it has many frustrating inconsistencies and gotchas. Programming in Scheme is just so much easier.


> Compared to C, Python is the better choice. However, it seems like you haven't learned a Lisp dialect. I would love to know if you would still prefer Python once you know Scheme.

The school I went to used to teach C, Python, and Scheme in the first year (nowadays it's C, Python, and Racket). I don't remember more than one or two people actually liking the LISP experience; how bad it was compared to the other languages is actually a running joke among alumni.


I also went to a school where Racket was the intro programming language, and during that class, most of the students hated it (myself included). However, over time, I found that those who stuck with CS and had to learn C(++), JS, etc., came to appreciate Racket.

The issue with the line of reasoning that “X language is bad because it was bad during an intro programming class” is that your experiences in that class do not generalize well. Sure, python might be better for writing some basic algorithm that you fully understand before writing a single line of code, and for fairly small codebases. However, try writing a large codebase and you’ll realize that, for example, managing state is really hard.

Sure, Racket and other functional languages might have a steeper learning curve than other languages, and of course they're not the language of choice for performance-critical applications (nor will Matthias Felleisen claim they are). However, as a systems developer who uses almost exclusively C++, I would argue that starting out with a language that forces you to think about your contracts, mutability, scope, etc., invariably creates a better programmer down the line.


The language known as Scheme has done a lot of damage to Lisp's image. When undergrads are exposed to Scheme, they tend to forever carry a negative image of Lisp by association.

A lot of the time when you meet someone who had a bad experience with Lisp, if you interview them a bit, you soon discover it was actually Scheme.

Scheme twenty years ago, under R5RS, was even worse than it is now. It had nothing practical in the spec. No way to write a program consisting of multiple files. No error handling. R5RS talks about situations that trigger an "error", but says nothing about how such a thing can be handled and recovered from, or how an error can be generated on purpose and then caught.

Anyone who studied R5RS in school was learning an utter piece of academic garbage; a serious regression from real Lisp.


I agree that there's a problem of people (even the people who rave about how enlightening it is) mistaking a toy Scheme interpreter for what Lisp is, but I don't think that's an issue with Scheme. Introductory FP classes that use Common Lisp don't leave any better a taste in students' mouths, and real-world Scheme is a lot more capable than a toy implementation designed for pedagogical purposes. I don't think the standard's conservatism is much of a factor.


They tried many LISP dialects over the years: Scheme, Common Lisp, and Racket - and I don't remember student reactions about it ever changing.


I'd be interested in what aspects strike people as bad. (Honest question. If you reply, I promise not to come back with "But, but, but ...")


I will answer your question through its dual: the people who love LISP and functional programming are, in my experience, people who love math - as in algebra, etc. You can easily recognize them, because they say weird things such as "this proof is so elegant!"

Functional programming of course maps (heh) very cleanly to this line of thought.

But most people hate math and this way of thinking. In contrast, you get first-year students "re-discovering" OOP every year - for instance, a common trick to get them to learn design patterns is just to put a problem that calls for one in front of them, and three times out of four in my experience they will even come up with a pattern name close to the original one.


Thank you.


One thing that comes to mind is that looping is limited to recursion in Scheme. This becomes quite cumbersome IMO, since looping is a pretty important thing. Writing named lets for simple things is not anyone's preferred way.

It is however pretty easy to implement something like Racket's for loops in Scheme using lower-level macros, and there are some available.

Implementing something like Common Lisp's iter is actually not very hard, and even improving on Racket's for loops is far from complex.


> looping is limited to recursion in Scheme

I believe that Scheme's DO is looping.


Which is still painful to use compared to Python's for loops or comprehensions.
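For contrast, a minimal Python sketch of the kind of loop being compared against (the summing example is made up for illustration):

    # Sum of squares of 1..10 as a comprehension...
    total = sum(x * x for x in range(1, 11))

    # ...or as a plain for loop. Scheme's (do ...) needs explicit
    # binding, step, and termination clauses to say the same thing.
    total = 0
    for x in range(1, 11):
        total += x * x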


Regarding C, my first language was C and I've always been thankful for that. It helped me understand low-level programming, memory management, pointers, registers, etc. It also helped me better understand how my computer's hardware works. Having C as my first language also gave me a greater appreciation for languages that handle all that complexity for me.

Regarding Python vs. Scheme, I like both languages. I tend to prefer Python for certain types of programming, like NLP, machine learning, and math, just because the available libraries are so good. Having said that, I feel that the functional paradigm is superior to procedural programming and definitely (for me anyway) better than OOP, so for everything else I'd choose Scheme or some other functional language, like Elixir, Elm, Haskell, or Erlang, depending on my mood and what I'm doing.


I think C, Python, and Scheme are all perfectly fine choices for a first language. It really depends on what you want to do.

Want to learn how a computer works? Learn C.

Want to study abstract comp-sci concepts? Learn Scheme.

Want to wire together libraries to get something running in the messy real world? Learn Python (or something similar).

A well-rounded CS education really ought to include all of these to at least some degree.


I have working knowledge of C, C++, Scheme, and a little bit of Python, but I really don't think Scheme lives up to all its hype. I'm not even sure whether everyone praising it here uses it in real-life projects or only academically. We do, and man, it's a tough life. The only debugging we have is numerous "format (print)" statements, it is terribly slow, and maybe not everyone here is using it as intended, but the way it promotes "lists" for everything is just plain wrong IMHO. The end result is a lot of wrongheaded searching through lists where a proper C/C++ data structure would prove much faster.

Oh, and the deluge of useless parentheses that it is, it hurts my eyes to read that code.


> deluge of useless parentheses that it is, it hurts my eyes to read that code.

Oh, how I wish I could share the joy I experience using Paredit with those parentheses. Paredit turns parentheses from an inconvenience into a turbo button for modifying code at an entirely new level of abstraction.

When you started coding, you probably used an editor that worked on units that were single characters. Think Notepad. Then maybe you learned a programmers editor like Emacs, Vim, or Atom or Sublime. There you learned to cut, copy, fold, spindle, and mutilate code in lines or paragraphs.

But Paredit... oh Paredit... now you have the tools to work with your code not in characters, lines, or paragraphs, but in code forms—the same stuff your code is made from.

And the best part is that Paredit doesn’t have to have a deep knowledge of your code, groping imperfectly like Intellisense. Paredit is deterministic and easy to implement.

If I could have an S-expression based version of every other language simply for paredit, I'd take it.

Now a story for a different day is how much I hate typing commas in non-lisps.


Which Scheme are you using? Chez, Chicken, Guile, and Gambit all have pretty good debugging facilities, and a macro like Chez's trace-lambda is a lot better than misusing print-debugging in most cases.

Nobody ever said most list operations are anything other than O(n), and if you are searching through lists you are probably using the wrong data structure. Vectors or hash tables would probably be a better choice (both are provided by all serious Schemes).


Ours is probably some 20-year-old early MIT Scheme implementation, extended with homegrown features like "Object Oriented Scheme" etc., but a debugger was never implemented for it. Talks have been going on about moving to an implementation that has a debugger, but at such a scale, changing that would be difficult.


By MIT Scheme, do you mean MIT/GNU Scheme? A few Schemes implement their debuggers in terms of call/cc (or delimited counterparts), and exceptional conditions like errors or breaks hand control to a new 'depth' of REPL where anything in the environment can be inspected (or redefined) in context before possibly trying to continue or doing any unwinding. If your Scheme has no debugging apparatus, it shouldn't be too hard to implement something a little more useful than sprinkling in print statements. Even without wading into continuations, implementing a macro that does tracing should be possible.
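For anyone who hasn't seen tracing: here's a rough Python sketch of the behavior such a facility provides (the decorator is made up for illustration; in Scheme it would be a macro wrapping lambda, like Chez's trace-lambda):

    import functools

    def trace(fn):
        # Log every call and its result -- a crude stand-in for trace-lambda.
        @functools.wraps(fn)
        def wrapper(*args):
            print(f"-> {fn.__name__}{args}")
            result = fn(*args)
            print(f"<- {fn.__name__} returned {result!r}")
            return result
        return wrapper

    @trace
    def fib(n):
        return n if n < 2 else fib(n - 1) + fib(n - 2)

    fib(3)  # prints the whole call tree, arguments and results included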


Ouch... That seems painful, and really like someone made a wrong decision 20 years ago. Back then Scheme was a pretty inane choice compared to CL. That is not as much the case today, regardless of what the CLers say :)


> I would love to know if you would still prefer Python once you know Scheme.

I did. Actually, when I found Python (some 20 years ago), I thought - "Here it is - Lisp for the real world".


> I am not quite on board with your claim that Python is chosen for monetary benefits.

I can see it. I used to make a lot of money doing Python programming. About $40k a year more than I ever made doing C, Java, C# or C++. Of course, I did Python more recently, so there is the wage inflation between 1992 and today to consider. Still, you can make a lot writing Python code.


I find this statement, while perfectly legit and natural, a really sad indicator of today's state of IT: normally I expect a programmer to be paid for their knowledge and time, used to solve problems. In that sense languages, libraries, etc. are simply tools, so a metric like "$tool makes me earn $amount of money" is nonsense - something like a carpenter saying "a drill makes me more money than a saw"...

Of course, some tools may be more effective than others, but I still find it sad to measure any of them in money terms...


~70k today. Programming Python in '92 - nice :)


@xte - I agree with you for the most part; however, I think there are some supply-and-demand considerations to be made. For example, companies who use esoteric or obsolete languages, where the workforce who knows that language inside and out is particularly small (a la Cobol programmers circa late 1990s), should expect to pay more. Where I live, because of the area being inundated with technology boot camps, there is an exorbitant number of JavaScript programmers. Since you can't breathe without huffing a JavaScript programmer, they tend to be paid less.

Also, some languages and applications require a higher degree of knowledge, training, and expertise than others. For example, I would expect to pay a programmer working in robotics more than I'd expect to pay a React or Angular developer.

Having grown up working in construction, I completely agree with your tool analogy though (and used the same analogy in another comment).

@gronne - About three times that, but close. ;)


If you adjust only for inflation then it's about 82% cumulative, 1992-2018.
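(Worked out: the $40k premium in 1992 dollars × 1.82 ≈ $73k today, which squares with the ~$70k estimate above.)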


Congratulations on being a polyglot! In my opinion, all programmers should be polyglots, since some languages are better than others at various tasks. A programmer who only knows one language seems a lot like a carpenter who is only skilled with measuring tapes to me.


I would go further than that and say that it's also important for programmers to know different paradigms. Knowing C, Java, and Python is surely better than just knowing Java, but they're still quite similar. Ideally people would also learn some functional programming, even if they don't use it, and a Lisp dialect (to appreciate the value of good metaprogramming facilities and get over parenthesis-phobia).


Counter-point: Everyone is grateful for our formative experiences because those experiences shape our identity, and we're obviously heavily emotionally invested in who we are.

But we highly overvalue which formative experiences matter. If I were to get into a time machine, go into your past, remove Scheme and swap in some other language, you'd probably be grateful today for whatever that other language was. It just hit you at the right time, like whatever song was on the radio when you first fell in love.

Every step on our path is dear to us because it got us where we are today, but there are many many paths and little reason to want others to follow in our footsteps. If anything, we should discourage that because the world doesn't need another me or you. It's already got those.


This is exactly it. Nostalgia is overpowering.


Same here. When I learned programming at UC Berkeley in the 90s, Scheme was the intro language. It was perfect precisely because it was so far from being the kind of language you'd use to engineer large projects. The first day, the professor spent less than half of the period introducing the entire syntax of the language...it's so damn simple. Beyond that, all learning was abstraction, problem solving and CS concepts. It was learning how to take a very limited set of primitives and construct a larger whole.

I pity the MIT students that will waste so much of their intro course learning what is a vastly more complex language. There will be so many students that will struggle to learn python-specific topics that are irrelevant to learning the act of coding and may decide that CS and programming just aren't for them. Scheme is such a perfect language for this task because it puts so little between the student and the crucial experimentation phase that can instill a fascination/curiosity with programming. And now MIT students will miss out on that.


We also live in an era where it's basically impossible that kids getting into MIT's CS program haven't been exposed to programming before. I think it's a romantic idea, but the fact is college isn't where you learn to program; it's where you learn the time complexity of B-trees.


This is a really significant point. I mean, I love SICP. I think Scheme is elegant as hell. But I learned to program with Basic in high school, and SICP in Scheme would have seemed like a huge step backward. I know Fortran did, which was my first college-level experience. I can appreciate Scheme now, but I don't think I would have then. And I can only imagine how "kids today," who have grown up on real languages would feel ("real" in contrast to Basic, not Scheme).


> it's where you learn the time complexity of B-trees.

I’m having flashbacks... Thanks for that ;)

Time complexity was my worst CS topic. At least, for implementing the most efficient system. I’m not an algorithms guy.


> There will be so many students that will struggle to learn python-specific topics that are irrelevant to learning the act of coding

Correct me if I'm wrong - I just went to an unrespected state school - but I would think that the MIT admissions filter selects for students who wouldn't particularly struggle with something as small as language idiosyncrasies, especially in an introductory course.


No, you're right. I'd go so far as to say even people at my unrespected state school wouldn't struggle with things like that.


Hm, perhaps your state school is "less unrespected" than mine, but my CS program had a very real funnel from "couldn't hack it in Programming 1" to "couldn't hack it in Calculus 2" to "couldn't hack it in Data Structures and Algorithms" to "couldn't hack it in one particular professor's mandatory upper-level courses" to "tripped on their gown."

I digressed a little. Point being: while I'm unsurprised that some % of people from a broader range of abilities couldn't get functions or iteration or inheritance or whathaveyou to stick, I'd be very surprised if ANY % of the culled-by-18 right end of the Bell curve couldn't grok both fundamental concepts and any relevant part of the standard documentation (on which we appear to be in agreement.)


If you have never been exposed to programming before, chances are you will struggle no matter the name of the school. The name will most likely help you get a better job, but don't worry about the skill level attained (unless you went to something like Trump University).


> I pity the MIT students that will waste so much of their intro course learning what is a vastly more complex language. There will be so many students that will struggle to learn python-specific topics that are irrelevant to learning the act of coding and may decide that CS and programming just aren't for them.

As someone who has been part of running intro courses using a variety of languages: while I like the Scheme path for various reasons, I do not recognize the effect you imagine at all in the students I've seen. That there are complex areas of Python is basically irrelevant for intro courses, the few pain points are really easy to explain and remember, and Python is great for the experimentation phase (the core is easy enough, the (standard) library ecosystem helps satisfy random interests and can generally be used without exposure to the problematic parts of the language, and there's the REPL).


> It was learning how to take a very limited set of primitives and construct a larger whole.

By this measure C should be a popular university intro language too, but it is famously unsuitable as a first language, at least according to academics.


On the one hand, Scheme has an even smaller set of primitives than C. On the other hand, in C it's far more difficult to construct the larger whole from those primitives. Additionally, you have all the problems from the compiler when you have to manage libraries, include files, and object files. With some guidance when students are stuck with the compiler, C is not such a bad language to learn programming in, but it doesn't hit the same sweet spot as Scheme.


At Berkeley we've switched out the SICP-based 61A for a Pythonic 61A, similar to MIT. It certainly hasn't driven away any CS students; on the contrary, the number of CS majors has exploded.


The article is 10 years old. Has MIT graduated a generation of failures?


Python was my first, but the computer science course at my university uses Scheme for one of its primary required courses. Really kicked my ass with man-handling recursion and lambdas to no end. Implementing trees (usually binary) gave me a hell of a time. I got so sick of opening DrRacket by the end of the semester.

That being said, I did respect the course and the language. I didn't understand the science part of computer science before starting the minor, and that course opened my eyes pretty wide (as did the Data Structures course). It really forces you to think and understand what you were doing, much less poking and prodding (different from trial and error) than what you could do with Python and its libraries. Plus there's significantly less information out there on it in comparison. I wouldn't bring parents into it; they have almost no influence on universities unless they are privately funding/donating.

But my god those parentheses...


> It really forces you to think and understand what you were doing, much less poking and prodding

This is exactly what I look back on so fondly. I already had a lifetime of fiddling with libraries and stack overflow ahead of me, exploring pure computation was a completely eye-opening experience.


I learned Python long before touching anything like a lisp, and the closest thing to functional programming that I’d done before touching a lisp was R. All self-taught, usually in the service of something else I wanted to do. I have no formal training in CS.

And so my use of Racket is simply as a dilettante. And I LOVE it. I have a lot of imposter syndrome about not really grokking data structures the way CS people do - the way they can tell you the tradeoffs and implementation details of a B-tree. But I LOVE coding in Racket when I get the chance.

For me, it’s a language where I can jank-hack things together like Python, but have more fun. I wonder if that’s because I never had the fun of Racket sullied by also having to learn data structures and CS fundamentals at the same time?


Do you think that "How to Design Programs"[1] is the best intro book? Should it be followed by SICP[2]?

[1] https://htdp.org/ [2] https://mitpress.mit.edu/sites/default/files/sicp/full-text/...


I really like htdp, it was used for a bit in my intro class as well, but that curriculum also covered concepts that weren’t from the book. I also have been forever grateful for being taught scheme as my first language.

Previous discussion on HN about the second edition:

https://news.ycombinator.com/item?id=14932552

EDIT: I feel like it is not all-encompassing but it is a good start. It is definitely unique in the concepts that it covers.


I think both are valuable: HTDP is the better intro but SICP scales further. For me an ideal CS101 course would be 3/4ths HTDP with a "best of" SICP tacked on at the end.


Scheme is not my first language and I am still grateful for it. I have been self-learning SICP for the past 3 years (after working 15+ years in non-Lisp languages). Though I'm still in chapter 2, it has cleared up and simplified my thinking about programming. I no longer consider the bottom-up approach to be a sin. Understanding new functional language specs becomes a breeze and it 'just clicks'. Though I have yet to understand why Clojure could not implement tail recursion without the 'recur' keyword.


They could, for all cases where recur is presently used. Recur serves to indicate and document that the function is supposed to be tail recursive, and it causes an error if the call is not actually in tail position, rather than silently going unoptimized and blowing the stack when fed too large an input.


This was a limitation of the JVM.

I vaguely recall one of the recent updates removed that limitation, but could be wrong about that.


If it was any other school than MIT I might agree, but since it’s MIT I doubt that is what the parents are thinking.


If the parents are really that concerned about their kids learning the wrong first programming language in college, then they're not helicoptering enough! Why didn't they teach their kids to program JavaScript or Logo when they were in middle or elementary school?

https://snap.berkeley.edu

https://bjc.berkeley.edu


I couldn't be more thankful that python was one of my first languages, and this is something that I've passed on to everybody I've ever mentored.

Python is a real language that does real things, but it's also completely approachable to a newbie.

If programming seems hard, then you have a bad teacher that is probably just trying to evangelize their hobby to you.

And god help anybody that tries to start with JavaScript.


I don't get why people hate on starting with JavaScript. Since ES6, JS strikes a really nice balance between practical usage and theoretical value. Like Scheme it's a dynamically typed language focused around a single data structure (list for Scheme, object for JS), with first class functions. Sure, it has weak typing and there's some scoping complexity. But, that's a completely reasonable tradeoff for being one of the most useful languages in the entire world. Not to mention, JS's weird casting issues are not as common as people make them out to be and const/let solves a lot of the scoping problems.


I think JS is a great first language. In my experience as a teacher of programming, most students are not like me. Unlike me, nearly every student of my acquaintance is motivated by making things. It's easier in JS to make things that are visually interesting, that are demoable to your peers and family, that are easy to deploy to a variety of useful environments.

JS is a good starting language, for the typical starting student, because it helps them cultivate their intrinsic motivation.

Less importantly, but still important to me: JS is also a more beautiful language than most mainstream languages, except for the ten or fifteen god-awful warts that it admittedly has, that everyone is disproportionately obsessed with.

(I, unlike most of my students, was interested in computational thinking from the very beginning: iteration, recursion, data types, algorithms, etc, and I was uninterested in actually making things. My first languages were BASIC and Pascal and C, and as far as I can tell, that worked out okay.)


I disagree completely - JavaScript encourages shitty programming by design and by culture. If you don’t expose the young, chances are they won’t get infected.


Because all of those who started with something like C, C++, or Java have way better programming practices? My experience tells me that the starting language is simply not a factor in determining how good a programmer someone is.


I agree. Maybe it's the barrier to entry for publishing that has gotten a lot lower, which naturally pushes down the quality of public code. Or maybe I'm just worried about everything being pushed to the browser.


This is absolute nonsense. Modern JavaScript is powerful and actually quite elegant. Perhaps stop thinking about the crappy JavaScript people wrote >10 years ago.


Maybe some are. There is however nothing elegant about running 5 massive Electron, Node, etc. apps, each effectively its own browser, with 60% code overlap grinding everything to a halt. It's not so much the language but how it's being used.


Because programming should be descriptive of what you want the computer to do, and the way that humans explain things to each other is usually linear.

For instance: brush your teeth, then put on your clothes, then get in the car, then start it.

The "JavaScript" way to do this is that starting your car is somehow nested inside of the brush your teeth event. Everything is a callback of everything else, so trying to explain to your computer what you want it to do ends up as a giant spaghetti mess.

Obviously there is a JavaScript way of thinking that allows you to think of brushing your teeth as a dependency of starting your car, but I don't think that this is how most humans think by default, so it's a terrible way to teach students.

It makes them think that computers are these complicated things that take some immense skill to operate, but they're not anymore. You just need to tell the machine what to do, and good languages like python make that really easy.

At this point in my life, I'm mostly writing C++, JavaScript, and golang, but I'm extremely thankful that I started with python. When I need to bang out a prototype, or test something in code, it is always my go to.


Modern JS fixes this (mostly). Promises and async/await (ES2015 and ES2017) address those issues. What was previously

    function first(cb) {
        console.log('first');
        cb();
    }

    function second() {
        console.log('second');
    }

    first(second);
can now be expressed as

    Promise.resolve().then(() => console.log('first'))
                     .then(() => console.log('second'));
or

    async function first() {
        console.log('first');
    }
    async function second() {
        console.log('second');
    }
    (async () => {
        await first();
        await second();
    })();
depending on exactly what your goals are. The inversion of order ("callback hell") can be avoided if you stick to modern concepts.


Okay, in python this is:

     print "first"
     print "second"
I mean...imagine explaining your example to somebody. That's difficult for some programmers to fully understand.


No. In Python (3.7+) this is

    import asyncio

    async def first():
        print("first")

    async def second():
       print("second")

    async def _():
        await first()
        await second()

    asyncio.run(_())
Prior to Python 3.7, Python didn't have similarly clean asynchronous programming tools. With Python 3.5+, you could use the old event loop syntax [1] to do it, and prior to that, you needed to use `yield` and `yield from` for similar semantics.
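For reference, the pre-3.7 event-loop incantation looked roughly like this (reusing the `_` coroutine from the block above):

    # Python 3.5/3.6 style: manage the event loop yourself
    loop = asyncio.get_event_loop()
    loop.run_until_complete(_())
    loop.close()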

If you want synchronous stuff in JS, this suffices:

    console.log('first');
    console.log('second');

EDIT:

The problem that JS had (prior to promises in ES2015) was that the only way to do deferred/asynchronous things (like "run this after I get data from a network call") was to provide a callback, something like

    function fetchResource(resource_url, callback) {
        const data = get(resource_url);  // get() stands in for some request helper
        callback(data);
    }
This gets very tricky very fast if you want to have chained calls (imagine that `callback` also conditionally requests a resource, and you want to do something with that resource, and based on that you may want to redraw the DOM and then...).

There wasn't a good pattern for describing that in JS. The "common" pattern was to just have callbacks within callbacks, and unlike in Python you'd often use anonymous functions, so you end up with nested anonymous functions, which inverts the way you think; it's really hard to grok.

Promises and async/await linearize that, but it's a problem that Python never really had to address because, up until very recently, Python didn't get used for async stuff.

[1]: https://docs.python.org/3.6/library/asyncio-task.html#exampl...


Agreed. Building a web app is more "real" to many who are just starting out.


I hadn't coded in a long time - hadn't written any kind of helloworld.c in years (more than a decade). I've recently updated and wedged a programming env onto the machine I spend the most time on these days.

I say wedged because IMHO programming envs are the easiest to set up and maintain on POSIX OSes. The closed-source OS I'm in most (because I stream simulations & games on this DAW) is not really suited for it.

One of the first languages I installed was Python. For me the reason was a nice Raspberry Pi I got from a friend of mine, but most of all the accessibility of the language on literally every OS you can throw a stick at.

Another reason is

  python -v -m SimpleHTTPServer 8088
amazing, when you run it in verbose mode
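(On Python 3 the module was renamed; the equivalent is

    python3 -m http.server 8088

serving the current directory the same way.)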


php -S 127.0.0.1:8088 is a good one as well; it helps with debugging PHP quickly (I'm still in the boat of PHP being the better language for handling web stuff).


10 years ago; I'd really like to hear a post-mortem of this decision.

For context, MIT's previous EECS curriculum began with 4 required courses for all EE and CS majors. This meant that EEs had to suffer through Scheme, and CS'ers had to suffer through circuits. I'm sure everyone has their own opinion on whether this is a good thing or a bad thing.

Regardless, the EECS department switched to a new 7(?) "core" class format in which students choose 4(?), giving students the freedom to take courses more relevant to their interests/major. In this switch, 6.001 (Scheme) was out, replaced with a more hands-on, robotics-based Python lab class.

I can't really say which approach is better, but I do think there's an important place for the more pure "thinking" computer science classes too, even if it's not the first class taught to freshmen.


I hire people coming from this program. I can no longer rely on students understanding control systems, having read Leveson's Therac-25 paper, or having worked with a notebook-style REPL (i.e., something stressful to model exactly in your head, promoting confusion, leading to notes and comments and structure).

The near-total demise of Athena matters here too. I can and do hire people with all those skills! It’s still very possible to get them at MIT. It’s just not nearly the default, and since it’s way up in electives I’m at least as likely to find the best people carrying a degree that says 21W as VI-3.

I don’t envy the VI admins. They have a problem akin to that of a suddenly virally successful startup. 400 people a year come in; about that many better succeed.


As someone who read SICP when first starting out and then re-read it and re-watched the lectures much, much later: I can say that 95% of the value of SICP flew completely over my head when I was first learning.

I remembered catchphrases like 'assignment is bad', only to be led down the 'functional programming' cult path years later because it reminded me of SICP.

Building up entire systems from first principles is quite beautiful and illuminating - once you've used existing systems and began to wonder how they came to be.

For most people - I don't think they're really that curious. University is not structured to give you the time to go deep or be curious - you're busy doing assignments, readings and studying for tests. SICP is a gem for those who are lucky to have a few months to really go down the rabbit hole. For everyone else, it'll likely feel like needless torture or 'wow, that's cool and I don't have time to explore it any further' at best.


> University is not structured to give you the time to go deep or be curious - you're busy doing assignments, readings and studying for tests.

That statement, while true, should anguish the heart of anybody in any way related to or concerned for academia.


Undergraduate coursework has always been that way. Especially in any truly competitive environment.

You do, however, get that time when you write a master's thesis.


I think Scheme was better educationally but harder pedagogy. It requires a french metric tonne of time, contact time, and patience to get some ideas through. I speak as one who did FORTRAN 74 first and tried to teach myself LISP. Talking to others who did LISP first: what they did was recapitulate the instantiation of fundamental data structures and algorithm concepts, while we just made 10x over-engineered battering rams out of rusty iron bars.

Python 3 (which I live in, btw) doesn't exactly exude its functional elements beyond the simple lambda (and the reduce you have to import from functools), so recursive solutions don't lend themselves to it syntactically, inherently in the language.

OTOH those battering rams come in handy for beating simple problems into submission.


Yes, exactly.

I think there are basically only three ways to teach EECS: stack top-down, bottom-up, or as-needed/random.

If students learned bottom-up, starting at silicon, they'd have a firm grasp, and be more mindful, of practical system capabilities. Python and such would either be a starting crutch or learned last.

Just like Scheme, a learning language, we had a learning OS: MINIX 2. Such constrained environments are easier to teach with but less relevant in the real world; the valuable part is that the mastery of concepts is portable across technological fashions... it instills a confidence that playing with the Linux or FreeBSD kernel doesn't, because those are so overwhelming to novices, rather unlike a minimal working example.

Furthermore, because of the bifurcations in tech, fewer people have even seen a server, much less racked up a datacenter floor, or understand assembly or C. This is concerning if academia is mostly producing CS students narrowly focused on web FE/BE.


I feel like Python is too half-baked to be a really good language to teach fundamentals with. By "half-baked" I mean: sort of supports functional programming, but not really. Sort of supports object-oriented design, but not really. Sort of allows type system theory to be taught, but not really. So many sort-ofs. It's a great language if you just want to get someone cranking out code. But if I wanted a language to be a base from which to explore all the corners of computing, I'd think something else might make more sense (and having written that, I can't really think of a good one - Scala in theory, but in practice it's awfully complex to get started in).


Wait, don't all these "sort ofs" make it actually an ideal tool for teaching the basics? I mean, we're talking about basics, so what you're saying is that you can teach the basics of a number of different paradigms with one language.


> what you're saying is that you can teach the basics of a number of different paradigms with one language.

I don't know about Python, but that is one of the delights of SICP and scheme in general.

It's not that there isn't an OO implementation in Scheme; it's that there are too many. Implementing an OO system is a medium-advanced exercise left to the reader. As a result there are many available, to the point one wonders whether "OO" can rightfully be called "a paradigm".

Constraint-based/logic programming? I think it's a chapter in SICP, and covered by one of the SICP video lectures showing how to build a simple Prolog-like language to illustrate some other concept.

Type systems? This one is a little funny. There's a SICP lecture on it (the recordings are from the 80s), and I thought it was an unfortunate choice of name, since "type systems" are a real and important concept in modern computer languages and the naming would clash - but no, it actually was an implementation of a type system as we now use the term.

I think one of the reasons Scheme doesn't have "a lot of stuff" is that most stuff, if well understood, is trivial. If it doesn't seem trivial, one simply doesn't have a deep enough understanding of the concepts yet.

One of the things about using scheme effectively as a teaching language is that it puts the wizardry in the actual wizard, not in the magic box with the blinkenlights.


It's very hard to teach the value of some approach if you can only see it implemented half-way.


The first course doesn't need to touch on every programming style. Python is fine for learning imperative basics (preconditions, postconditions of your loops and methods, etc)


F# would be a good candidate. It covers all dominant paradigms without exploding complexity.


Can someone from MIT confirm that only Python is taught in the MIT intro to CS class? We also teach SICP in the intro to CS class at Berkeley (cs61a.org), and we start with Python, but then change to Scheme in the latter half of the semester.


AFAIK the switch to Scheme is not complete, no? You're still writing Python for the most part.


I like to imagine Lazy University, where the committee meets and immediately standardizes on somebody's offhand remark that the intro course should use Javascript because it's probably running on every device the students are using.

Or, replace Javascript with any other high level language likely to be running on every smartphone. If there are competing offhanded suggestions then bonus lazy points for using a random number generator to decide the winner.

Only when the committee meeting is over do Lazy U. profs begin planning how to teach their classes. And there's more time for that since the Lazy U. committee meeting was an order of magnitude shorter than most. (Although the laziness would probably be recursive and result in the elimination of the lecture, lab, geographic proximity to students, and assignments that can be distinguished from publicly accessible repositories full of code and research.)


I love Python for a lot of things but I’ve come to realize how complex it can really be.

I regularly see others write Python code that I have to fix to add robustness. I also run "pylint" or equivalent tools before committing major changes to scripts because it is quite easy to make mistakes that won’t otherwise be found right away.

Some of my least favorite pitfalls:

- If you call a function that happens to refer to a variable that only exists in the calling code, it will work (and yes, since the same function can be called from more than one place, “caller” is not always the same so the effect is not always the same). Bonus points if it only matches the caller because of a typo. Any convenience afforded by this behavior is not worth the potential for head-scratching and painful debugging.

- Python has things that “seem” like the obvious/right thing to do when they are not. One great example is a bare "except:", which to this day I have to keep correcting in various scripts to prevent important errors from being completely lost. Another is expecting to write the entire script at column 0, when people are supposed to know the 'if __name__ == "__main__"' idiom. Also, a type’s apparent shared/unique status is not always clear, so novices can get a long way with sort-of-working code until the broken cases completely confuse them: you can simply list and “initialize” class fields with trivial string and number values, until one day you add something like a "dict()" and everything breaks for that one field, because all your class instances are mysteriously sharing it; suddenly you need to define a whole new "__init__()" to fix things, when it apparently wasn’t needed before (see the sketch after this list).

- It is not always obvious to maintainers of functions that keyword arguments are extremely fragile and can overlap with other arguments. If an argument is renamed or relocated, for example, stuff elsewhere can break in stupid ways. Tiny code changes can create big headaches.
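A minimal sketch of that shared-field pitfall (the class and field names are made up for illustration):

    class Config:
        name = "default"   # rebinding this per instance works as expected...
        options = {}       # ...but this dict is one object shared by ALL instances

    a, b = Config(), Config()
    a.name = "alpha"            # creates an instance attribute; b.name unaffected
    a.options["debug"] = True   # mutates the single shared dict

    print(b.name)     # "default" -- fine
    print(b.options)  # {'debug': True} -- surprise: b sees a's change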


>If you call a function that happens to refer to a variable that only exists in the calling code, it will work

This is not true. This code gives a NameError instead of printing "1":

  def f1():
      print(var)
  
  def f2():
      var = 1
      f1()
  
  f2()
If it can't find a name in the local scope, it will look in enclosing functions, then the module scope, then the builtins. But it will never look inside the caller's scope unless the function is defined in the caller's scope.

Python's implicit scoping can be confusing, but it's not quite that bad.


> If you call a function that happens to refer to a variable that only exists in the calling code, it will work

Could you give an example of what you mean by this?


Not the OP, but I think what they are referring to is the following:

  def foo():
    return bar
  
  if ...:
    bar = 42
    print(foo())
  else:
    print(foo())
If ... evaluates to a truthy value, bar will be defined and foo() will return 42, if it is falsy, bar will not be defined and the code will throw a NameError.

I would agree that this is extremely confusing behaviour. Of course people wouldn't write code like this (hopefully), but something similar might occur in a much more obscure setting.


One of the best electives I took in my CSE program was Art in Code. We covered Processing and Arduino and the only required textbook was Making Things Talk. https://www.amazon.com/Making-Things-Talk-Practical-Connecti...

These tools were designed to be accessible to non-programmers and had our entire cross-disciplinary class making games, interacting with hardware, and getting inspired to find new ways to interact with computers. I wish this class had been required by every first-year CSE student.


Where I went to school, they didn't teach programming at all. First day, first class, if I remember correctly, the teacher walked up in front of the class, said something like, "Computer science is the design and implementation of algorithms...", and finished by saying they wouldn't teach us any programming. The assignments had to be implemented in C and run on campus computers. There were optional, zero-credit labs, held by the TAs, to get help learning to program or get help with homework.

I went to one lab, the lab computers and all campus computers were running Slackware with Afterstep. I was a Window Maker and Red Hat user at the time, I was impressed and thought it was neat.

Of course this is from memory, and 18 years ago almost to the day, but that's the gist of it. Anyway, I felt like I learned a ton and loved every bit of it. I went to a few of the CS club meetings and got the impression basically everyone had already had internships with Intel, IBM, or Microsoft. This was my freshman year, and I was basically in awe.


> Then, what generally happened was a programmer would think for a really long time, and then write just a little bit of code, and in practical terms, programming involved assembling many very small pieces

Oh, how I bloody want to do just this today. I want my program to be split into parts as small as possible, every part (i.e. every function) being defined in a separate file. But Python encourages the direct opposite - writing long files of code to avoid introducing too many "modules" and this is a debilitating headache.

> At some point along the way (he may have referred to the 1990’s specifically), the systems that were being built and the libraries and components that one had available to build systems were so large, that it was impossible for any one programmer to be aware of all of the individual pieces, never mind understand them.

Thanks to my ADHD it is very hard for me to hold anything longer than half a screen in my mind. The longer a module grows, the lower my productivity falls.


> But Python encourages the direct opposite - writing long files of code to avoid introducing too many "modules" and this is a debilitating headache.

How does Python encourage mega-files? AFAIK there's nothing that prevents nor discourages proper organization of code into files.

I frequently use multiple files to organize my library implementations.


Just some clues: a class definition cannot be split across many files (in C# you can do this easily with a "partial class" declaration, and that's amazing; needless to say, you can and almost always do split one module (a namespace) into many files there). Whatever you put in a separate file, you will have to import manually everywhere you use it, and you have to worry about circular imports. You have to invent names all the time; a package, a module, and a function having the same name smells like a headache. The C# namespace definition system could be made even better than it already is (e.g. by allowing class-free functions directly inside namespaces), but it already feels so much better than Python's "module = file" model.


>A class definition cannot be split across many files (in C# you can do this easily with a "partial class" declaration, and that's amazing; needless to say, you can and almost always do split one module (a namespace) into many files there).

I think most would consider this a good thing. WYSIWYG. If a class is so gargantuan as to require multiple files, it should probably be multiple classes that communicate via a well-defined public API, not implicitly.

>You have to invent names all the time; a package, a module, and a function having the same name smells like a headache.

This is only an issue if you are importing *. If you import in a namespace (`from foo import module`, `from bar import other_module`), you can have `module.module`, `other_module.module` and anything else all living in harmony. If you work in python as python intends and not C/C++ (namespaced imports, not text-prepended ones), it's much, much cleaner.

It's also possible to, generally speaking, have private submodules within one larger module. You can do this with `__init__.py` by importing the various submodules, explicitly re-exporting the things you wish to be top-level public, and setting `__all__` to include them. This is why I can do something like `import numpy as np; np.ndarray` even though ndarray is defined in an extension module referenced by `numpy.core.multiarray`.
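A minimal sketch of that re-export pattern (the package and module names are hypothetical):

    # mypkg/_impl.py -- a "private" submodule
    def useful_function():
        return 42

    # mypkg/__init__.py -- the package's public face
    from mypkg._impl import useful_function

    __all__ = ["useful_function"]

    # callers never see the internal layout:
    #     import mypkg
    #     mypkg.useful_function()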

See multiarray (https://github.com/numpy/numpy/blob/master/numpy/core/multia...), which pulls things from `numpy.core._multiarray_umath`, then exported by core via `numeric` (https://github.com/numpy/numpy/blob/master/numpy/core/__init...), then again by the top level __init__ (https://github.com/numpy/numpy/blob/master/numpy/__init__.py...).


> If a class is so gargantuan as to require multiple files

IMHO anything that doesn't fit in one screen is "gargantuan"

> If you import in a namespace (`from foo import module`, `from bar import other_module`), you can have `module.module`, `other_module.module`

Which looks ugly and feels like a nasty headache if you want to maintain an intuitive vision of your code and dependency structure.

> It's also possible to, generally speaking, have private submodules within one larger module. You can do this with `__init__.py` by importing the various submodules, explicitly re-exporting

I know but even reading this paragraph hurts. Too much mess to manage manually.


>IMHO anything that doesn't fit in one screen is "gargantuan"

This is a bit of an odd definition, but sure (I've seen C++ files whose imports alone were gargantuan under your definition). I just looked through a production Python application, and there were 2 non-test classes that were more than 100 lines long. Both were relatively verbose (well commented, using type hints so argument lists take significant vertical space, etc.)

In general, it seems like you're limiting yourself to, with the necessary boilerplate, one or two functions in any file that isn't "gargantuan". This makes it needlessly difficult to understand the structure of your applications, since you are forced to split related logic across multiple files. This, more than screen length or the import syntax, makes it much, much harder to "maintain an intuitive vision of code and dependency structure", as you say.

>Which looks ugly and feels like a nasty headache if you want to maintain an intuitive vision of your code and dependency structure.

Not at all. It's more explicit about the dependency structure (`module.method(args)` at a call site gives you a much better idea of the structure than just `method(args)`), and so makes it significantly easier to maintain an intuitive vision of the code and dependencies.

It's incalculably easier to understand dependency structure when using the `import module` syntax than `from module import Class` or `from module import *`. So much so that the former allows reliable, large scale refactorings without any runtime information. The others, as you correctly recognize, do not.

>I know but even reading this paragraph hurts. Too much mess to manage manually.

Then don't! You don't need to. It's really only necessary for truly public/widely used APIs (like numpy) where understanding the internal structure of the module is not worth it for the average user.


> Then don't! You don't need to. It's really only necessary for truly public/widely used APIs (like numpy) where understanding the internal structure of the module is not worth it for the average user.

I mean for the library developer, not for the user.


I contend that, within reason, it really doesn't matter exactly which language happens to be one's very first programming language. Programming is ideally motivated by passion, and in that case, a person is certainly not going to give up and change career just because their intro language was a bit inconsistent or rocky. Anyone who gives up programming before being fluent in at least three languages is giving up prematurely.

This doesn't mean assembler or Brainfuck is okay as a first language (although the former was the 2nd language I taught myself, and its reputation for difficulty is vastly exaggerated in my opinion), but it does mean that endless discussion about whether Python is better than JavaScript or C is better than Lisp is probably splitting hairs.


I couldn’t disagree more.

(1) Learning Java before you learn a procedural language, to me, makes no sense, and turns a programmer’s formative foundation of programming into an object-only one.

(2) The difficulty in getting over the ‘hump’ (usually about 5-6 weeks in to learning one’s first language) is what turned me off programming when I was 13, but given a great intro course I was able to get over it and I’ve developed a deep passion for it and I couldn’t imagine life without it. So to put the bar at 3 languages, to me, is insane.

(3) Many (most?) people don’t start off saying ‘Programming is or will be my career’. Instead, you try it out and see if you can do it and if you enjoy it. To me, this is a fundamental misunderstanding in how people view programming when they are learning their first language.


I think it's very common in discussions like this for people to conflate 'Python vs Scheme' with from-first-principles (lambdas, explicit recursion) vs building-real-things as pedagogical approaches. In a lot of people's minds, Python is its standard library and Scheme is explicitly cdring down lists.

I think the problem is that SICP's pedagogical approach (which people confuse with the language Scheme) is, in a way, pretty close to starting with assembler. You spend a lot of time exploring computation (with lambdas as your basis) at a very low level before you graduate to anything that feels "real" or practical.


I agree. Learning Python requires learning the various data structures that are built into the language (list, dict, set, classes) and the various control structures (if, for, while, etc.). These are initially black boxes that a student has to take for granted. On the other end of the pedagogical spectrum, you can start with how to define an abstract data structure and function application as a rewrite system, then build up the rest of the language from scratch.


> it really doesn't matter exactly which language happens to be one's very first programming language.

I'm not so sure.

Here's an example of blinking an led with Lisp:

http://www.technoblogy.com/show?1GX1

And here's one using python:

https://www.modmypi.com/blog/gpio-and-python-39-blinking-led

If I were my 12 year old me, I would probably enjoy the second more than the first.


I'm sorry, but this is wrong. Most people aren't born with a passion for anything. Optimizing for people who are "passionate" excludes a vast set of people who have never been exposed to computer science or programming, but could become interested if taught in an approachable manner.


Assembler is arguably a better intro language than most. It makes it explicitly clear what each step of the program does and how it executes. Once students have done a few projects, they'll be ready to appreciate a higher-level language like C.

Throwing objects and lambdas and type inference at kids without giving them any grounding in what a computer actually is and does is unfair, and leaves them building castles in the sky.


I can't tell if this was meant to be serious or facetious, but I took this route. I learned 8088 assembly to make an ancient robot platform move, and then learned C a few months later.

Assembly had a few weird architecturey things that you had to understand before things made any sense (e.g. registers, segment:offset addressing, stack pointers, calling conventions), but beyond that, all of the tasks were simple and self-contained.

C was a breath of fresh air after that. Things like pointers, which have a reputation for being confusing, make perfect sense coming from assembly.


> Things like pointers, which have a reputation for being confusing, make perfect sense coming from assembly

Agreed, with the caveat that C pointers aren't just memory addresses; they just behave like that 99% of the time, which makes it especially confusing for the 1% where the optimiser can "play tricks".


Serious. I'm not saying you should stick with assembler for any great length of time, but understanding what instructions and registers and memory are is pretty important later on. In university I had fellow students who struggled with pointers because even at the end of first year they didn't really understand what memory was or how it worked.



By that logic, you'd have to start with VHDL. Which is a great place to start if you want to grok what a CPU actually is, what pipelining means, and how to write truly parallel code. Except that most people never need to know. That's progress: you can build most useful applications without ever having to learn that there is a distinction between virtual and physical memory.


MIT does have a required course, 6.004, that starts you off computing doped semiconductor behavior and building circuits in a simple HDL (JSim when I took it; it seems to be something else now). You build combinational logic, sequential logic, an ALU, and your own CPU implementing a RISC machine-code spec, making sure your circuit-simulated CPU can run basic machine-code programs, then optimizing those programs on your home-grown CPU.

In the end you're spending all day/week/month optimizing your CPU by stripping out gates and pipelining, all at the circuit/gate/HDL level, with your final project score computed numerically from how fast your optimized machine code runs on your optimized home-grown CPU.

Probably one of the best classes I took when I was there. I don't think it really matters whether or not it was the first class.


I think it's more like saying that to design CPUs, you should start by understanding what a transistor is.


> within reason, it really doesn't matter exactly which language happens to be one's very first programming language

I agree with this, but I'm not sure how many of the students taking the MIT course in question are learning their very first programming language there.

Now if we were talking about a secondary school course...


Bullshit translator: the mean intelligence of newcomers has become so low that we have to water down our programs, or only veeeeery few could attend them. And we need more students, so more money. Disagree? Say thanks to all the school reforms of the past 30-40-50 years.

A bit extra: in today's world we can't really produce anything new, due to our managerial-driven society, and besides, it's better that people do NOT really understand how things work, to avoid not only embarrassment but also the potential idea of radically wiping out certain kinds of business models. Better to say: you can't understand the big picture! Keep playing with the things the big & powerful give you; they come directly from nature, don't worry about it.

Sorry for making a few "potentially flamey comments" in a row, but the reasons given in the article really have no other meaning to my eyes. That's not intended to be political or flamebait; it's simply what I see. I did NOT start with Scheme; I discovered it years later, and I don't even love it all that much, due to its umpteen implementation incompatibilities, SRFIs, and "sparse" documentation... But when I discovered Lisp-like languages in general I saw a completely different world, and it really changed my mind. Guile Scheme, even with its lack of libraries etc., is now one of my preferred ways to express concepts.

Also, as an engineer and even as a citizen, I can't accept making things, under my responsibility and with my signature, that I do not really understand. That kind of idea is simply the exact opposite of engineering.


> the mean intelligence of newcomers has become so low that we have to water down our programs, or only veeeeery few could attend them. And we need more students, so more money. Disagree? Say thanks to all the school reforms of the past 30-40-50 years.

Uh, you know this is about MIT, right? A school that has seen its acceptance rate dwindle to nearly 7%?


Yep, and I have also seen a few of their filmed online lectures, which is how I formed my present opinion...

I'm European, so I come from a really different (older, less reformed) school system, and I'm old enough to have seen how it has changed. I also casually interact with people from the USA, and my conclusion is simple: having really poor schools "because there is college afterwards" can't really work. In the EU I see, reform after reform, how unprepared high-school students have become, and how hard, if not impossible, universities find it to "recover" them by changing their programs. In the USA this phenomenon is extreme, so even with a super-low acceptance rate, the results will also be extreme.

You simply can't take smart but ignorant people and, in a few years, transform them into highly skilled, well-educated ones. Being smart helps, of course, but learning takes time. The earlier and better you start, the better you'll end up.


>Bullshit translator: the mean intelligence of newcomers has become so low that we have to water down our programs, or only veeeeery few could attend them.

In this case, I think the areas in which programming and CS are applicable have increased so greatly that it's now more feasible to have students start from a more general, abstracted area and choose their focus as they grow, learning the tools and theory they need for that chosen focus.

If they start out getting bogged down in more complex languages, it prevents them from identifying, after a year or two, which areas they are interested in pursuing and focusing on.


Ahem, Python is more complex than Scheme... And even if it sounds far easier to start with, especially for people who already know some imperative language, you need to understand a lot more things for a proper start. And afterwards you have to learn more things that are peculiar to Python, not to programming in general...


The article mentions Prof. Sussman in "the great macro debate".

This reminded me of a phrasing I once read, but never quite understood, in the "acknowledgement" section of the R4RS Macro appendix:

"Hal Abelson contributed by holding this report hostage to the appendix on macros".

Can anybody who was/is closer to the subject matter explain what this entails?



Basically, instead of building your own pipelines and valves and pumps and reservoirs, etc. from scratch, today those things are already available and proven to work.

So you only need to learn to be a plumber.


A better question is why they'd use the completely impractical Scheme in the first place.


Because it was an intro to computer science, not an intro to computer practicalities.


Python makes sense for engineering, like building robots, where practicality is the most important goal.

However, Python is a lousy choice for computer science, where clarity and correctness are much more important. It saddens me that an elite institution like MIT apparently can't see the difference.


To be fair, I think many people expect that a BS degree in either computer science or engineering would make them employable.


Sure, but the emphasis is different. Someone with a CS degree should understand both theory and practice. Someone with an engineering degree doesn't need much theory.

FWIW, I have a CS degree, and my job is mostly engineering, but using a sound theoretical approach to problems has saved me many times. On the other hand, I've worked with several good programmers over the years who didn't have a college degree at all.


About 15 years ago companies stopped hiring people that could grow into a space, and started hiring people that could immediately work in one.


I just wish there were a simple way to serve web pages in Python like PHP, without having to learn a whole framework/stack/templating language.


Flask is not bad.
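A minimal Flask app really is just a few lines; this is a sketch (the route and message are placeholders), not a full setup:

    # pip install flask
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "<h1>Hello from Python</h1>"

    if __name__ == "__main__":
        app.run()  # serves on http://127.0.0.1:5000 by default

No templates or stack required until you want them; returning a string is enough.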


For it is intro??!

Edit: Never mind, it's good now. ;)


Perl or bust.


I had to learn Scheme at uni; everyone hated it. It scared me away from functional programming languages for years, until I understood that parentheses are not necessarily tied to FP and that cool functional languages like Erlang or OCaml exist.


I think it's because if/then/else in Python is less verbose than in Lisp-family languages.



